"I think it's obvious" — no... it's not obvious. Every word you wrote after "I think it's obvious" was conjecture. You lack the information, started from an assumption, and every sentence that followed was built on that initial assumption, which is without evidence.
The raytracing used was DXR, not RTX. DXR differs from RTX in that RT Cores handle both geometry and ray-casting acceleration, whereas AMD's Ray Accelerators only accelerate ray casting while geometry is handled by the RDNA2 Geometry Processor. Because of this, fewer rays can be cast, as the bottleneck becomes the Geometry Processor.
Fewer rays mean less accuracy, and DXR, as it stands, is capable of less accuracy than RTX.
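To put that bottleneck argument in concrete terms, here's a toy sketch. All throughput numbers are invented purely for illustration; they are not real hardware figures:

```python
# Toy pipeline model: rays per frame are capped by the slowest stage.
# The numbers below are made up for illustration, NOT real hardware specs.

def effective_ray_rate(geometry_rate: int, ray_cast_rate: int) -> int:
    """The pipeline can only cast rays as fast as its slowest stage."""
    return min(geometry_rate, ray_cast_rate)

# Hypothetical: dedicated units accelerate both stages, so neither lags.
both_accelerated = effective_ray_rate(geometry_rate=100, ray_cast_rate=100)

# Hypothetical: only ray casting is accelerated, geometry runs elsewhere,
# so the geometry stage becomes the cap on how many rays get cast.
casting_only = effective_ray_rate(geometry_rate=60, ray_cast_rate=100)

print(both_accelerated, casting_only)  # 100 60
```

The point of the sketch is just that speeding up one stage doesn't help once another stage is the limiter.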
So Ubisoft would have had to spend developer time and money on RTX so that a few RTX owners could enjoy their RTX raytracing, whereas with DXR it works on the consoles, RTX hardware, and AMD's RDNA2 hardware. Seems like a no-brainer to me.
You seem surprised that something after the words "I think" is conjecture. I never said "I know for a fact" or "I have empirical evidence" or even "I can make a good case".
I (me) think (not know) it's obvious (using my internal logic)
I chose my words carefully and said what I meant.
When you say "so that a few RTX owners could enjoy...", you seem salty that your favorite brand has a very underwhelming version of the feature implemented.
Ubisoft "spent developer time and money" on Watch Dogs: Legion to have RTX and DLSS, on what I would assume is a less popular (and therefore less likely to be profitable) franchise. And that game has RT on both console and PC. So, given that your logic falls apart under the slightest scrutiny, therefore:
I think it's obvious that on the AMD-sponsored title they were commanded from on high by AMD not to include good RTX features, so that their inferior hardware looks better against their competition's. What company would keep paying to sponsor a title if it knew features were being implemented that would give its only direct competitor a massive advantage in day-one reviews and general internet buzz?
And why would HD textures randomly take up EXACTLY 11 GB of VRAM? Alex in Digital Foundry's FC6 video found that even though the HD textures were using less than his 3080's 10 GB buffer, the textures were all messed up with the HD pack on. Following your logic: the new consoles have 8 GB VRAM buffers, also have the HD texture pack, and run at 4K resolution! So tell me, if it's because of console reasons like you claim, why isn't there an 8 GB requirement? Why does the PC have this arbitrary 11 GB requirement for HD textures? And just coincidentally the requirement is one gigabyte more than the 3080 has? And coincidentally one gigabyte less than the RX 6700 and up have? You're telling me it just HAPPENED to line up EXACTLY between NVIDIA's mainstream flagship GPU and AMD's GPUs? No. Therefore:
I think it's obvious that on the AMD-sponsored title there is an arbitrary 11 GB VRAM requirement, even though the consoles' HD texture pack runs fine at 4K with an 8 GB VRAM buffer, because they were TOLD BY AMD to add that perfectly magic number.
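To spell out the alleged coincidence in plain numbers (the VRAM sizes below are the well-known retail specs for those cards; the 11 GB figure is the game's stated requirement):

```python
# The alleged coincidence: an 11 GB cutoff excludes the 10 GB RTX 3080
# while admitting AMD's 12-GB-and-up RDNA2 cards.

HD_PACK_REQUIREMENT_GB = 11

# Well-known retail VRAM sizes for these cards.
gpus = {
    "RTX 3080": 10,
    "RX 6700 XT": 12,
    "RX 6800": 16,
}

for name, vram_gb in gpus.items():
    verdict = "meets" if vram_gb >= HD_PACK_REQUIREMENT_GB else "misses"
    print(f"{name} ({vram_gb} GB) {verdict} the {HD_PACK_REQUIREMENT_GB} GB requirement")
```

Run it and the 3080 is the only card on the list that falls under the bar, which is the entire point being argued above.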
Why do AMD fanboys think that the only corporations capable of dodgy, shady, underhanded, sneaky things are everyone but AMD? All of these massive corporations have gotten where they are by climbing the mountains of bodies of the people they have used and exploited, the smaller companies they have ravaged and dismantled, etc., etc. But I digress...