You really think fsr would be in the game if Nvidia wanted to censor it/make it look bad?
Yes I do.
I am a game developer (former Ubisoft) and I have also worked with FSR's open-source code before.
Everything wrong with FSR here looks like a deliberate, manual change, because the code has been modified from the open-source release.
Furthermore, NVIDIA had a presentation where they compared FSR against DLSS at matched quality presets, i.e. by internal resolution rather than by equal performance (which is disingenuous), and the sharpness looked neutered there too. So the exact same thing NVIDIA demonstrated is now happening in a game they sponsor. Maybe it's a coincidence, but altering someone else's code in an unfavorable way is still more than forgetfulness.
Let's not forget them using an RT algorithm built for tensor cores for certain ray-tracing effects and not disabling it for AMD cards, which tanks performance far below where it should be. But again, as a former dev at Ubisoft, which partners with AMD, I know very well the practice of intentionally limiting a competitor's product (card or feature) to make our sponsor look better, so this is neither a conspiracy theory nor specific to NVIDIA.
The reason we haven't really seen this specific instance before is that there's been a lack of open-source features; now that we have one, and the agreement isn't strict, it's possible. And yes, DLSS is superior, but FSR looks much better than it behaves here.
And there is already a sharpness slider in game, right?
I'm unaware if the sharpness slider is RCAS or something separate, and even with it at max the image is still blurry; at max the slider sets the value to 1.0, I believe, and FSR can go well beyond that.
Using a program that injects FSR, like Lossless Scaling, produces a better-looking image than the in-game implementation, despite having no LOD bias adjustment and running after all post-processing; that should give you an idea of how awful it is. No one can be certain this is because of the sponsorship, but that's not the point of the post. The point is that you can mitigate a lot of these limitations by tweaking the config, and I'm showing that here to help people. The sponsorship angle was just a brief theory as to why (one backed by experience and common sense); it's still just my belief and you don't have to agree. I just know what was in our contractual obligations when partnering with AMD (which I can't discuss), and I really don't think NVIDIA is any different.