
[DSOG] Former Ubisoft dev says that Techland intentionally made AMD FSR look worse in Dying Light 2

#1 · (Edited)
As the dev said:
"This game is an NVIDIA sponsored title so FSR is missing the ultra quality preset along with having the sharpness value lowest as it can be to make the technology look bad.”
So, as you may have guessed, we benchmarked the game with the FSR Ultra Quality preset restored via config tweaks.

Below you can find some comparison screenshots. AMD FSR Quality is on the left, NVIDIA DLSS is in the middle, and AMD FSR Ultra Quality is on the right.

As you can see, AMD FSR Ultra Quality looks sharper and better than NVIDIA DLSS Quality. However, FSR Ultra Quality also runs noticeably slower than DLSS Quality.

This was tested on an RTX 3080. We don't know what the results would be on a 6800 XT/6900 XT that has SAM enabled by default.
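For context on that speed difference: FSR Ultra Quality renders at a higher internal resolution than DLSS Quality, so it shades more pixels per frame. Here is a minimal sketch of the pixel math, assuming the published per-axis scale factors (FSR 1.0 Ultra Quality 1.3x, FSR and DLSS Quality 1.5x):

```cpp
#include <cstdio>

// Sketch of the internal-resolution math behind the FSR/DLSS presets,
// using the per-axis scale factors published by AMD and NVIDIA.
int main() {
    const int outW = 3840, outH = 2160; // 4K output as an example
    struct Preset { const char* name; float scale; };
    const Preset presets[] = {
        {"FSR Ultra Quality (1.3x)", 1.3f},
        {"FSR Quality (1.5x)",       1.5f},
        {"DLSS Quality (1.5x)",      1.5f},
    };
    for (const Preset& p : presets) {
        int w = (int)(outW / p.scale), h = (int)(outH / p.scale);
        std::printf("%-25s -> internal %dx%d (%.0f%% of output pixels)\n",
                    p.name, w, h,
                    100.0f * ((float)w * h) / ((float)outW * outH));
    }
}
```

At 4K, FSR Ultra Quality shades roughly 59% of the output pixels versus about 44% for either Quality preset, which is why it costs more frame time.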
 
#2 ·
Interesting. The fact that they did not include the Ultra Quality mode for FSR definitely does seem intentional. Rather than reading someone else's interpretation of what this Ubi dev is saying, you might as well read his comments directly here: Get Better Looking FSR In Dyling Light 2 : dyinglight (reddit.com)

I watched the Digital Foundry video on Dying Light 2's PC settings and when they compared FSR to DLSS, I did think the FSR presentation looked worse than what I had seen with my own eyes before. Now I know why.
 
#3 · (Edited)
Interesting, thanks...
"You really think FSR would be in the game if NVIDIA wanted to censor it/make it look bad?"
Yes, I do. I am a game developer (formerly at Ubisoft) and I have also worked with FSR's open-source code before. Everything wrong with FSR here was changed deliberately and manually; the code has been modified.
Furthermore, NVIDIA had a presentation where they compared FSR against DLSS, and there they matched quality preset against quality preset, i.e. by internal resolution rather than by equal performance (which is disingenuous), and it also looked like the sharpness was neutered. So the exact same thing NVIDIA demonstrated is now happening in a game they sponsor. Maybe it's a coincidence, but altering someone else's code in an unfavorable way takes more than forgetfulness.
Let's not forget they're also using an RT algorithm made for tensor cores for certain ray-tracing effects, which they didn't disable for AMD cards, and which tanks performance drastically below what it should be. As a former dev at Ubisoft, which partners with AMD, I know very well the practice of intentionally limiting a competitor's product (card or features) to make your sponsor look better, so this is not some conspiracy, nor is it specific to NVIDIA.
The reason we haven't really seen this specific situation before is that there's been a lack of open-source features; now that we have one, and the agreement is not strict, it's possible. Also, yes, DLSS is superior, but FSR normally looks much better than it does here.
"And there is already a sharpness slider in game, right?"
I'm not sure whether the sharpness slider drives RCAS or something separate, and even with it on max the image is still blurry. The maximum the slider changes the value to is 1.0, I believe, and FSR's value can go way higher than that.
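For reference, in AMD's open-source FidelityFX FSR 1.0 code the RCAS sharpness parameter is an attenuation measured in stops: 0.0 is the sharpest output, and each +1.0 halves the sharpening strength. A minimal sketch (not Techland's actual code; the slider range is an assumption) of how a 0..1 UI slider might map onto it:

```cpp
#include <algorithm>
#include <cstdio>

// Hypothetical mapping from a 0..1 in-game sharpness slider to FSR 1.0's
// RCAS parameter. In AMD's reference code the argument is an attenuation
// in stops (0.0 = sharpest), so a slider capped short of the sharp end
// would leave the image softer than FSR can actually produce.
float SliderToRcasAttenuation(float slider01, float maxStops) {
    float s = std::clamp(slider01, 0.0f, 1.0f);
    return (1.0f - s) * maxStops; // slider at max -> 0 stops -> sharpest
}

int main() {
    const float maxStops = 2.0f; // assumed UI range; not a game constant
    for (float s : {0.0f, 0.5f, 1.0f})
        std::printf("slider %.1f -> RCAS attenuation %.2f stops\n",
                    s, SliderToRcasAttenuation(s, maxStops));
}
```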
Using a program that injects FSR, like Lossless Scaling, produces a better-looking image than the in-game implementation, despite having no LOD bias adjustment and running after all post-processing; that gives you an idea of how awful it is. No one can be certain this is because of the sponsorship, but that's not the point of the post. The point is that you can mitigate a lot of these limitations by tweaking the config, and I'm showing that here to help people. The sponsorship angle was just a brief theory as to why (one backed by experience and common sense); it's still just my belief and you don't have to agree. I just know what was in our contractual obligations when partnering with AMD (which I can't discuss), and I really don't think NVIDIA is any different.
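On the LOD bias point: upscalers like FSR and DLSS expect the engine to apply a negative texture mip bias of log2(renderWidth / displayWidth) so textures keep native-resolution detail, which an external injector running after the engine cannot do. A quick sketch of that formula:

```cpp
#include <cmath>
#include <cstdio>

// Recommended texture mip bias when rendering below display resolution:
// log2(renderWidth / displayWidth), which is negative when upscaling.
// This is the adjustment an external FSR injector has to go without.
float RecommendedMipBias(float renderW, float displayW) {
    return std::log2(renderW / displayW);
}

int main() {
    std::printf("FSR Ultra Quality (1.3x): bias %.2f\n",
                RecommendedMipBias(1.0f, 1.3f));
    std::printf("FSR Quality (1.5x):       bias %.2f\n",
                RecommendedMipBias(1.0f, 1.5f));
}
```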
 
#4 ·
Nothing new here; shady is the new normal. You'd think that if one brand had the better tech, they wouldn't need to fiddle with a competitor's tech.

Imagine how this stuff is going to go down when a third brand, one known to be the master of subterfuge, enters the market...
 
#6 ·
I can tell you that both is the wrong statement here.
 
#8 ·
"1. For GTX or AMD users enable DRS or VRS in your GPUs software, then select a resolution higher than your monitors resolution (like 1440p while at 1080p, or 1800p while at 1440p) then select the quality FSR preset in game. This is because the game doesn't support the Ultra Quality setting and this makes it so the internal resolution is higher but you still gain positive FPS. "

 
#12 ·
I still can't believe AMD doesn't have real ray-tracing or AI cores in their GPUs yet. They were totally caught with their pants down by nVidia back in 2018.
 
#14 ·
But the RT performance is bad compared to nVidia's, and I don't think RDNA2 even accelerates all portions of the process the way nVidia's RT cores do... and "Tensor" cores for AI scaling are still totally absent.
 
#15 ·
Stop talking gibberish. Nothing you said made any sense. RT isn't wholly adopted as a standard; it's just a footnote of an option you can enable in games. It's not, and will not be, a standard option in games, i.e. one where the only way to play the game is to have RT-capable hardware. Rasterization is the king standard in how games are developed by default.
 
#16 ·
OK... But DLSS is still a huge bonus compared to other scaling methods.
 
#17 · (Edited)
No, it is not a bonus; it is the very opposite. It is segregated, which puts it at a disadvantage: it's limited to those with a particular GPU, and it's designed only to promote that particular GPU. Without that GPU it holds no real value.

It's not something shared on console, and not something shared with every other GPU. And with Intel entering the GPU market with its own upscaler, it's hard to justify it the way you see it.

Ultimately, MS will intervene and create some sort of upscaler for all to use. Call it DX12.X, call it DX13, it doesn't matter. At the end of the day it never pays to look like you're cheerleading for a particular company like that. They only care about what's in your wallet and what they can get from you, not about you as a person.
:coffee:
 
#20 ·
No, you don't NEED their hardware, but in terms of competing, RDNA2 totally lacks the hardware for AI scaling, and its ray-tracing performance is about half that of nVidia's.
You didn’t NEED an Intel CPU because the FX-8350 COULD run games too…

It's not about just being the best or the worst; it's about the performance and feature gap. If that gap is too big, as it is with RDNA2 on those features or with Bulldozer CPUs' single-thread IPC, it's a detriment.
 
#21 ·
Saying something just to say something? I hear you banging the barrel, but not much communication can come from it. Like I made perfectly clear in my prior post, your love of how RTX does RT is nothing more than a farce compared to how RT is really done. RT is used to make "life-like" creations, not to mimic rasterization. That's the key to what I am showing you here.

Therefore, if you want to believe that your "tensor cores" are faster at RT-lite, rasterized gaming, then I will simply take rasterized gaming instead, and I find no value in tensor cores: they're highly inefficient at creating real RT effects, and they don't add the immersion I seek, so I can't identify RT in these games as anything other than fake rasterization.
:coffee:


Now, FSR on the other hand, for lower-end GPUs...
 
#22 ·
You're not saying anything other than that hardware ray-tracing and AI scaling with tensor cores are useless. The entire video game industry has the opposite view. It's not RTX specifically: even in non-RTX-specific titles, nVidia cards are way faster at ray-tracing. It's a hardware deficit on AMD's end. FSR will always look worse than current DLSS; it's just a technology difference, because AMD doesn't have the hardware to compete with it. Turing caught AMD by surprise, it's as simple as that. The only thing saving them right now is that implementing ray-tracing in games is still very new, so traditional techniques are still prioritized.
 
#23 ·
LOL
At least you unknowingly admitted that what you call RT in games isn't really anything revolutionary or life-like. If you want to bot your responses by repeating this, go right ahead. But what you fancy as a different way to render an image is still fake RT. It's a third-rate attempt at doing something different from rasterization, and it provides no real benefit other than being something "new" to do.
-------------------
Now, back at the ranch: we need some real UQ FSR support so that players with older hardware can get a nice bump in performance along with an increase in IQ.
 
#24 ·
It is revolutionary. It's been shown in Metro Exodus and other titles that RT features are hands-down better, particularly ray-traced global illumination.