Overclock.net - An Overclocking Community - View Single Post - [Techpowerup] NVIDIA DLSS and its Surprising Resolution Limitations

post #22 | 02-15-2019, 12:22 AM
UltraMega
Quote: Originally Posted by ILoveHighDPI View Post
I would generally agree with this sentiment at resolutions around 4K or below, but it's important to note that as pixel density increases the need for exact rendering is lowered.
Unfortunately it's not a simple subject and the ideal display resolution cannot be described in linear resolution terms.
"Upscaling" is generally pretty bad in terms of the impact on sharpness (on this point it appears that both TAA and DLSS have similar results), but "Mixed Resolution" rendering should be the end goal.

Here is the problem: www.michaelbach.de/ot/lum-hyperacuity/index.html
Your eyes have different sensitivities to different kinds of image patterns.

Ideally we would all be using 8K screens, but not everything in a given frame should be rendered at native resolution.
If we could just get MSAA to work on Textures then we would already have the ideal solution, but as far as I know that's pretty much an impossibility.
My best bet is "Checkerboarding". Not necessarily exactly as implemented today, but in theory, if you alternate your rendering pattern on each frame, then Two Checkerboards + TAA = Native Resolution. Especially at 120Hz, a good 8K checkerboard implementation should be practically impossible to notice.
Even if you don't get pixel perfect rendering accuracy on each frame, you still avoid jagged breaks in line patterns.

I mentioned this in another thread: Watch Dogs 2 has an option it calls Temporal Filtering, which is basically Ubisoft's version of checkerboarding mixed with some AA, and it works extremely well in that game. Personally, I could not tell the difference visually, only that my FPS was a lot higher. Given how well the implementation worked in that game, I wouldn't be surprised if Ubisoft is now using it by default to some extent in some of the newer titles. I don't know why they wouldn't at least have it as an option in newer games, unless they believe their implementation is so good that it should be used all the time now. If that is the case, I would say their logic isn't wrong at all. If you can only tell their games are using upscaling by doing an in-depth frame comparison, then why not use it all the time?
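Just to make the checkerboarding idea from the quote above concrete: here's a tiny numpy sketch (my own illustration, not Ubisoft's or anyone's actual implementation) showing why two alternating checkerboard frames cover the full pixel grid. Each frame renders one checkerboard phase, and the two phases are complementary, so accumulating two consecutive frames samples every pixel exactly once:

```python
import numpy as np

h, w = 4, 8  # tiny grid just for illustration

# Pixel coordinates for every position in the grid
y, x = np.indices((h, w))

# Frame A renders the "even" checkerboard phase, frame B the "odd" one
frame_a = (x + y) % 2 == 0
frame_b = (x + y) % 2 == 1

# The two phases are complementary: together they sample every pixel
# exactly once, with no overlap, which is the basic reason alternating
# checkerboard frames plus temporal accumulation can approach
# native-resolution sampling on static content.
coverage = frame_a | frame_b
print(coverage.all())                 # True: every pixel covered
print((frame_a & frame_b).any())      # False: no pixel rendered twice
print(frame_a.sum(), frame_b.sum())   # 16 16: each frame renders half
```

Of course, in a real renderer the unrendered half of each frame is filled in by reprojecting the previous frame with motion vectors (the TAA-style part), and that's where artifacts can creep in on moving content; this sketch only shows the sampling-coverage half of the argument.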

Side note: it would have been great if they had had that feature for Wildlands when that game released, because it basically looks just as good as some of their newer titles but is still a very demanding game on any PC, and I think a lot of people avoided it at launch because of how demanding it was.

i7 7700K @ 4.2GHz
16GB DDR4 3200MHz
GeForce 1080 Ti