[Techpowerup] NVIDIA DLSS and its Surprising Resolution Limitations

post #21 of 91 (permalink) Old 02-15-2019, 12:09 AM - Thread Starter
New to Overclock.net
 
ILoveHighDPI's Avatar
 
Join Date: Oct 2011
Posts: 3,284
Rep: 133 (Unique: 84)
Quote: Originally Posted by The Robot View Post
Nvidia is pushing console peasantry into PC space. Upscaling is blasphemy.
I would generally agree with this sentiment at resolutions around 4K or below, but it's important to note that as pixel density increases the need for exact rendering is lowered.
Unfortunately it's not a simple subject and the ideal display resolution cannot be described in linear resolution terms.
"Upscaling" is generally pretty bad in terms of the impact on sharpness (on this point it appears that both TAA and DLSS have similar results), but "Mixed Resolution" rendering should be the end goal.

Here is the problem: www.michaelbach.de/ot/lum-hyperacuity/index.html
Your eyes have different sensitivities to different kinds of image patterns.

Ideally we would all be using 8K screens, but not everything in a given frame should be rendered at native resolution.
If we could just get MSAA to work on Textures then we would already have the ideal solution, but as far as I know that's pretty much an impossibility.
My best bet is "Checkerboarding". Not necessarily exactly as implemented today, but in theory if you alternate your rendering pattern on each frame then Two Checkerboards + TAA = Native Resolution. Especially at 120 Hz, a good Checkerboard 8K implementation should be practically impossible to notice.
Even if you don't get pixel perfect rendering accuracy on each frame, you still avoid jagged breaks in line patterns.
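
As a rough sketch of the idea (purely illustrative Python/NumPy, not how any shipping engine actually does it): shade only half the pixels each frame in alternating checkerboard patterns, and reuse the other half from the previous frame, which is the simplest possible form of temporal accumulation.

Code:
import numpy as np

def checkerboard_mask(h, w, parity):
    # Boolean mask selecting half the pixels; flipping parity each frame
    # makes the two patterns together cover every pixel.
    yy, xx = np.mgrid[0:h, 0:w]
    return (yy + xx) % 2 == parity

def render_full(h, w, frame):
    # Stand-in for the renderer: a synthetic gradient that drifts per frame.
    # In a real engine this per-pixel shading is the expensive part.
    yy, xx = np.mgrid[0:h, 0:w]
    return (xx + yy + frame) / (h + w)

def checkerboard_reconstruct(h, w, frames):
    # Shade half the pixels per frame; the other half keeps last frame's value.
    history = np.zeros((h, w))
    for frame in range(frames):
        mask = checkerboard_mask(h, w, frame % 2)  # alternate the pattern
        shaded = render_full(h, w, frame)
        history[mask] = shaded[mask]               # write only the new samples
    return history

recon = checkerboard_reconstruct(8, 8, frames=2)
truth = render_full(8, 8, 1)
# For a slowly changing scene, two alternating checkerboards cover every pixel,
# so the only error is the one-frame lag on half the samples.
print(np.abs(recon - truth).max())

A real implementation also reprojects the history buffer with motion vectors before reusing it; that reprojection is where most of the engineering effort (and most of the artifacts) comes from.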
ILoveHighDPI is offline  
post #22 of 91 (permalink) Old 02-15-2019, 12:22 AM
Graphics Junkie
 
UltraMega's Avatar
 
Join Date: Feb 2017
Location: USA
Posts: 1,293
Rep: 30 (Unique: 26)
Quote: Originally Posted by ILoveHighDPI View Post
I would generally agree with this sentiment at resolutions around 4K or below, but it's important to note that as pixel density increases the need for exact rendering is lowered.
Unfortunately it's not a simple subject and the ideal display resolution cannot be described in linear resolution terms.
"Upscaling" is generally pretty bad in terms of the impact on sharpness (on this point it appears that both TAA and DLSS have similar results), but "Mixed Resolution" rendering should be the end goal.

Here is the problem: www.michaelbach.de/ot/lum-hyperacuity/index.html
Your eyes have different sensitivities to different kinds of image patterns.

Ideally we would all be using 8K screens, but not everything in a given frame should be rendered at native resolution.
If we could just get MSAA to work on Textures then we would already have the ideal solution, but as far as I know that's pretty much an impossibility.
My best bet is "Checkerboarding". Not necessarily exactly as implemented today, but in theory if you alternate your rendering pattern on each frame then Two Checkerboards + TAA = Native Resolution. Especially at 120 Hz, a good Checkerboard 8K implementation should be practically impossible to notice.
Even if you don't get pixel perfect rendering accuracy on each frame, you still avoid jagged breaks in line patterns.

I mentioned this in another thread: Watch Dogs 2 has an option it calls Temporal Filtering, which is basically an Ubisoft version of checkerboarding mixed with some AA, and it works extremely well in that game. Personally I could not tell the difference visually, only that my FPS was a lot higher. Given how well the implementation worked in that game, I wouldn't be surprised if Ubisoft is now using it by default to some extent in some of their newer titles, because I don't know why they wouldn't at least offer it as an option in newer games, unless they believe their implementation is so good that it should be used all the time now. If that is the case, I would say their logic isn't wrong at all. If you can only tell their games are using upscaling by doing an in-depth frame comparison, then why not use it all the time?

Side note: it would have been great if they had that feature for Wildlands when that game released, because it basically looks just as good as some of their newer titles but is still a very demanding game on any PC, and when it released I think a lot of people avoided it because of how demanding it was.

i7 7700K @ 4.2 GHz
16GB DDR4 3200 MHz
GeForce 1080 Ti
UltraMega is offline  
post #23 of 91 (permalink) Old 02-15-2019, 12:31 AM
professional curmudgeon
 
looniam's Avatar
 
Join Date: Apr 2009
Posts: 9,679
Rep: 791 (Unique: 451)
Quote: Originally Posted by huzzug View Post
He isn't wrong. The entire shebang about DLSS, from the onset, was quoted to work through GFE, and the details came from the horse's mouth. Unless you have a source who confirmed to you at the launch of the RTX series that GFE wouldn't be a prerequisite for DLSS, while the media ran with the article claiming the contrary.
tied into/works through GFE does NOT mean exclusive to GFE.

the confirmation is the drivers themselves, despite the echo chamber of the internet.

Remember the golden rule of statistics: A personal sample size of one is a sufficient basis upon which to draw universal conclusions.
Upload the computer to Dropbox and provide a link to it so others may download it to examine and give advice for repairs.
loon 3.2
(18 items)
CPU: i7-3770K
Motherboard: Asus P8Z77-V Pro
GPU: EVGA 980 Ti SC+
RAM: 16GB PNY DDR3 1866
Hard Drive: PNY 1311 240GB
Hard Drive: 1TB Seagate
Hard Drive: 3TB WD Blue
Optical Drive: DVD-RW +/-
Power Supply: EVGA SuperNova 750 G2
Cooling: EKWB P280 kit
Cooling: EK-VGA Supremacy
Case: Stryker M [hammered and drilled]
Operating System: Win X
Monitor: LG 24MC57HQ-P
Keyboard: Ducky Zero [blues]
Mouse: Corsair M65
Audio: SB Recon3D
Audio: Klipsch ProMedia 2.1


looniam is offline  
post #24 of 91 (permalink) Old 02-15-2019, 02:31 AM
New to Overclock.net
 
Hwgeek's Avatar
 
Join Date: Apr 2017
Posts: 570
Rep: 14 (Unique: 12)
OMG- Just Realized what DLSS stands for:
Details Lost - See Screenshots

Last edited by Hwgeek; 02-15-2019 at 02:37 AM.
Hwgeek is offline  
post #25 of 91 (permalink) Old 02-15-2019, 05:36 AM
Not new to overclock.net
 
guitarmageddon88's Avatar
 
Join Date: May 2010
Location: Probably racing....
Posts: 1,584
Rep: 70 (Unique: 62)
Repeat after me: GFE is not required for DLSS.

Sandy-Capable
(14 items)
CPU: i7-8700K
Motherboard: ASUS Maximus X Code
GPU: MSI RTX 2080 Gaming Trio
RAM: G.Skill Trident Z RGB
Hard Drive: 970 EVO
Power Supply: EVGA SuperNova G3
Cooling: H150i Pro
Case: Lian Li PC-O11 Air
Operating System: Windows 10
Monitor: HP Omen 27"
Keyboard: Corsair K70
Mouse: Razer Abyssus
CPU: i7-2600K @ 4.5 GHz
Motherboard: ASUS P8Z68 Deluxe Gen 3
GPU: MSI GTX 580 Lightning Xtreme
GPU: MSI GTX 580 Lightning Xtreme
RAM: G.Skill Sniper 8GB 1866 MHz @ 1.35 V
Hard Drive: Samsung 840 Pro
Optical Drive: Lite-On
Power Supply: Corsair HX750
Cooling: RS 240
Case: Obsidian 800D
Operating System: Windows 7 64-bit
Monitor: Alienware AW2310
Keyboard: Corsair K70
Audio: Logitech Z906
guitarmageddon88 is offline  
post #26 of 91 (permalink) Old 02-15-2019, 05:53 AM
mfw
 
ToTheSun!'s Avatar
 
Join Date: Jul 2011
Location: Terra
Posts: 6,984
Rep: 391 (Unique: 203)
What tpi2007 said is false because it's the opposite of what happens in reality, but he's not categorically wrong. He was led to believe, by statements in official documents, that GFE would be needed to manage DLSS data.

Now, that's exactly what nVidia wanted. Their wording, I bet, was intentionally ambiguous so as to incentivize people to download their software without committing to a solution that would further complicate the adoption of Turing's new features.

In that sense, let's all be friends.

CPU: Intel 6700K
Motherboard: Asus Z170i
GPU: MSI 2080 Sea Hawk X
RAM: G.Skill Trident Z 3200 CL14 8+8
Hard Drive: Samsung 850 EVO 1TB
Hard Drive: Crucial M4 256GB
Power Supply: Corsair SF600
Cooling: Noctua NH-C14S
Case: Fractal Design Core 500
Operating System: Windows 10 Education
Monitor: ViewSonic XG2703-GS
Keyboard: Ducky One 2 Mini
Mouse: Glorious Odin
Mousepad: Asus Scabbard
Audio: FiiO E17K v1.0 + Beyerdynamic DT 1990 PRO (B pads)
ToTheSun! is offline  
post #27 of 91 (permalink) Old 02-15-2019, 06:09 AM
Overclocking Enthusiast
 
Silent Scone's Avatar
 
Join Date: Nov 2013
Posts: 11,365
Rep: 402 (Unique: 225)
Quote: Originally Posted by WannaBeOCer View Post
Those Tensor cores are there for creative developers, not just DLSS. They finally have the hardware; they just have to put it to use.
Yeah, to suggest tensor cores are a waste of time based purely on DLSS is short-sighted.

Also, you most certainly do not need GFE in order to use DLSS.

[Source] I'm using it and do not have it installed lol.
Silent Scone is offline  
post #28 of 91 (permalink) Old 02-15-2019, 06:13 AM
New to Overclock.net
 
doom26464's Avatar
 
Join Date: Jun 2014
Posts: 721
Rep: 11 (Unique: 9)
Mighty fine beta-testing cards the RTX cards are, yes.
doom26464 is offline  
post #29 of 91 (permalink) Old 02-15-2019, 06:32 AM
Looking Ahead
 
TheBlademaster01's Avatar
 
Join Date: Dec 2008
Location: Cluain Dolcáin, Leinster (Ireland)
Posts: 13,043
Rep: 785 (Unique: 536)
Quote: Originally Posted by Silent Scone View Post
Yeah, to suggest tensor cores are a waste of time based purely on DLSS is short-sighted.

Also, you most certainly do not need GFE in order to use DLSS.

[Source] I'm using it and do not have it installed lol.
That's because very few people know what this hardware actually is. If you don't know the matter or motivation behind it and only have marketing fluff as input, it's easy to make these kinds of mistakes.

 



TheBlademaster01 is offline  
post #30 of 91 (permalink) Old 02-15-2019, 07:06 AM
ORL
H20 only!
 
ORL's Avatar
 
Join Date: Sep 2011
Location: STL
Posts: 348
Rep: 18 (Unique: 17)
DLSS is in its infancy. Right now it will remain restricted to nVidia, as a checkpoint to generate revenue and to help with quality control while it is developed. This tech will eventually bleed out into general availability for developers, likely with purchasable learning systems. As the system evolves, the quality of image upscaling will also increase. The biggest draw of DLSS for consumers will be extending the life of compatible parts as games become far more advanced. At its core, DLSS is a tech that is here to stay; how AMD will answer it is unknown.

8K resolution will be devastating to the hardware we have at hand, given current limitations in gaming technology, as it gradually moves toward the mainstream market. This is an issue nVidia is trying to jump ahead of by creating and using DLSS.
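
For a sense of scale (simple pixel arithmetic, not a benchmark), 8K carries four times the pixels of 4K and sixteen times 1080p, which is why brute-force native shading gets so punishing:

Code:
# Raw pixel counts; the cost of shading every pixel natively scales
# roughly with these numbers (illustrative arithmetic only).
resolutions = {
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.0f}x 1080p")
# 1080p: 2.1 MP, 1x | 4K: 8.3 MP, 4x | 8K: 33.2 MP, 16x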

The RTX cards out today are literally only there to get the technology out into the wild; they are not end-all solutions to this tech and never will be. From generation to generation of hardware releases, technological improvements and instruction sets have shipped well before their adoption. This is no different in that respect, outside of the fact that this is a fundamental physical change to the GPU rather than something scaled in as a software variant.

The reasons stated above are also the same reasons generating the tech hate we are seeing piggyback off a poor release plan. Be patient and wait it out if you don't wish to upgrade; that's fine. But in all honesty, the hate out there is mostly a crapload of people trying to justify why they won't upgrade components. It is fine if you spent $900 on an inflated-price 1080 Ti last generation at its peak and don't need to spend money on a costlier 2080 Ti, but don't go trying to justify it with hate based on misinformation.

People keep hating on the price of entry for the RTX series; this is understandable, as it did scale up. But the consumer also needs to know that the cost to manufacture scaled up as well. nVidia, like AMD and Intel, is not a charity. They do not make these products out of the kindness of their hearts to sell at cost.

While I am no fan of nVidia as a business, in full disclosure I do own an RTX card, and this is not an attempt to validate the purchase. Honestly, this is no different from the hate AMD saw for targeting the budget markets while developing the new Ryzen chips to be competitive again.
ORL is offline  