Thread: [Techpowerup] NVIDIA DLSS and its Surprising Resolution Limitations
  Topic Review (Newest First)
02-20-2019 08:50 PM
Telimektar
Quote: Originally Posted by Kaltenbrunner View Post
In typical games I guess not much. Nvidia used them so that they could make a GPU they could sell to gamers AND to scientists, etc.

What is DLSS? Is it a new AA?
From what I understand, it's an advanced method of upscaling/supersampling using deep learning (it stands for Deep Learning Super Sampling, after all) which is supposed to improve framerates without sacrificing image quality (or even enhance it, if you believe Nvidia). When you choose 1440p in your game and activate DLSS, the game actually runs internally at 1080p and upscales the result to 1440p; likewise, if you select 4K and activate DLSS, the game runs internally at 1440p and upscales the image to 4K. At least that's how I understand it; I think it also acts as a kind of AA, but I'm not sure.
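
A minimal sketch of that resolution mapping, assuming the 1440p -> 1080p and 4K -> 1440p pairs described above (the real per-game factors are Nvidia's and aren't confirmed here):

Code:
# Rough sketch of the DLSS resolution mapping described above, assuming
# the pairs quoted in this post; actual per-game factors may differ.
DLSS_INTERNAL = {
    (2560, 1440): (1920, 1080),  # select 1440p -> game renders at 1080p
    (3840, 2160): (2560, 1440),  # select 4K    -> game renders at 1440p
}

def render_resolution(output_res, dlss_enabled):
    """Return the resolution the game actually renders at."""
    if dlss_enabled and output_res in DLSS_INTERNAL:
        return DLSS_INTERNAL[output_res]
    return output_res  # DLSS off (or unsupported target): render natively

w, h = render_resolution((3840, 2160), dlss_enabled=True)
print(f"Internal: {w}x{h}, ~{1 - (w * h) / (3840 * 2160):.0%} fewer pixels shaded")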

The problem is that only three games support it right now (Battlefield V, Metro Exodus and FFXV), and in BFV and Metro it looks far more like a blur filter than anything. Only in FFXV does it seem to look better than regular 4K+TAA, and that's because the TAA implementation in FFXV is quite terrible and blurry.

Here you can see how terrible it is right now, at least:


And here's an article that explains the technology far better than I ever could:

https://www.tomshardware.com/reviews...-rtx,5870.html
02-20-2019 07:21 PM
GHADthc
Quote: Originally Posted by ToTheSun! View Post
The AI developed a conscience and started caring about the environment and energy expenditure. It also reduces framerate because it heard Jensen saying that amount of performance is irresponsible.
Man...if we could still REP posts...
02-20-2019 07:01 PM
Kaltenbrunner
Quote: Originally Posted by Telimektar View Post
What can you do with those RTX and Tensor cores besides DLSS and RTX effects? I'm still not clear on that. I wonder if console and arcade emulators could use them in some way.
In typical games I guess not much. Nvidia used them so that they could make a GPU they could sell to gamers AND to scientists, etc.

What is DLSS? Is it a new AA?
02-20-2019 06:05 PM
tpi2007
Quote: Originally Posted by Crinn View Post
Pretty sure the DLSS profile is part of the driver; otherwise those of us without GFE wouldn't be able to use DLSS. Nvidia is probably just pushing GFE as an automatic driver updater again.

Right now it's part of either the driver or the games themselves, but note that they didn't erase the paragraph I quoted from the Turing whitepaper. Once there is more than a handful of games (right now we have three games and one benchmark), they might try to bring it under the GFE umbrella by claiming that the same Deep Neural Network model can be used for more than one game, framing that centralization as an efficiency measure. It's a plausible justification if you read the whitepaper, and incidentally it's what some people claimed back in August of last year to justify GFE handling DLSS profiles.

That's all assuming they can turn the DLSS trainwreck around first and make it something worthwhile; otherwise people won't care whether or where it's available.
02-20-2019 05:33 PM
Crinn
Quote: Originally Posted by tpi2007 View Post
Maybe they quietly changed their minds; they are masters at that. Or maybe they intend to enforce it later on but eased the requirement for these first few titles, given the general backlash against Turing's high prices and lackluster RTX performance, to avoid people complaining about one more walled-garden barrier to using the new features, especially after having waited so long to see them in action in actual games.

As I said back when the sites quoted above reported on it, there is no technical reason for Nvidia to require GFE to get the game-specific DLSS packages; it's a telemetry and marketing move for them above all. If the DLSS packages are small, they can be packed with the Game Ready drivers like SLI profiles; if they are relatively big, say 100 MB or more (which seems more likely), they can be optional downloads from Nvidia's site next to the drivers or, far more practically, Nvidia can ship the finished package to the game dev so it ships with the game at launch or as part of an update. People already download several GBs' worth of game updates; a 100-200 MB DLSS package is a drop in the ocean.

Pretty sure the DLSS profile is part of the driver; otherwise those of us without GFE wouldn't be able to use DLSS. Nvidia is probably just pushing GFE as an automatic driver updater again.
02-20-2019 01:55 PM
DNMock
Quote: Originally Posted by tpi2007 View Post
The 12nm process they are using is just a slight improvement over 16nm; it probably doesn't amount to that much. Also, GV100 actually has 84 SMs, but we only ever saw products with 80 enabled, due to yield and probably also power consumption reasons, so take that into account in your math. And there is a difference in that GV100 uses an HBM2 memory controller, saving a bit of die space compared to the GDDR6-equipped Turing dies.
Oh I'm sure there are a heck of a lot of things I got wrong there. Wouldn't be surprised if my basic math was wrong in spots lol.
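
For reference, the SM arithmetic that quote alludes to, assuming Volta's 64 FP32 CUDA cores per SM (a quick sketch, not from the thread):

Code:
# Worked SM math for GV100: 84 SMs on the full die, 80 enabled on
# shipping V100 products, at 64 FP32 CUDA cores per SM.
CORES_PER_SM = 64

full_gv100 = 84 * CORES_PER_SM     # 5376 CUDA cores on the full die
shipping_v100 = 80 * CORES_PER_SM  # 5120 CUDA cores with 4 SMs disabled

print(full_gv100, shipping_v100)   # 5376 5120
print(f"fused off: {1 - shipping_v100 / full_gv100:.1%}")  # 4.8%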
02-20-2019 10:59 AM
tpi2007
Quote: Originally Posted by doritos93 View Post
So HWU has proven that this is garbage tech ATM. Can we stop calling this a feature now?
https://youtu.be/3DOGA2_GETQ

I made a thread about it, including that video and the accompanying written article:

https://www.overclock.net/forum/226-...delivered.html
02-20-2019 09:56 AM
doritos93
So HWU has proven that this is garbage tech ATM. Can we stop calling this a feature now?
02-20-2019 09:46 AM
Leopardi
Quote: Originally Posted by ILoveHighDPI View Post
https://www.techpowerup.com/252550/n...on-limitations



This is the final nail in the coffin.
My primary complaint with DLSS is that the existence of the Tensor cores on a chip where you are not going to be using ray tracing is fundamentally wasteful; that silicon area should just be used for more CUDA cores, and Nvidia needs to have a high-end GTX card available.
Secondly, we have seen that basic upscaling gives almost exactly the same benefits as DLSS, but does so without any need for extra hardware: https://www.techspot.com/article/1712-nvidia-dlss/
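
A rough pixel-count comparison, assuming the 1440p internal resolution mentioned earlier in the thread and an ~1800p render scale of the kind used in such upscaling comparisons (illustrative numbers, not the article's benchmarks):

Code:
# Back-of-the-envelope shading cost as a fraction of native 4K pixels.
# Shading work scales roughly with pixels rendered, which is why a plain
# render-scale slider recovers much of DLSS's framerate benefit.
native_4k = 3840 * 2160

configs = {
    "native 4K":                (3840, 2160),
    "1800p upscaled (assumed)": (3200, 1800),
    "DLSS 4K (1440p internal)": (2560, 1440),
}

for name, (w, h) in configs.items():
    print(f"{name:26} {w * h / native_4k:.0%} of native shading work")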

Still, the idea of DLSS has been alluring: a 50% boost to framerate with only a small cost to image quality. Many people would have gladly taken that compromise to reach higher framerates...

Only today we find out that DLSS is itself a bottleneck to reaching higher framerates.
The 25% scaling + 2x MSAA technique that was used in Rainbow Six: Siege was equivalent to running a slightly lower resolution on a CRT. Much better than this DLSS will ever be; the image was razor sharp, as if native, without any loss or ghosting artifacts.
02-20-2019 09:40 AM
tpi2007
There is a site that from time to time posts detailed views of dies using speciality equipment (electron microscopes, if I'm not mistaken). I don't recall the name right now. That could eventually help if they happened to get their hands on Volta and Turing.