
1 - 20 of 91 Posts

Registered · 3,652 Posts · Discussion Starter · #1
https://www.techpowerup.com/252550/nvidia-dlss-and-its-surprising-resolution-limitations

Representatives for the company told us that DLSS is most effective when the GPU is at maximum work load, such that if a GPU is not being challenged enough, DLSS is not going to be made available.
This is the final nail in the coffin.
My primary complaint with DLSS is that the existence of the Tensor cores on a chip where you are not going to be using ray tracing is fundamentally wasteful; that silicon area should just be used for more CUDA cores, and Nvidia needs to have a high-end GTX card available.
Secondly, we have seen that basic upscaling gives almost exactly the same benefits as DLSS, but does so without any need for extra hardware: https://www.techspot.com/article/1712-nvidia-dlss/

Still, the idea of DLSS has been enticing: a 50% boost to framerate with only a small cost to image quality. Many people would have gladly taken that compromise to reach higher framerates...

Only today we find out that DLSS is itself a bottleneck to reaching higher framerates.
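
To put a rough number on that bottleneck claim, here is a minimal back-of-the-envelope sketch. The 1.5 ms per-frame inference cost is an assumed illustrative figure, not an official Nvidia number; the point is only that any fixed per-frame cost puts a hard ceiling on framerate and bites harder the faster the base framerate already is:

```python
# Back-of-the-envelope sketch: a fixed per-frame post-process cost caps framerate.
# DLSS_COST_MS is an assumed illustrative figure, not an official Nvidia number.

DLSS_COST_MS = 1.5  # hypothetical tensor-core inference time per frame, in ms

def fps_with_fixed_cost(base_fps: float, fixed_cost_ms: float) -> float:
    """Framerate after adding a fixed cost (in ms) to every frame."""
    return 1000.0 / (1000.0 / base_fps + fixed_cost_ms)

for base_fps in (60, 100, 144, 240):
    capped = fps_with_fixed_cost(base_fps, DLSS_COST_MS)
    loss = 100.0 * (1.0 - capped / base_fps)
    print(f"{base_fps:>3} fps raster -> {capped:5.1f} fps with the extra pass ({loss:.1f}% lost)")

# Even if rasterisation were free, framerate could never exceed 1000 / DLSS_COST_MS:
print(f"hard ceiling: {1000.0 / DLSS_COST_MS:.0f} fps")
```

The lower the output resolution, the higher the base framerate and the bigger the relative hit, which fits the reported behaviour of DLSS being locked out when the GPU isn't already the bottleneck.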
 

sudo apt install sl · 7,319 Posts
Those Tensor cores are there for creative developers, not just DLSS. They finally have the hardware; they just have to put it to use.
 

Registered · 3,313 Posts
I still don't know why nVidia can't train with every single game in existence and apply DLSS to everything. The theory that nVidia doesn't want the tensor cores to be the bottleneck makes sense. What's the point of using it when a rendered frame is delayed waiting for the tensor cores to DLSS it up? That said, why didn't nVidia make a card where that limitation has no realistic chance of happening?

Maybe nVidia should have just included it in their "ray-tracing" package and not separated the technology at all.

Early adopter quirks to say the least... pay a premium price for beta features.
 

Waiting for 7nm EUV · 11,515 Posts
This has all the hallmarks of a niche solution that will be dropped in a few years. It sounds like SLI, actually: it needs profiles provided by Nvidia to work, it doesn't support all games, and even in the ones that are supported, your mileage may vary. In this case you may not be able to enable it at all, depending on the specific RTX card you're using and the resolution you want to play at.

I still want to see the games that support 4K DLSS tested at 1800p + TAA + Upscaling for comparison. Add that in and DLSS becomes even more of a niche feature. 1800p + TAA + Upscaling is a much more universal solution than DLSS will ever be. And you certainly don't need profiles downloaded through GFE to make it work.
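
For what it's worth, the pixel-count math behind that comparison, as a minimal sketch assuming render cost scales roughly with pixel count (an approximation) and taking 1440p as the internal render resolution reported for 4K DLSS:

```python
# Rough sketch: relative pixel counts vs native 4K (3840x2160).
# Assumes render cost scales roughly with pixel count, which is only an approximation.

NATIVE_4K = 3840 * 2160

candidates = {
    "1800p (3200x1800) + upscale": 3200 * 1800,
    "1440p (2560x1440), the reported internal render res for 4K DLSS": 2560 * 1440,
}

for name, pixels in candidates.items():
    ratio = pixels / NATIVE_4K
    print(f"{name}: {ratio:.0%} of native 4K pixels -> ~{1 / ratio:.2f}x the framerate "
          f"before any upscaling/AA overhead")
```

That ballpark lines up with the ~50% gains quoted for DLSS, which is the core of the "plain upscaling gets you most of the way there" argument.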

The Tensor cores' main job is as a ray tracing denoiser; DLSS is just a way for Nvidia to get more people to register on GFE with their e-mail. But considering the limited horsepower of the combined RT + Tensor solution, I wonder whether, if all that die space had been used for raster cores, the game devs couldn't have made the ray-tracing-enabled games look equally good by dedicating more horsepower to reflections. Even if that means duplicating renderings, it would still be general raster hardware doing it, meaning it can be used elsewhere in the game when needed; it's much more versatile. And from what we are seeing with the new Metro game, making things more realistic doesn't always work in a way that makes a game better gameplay-wise.

Also, they could use Voxel Global Illumination (VXGI) to achieve some of the effects; that is a tech Nvidia introduced with Maxwell in 2014, so we've got three Nvidia consumer archs capable of doing it:
https://www.youtube.com/watch?v=_E1oVl2d01Q

 

Registered · 1,209 Posts
They speculate that the tensor cores are too slow to run on each frame if the framerate is high, but DLSS is marketed as a fast AA (SS) solution. If that's true, DLSS's goal and its implementation contradict each other. It's quite a blunder considering it also looks like garbage. Those RTX cards are an expensive trainwreck and their main features are pretty underwhelming.
 

sudo apt install sl · 7,319 Posts

Registered · 3,313 Posts
This has all the hallmarks of a niche solution that will be dropped in a few years. It sounds like SLI, actually: it needs profiles provided by Nvidia to work, it doesn't support all games, and even in the ones that are supported, your mileage may vary. In this case you may not be able to enable it at all, depending on the specific RTX card you're using and the resolution you want to play at.

I still want to see the games that support 4K DLSS tested at 1800p + TAA + Upscaling for comparison. Add that in and DLSS becomes even more of a niche feature. 1800p + TAA + Upscaling is a much more universal solution than DLSS will ever be. And you certainly don't need profiles downloaded through GFE to make it work.

The Tensor cores' main job is as a ray tracing denoiser; DLSS is just a way for Nvidia to get more people to register on GFE with their e-mail. But considering the limited horsepower of the combined RT + Tensor solution, I wonder whether, if all that die space had been used for raster cores, the game devs couldn't have made the ray-tracing-enabled games look equally good by dedicating more horsepower to reflections. Even if that means duplicating renderings, it would still be general raster hardware doing it, meaning it can be used elsewhere in the game when needed; it's much more versatile. And from what we are seeing with the new Metro game, making things more realistic doesn't always work in a way that makes a game better gameplay-wise.

Also, they could use Voxel Global Illumination (VXGI) to achieve some of the effects; that is a tech Nvidia introduced with Maxwell in 2014, so we've got three Nvidia consumer archs capable of doing it:

https://www.youtube.com/watch?v=_E1oVl2d01Q
Did I miss something? Where has it been stated that you need GFE for DLSS to work?
 

Waiting for 7nm EUV · 11,515 Posts
Did I miss something? Where has it been stated that you need GFE for DLSS to work?


https://techreport.com/review/34095/popping-the-hood-on-nvidia-turing-architecture/3
DLSS is just one product of what Nvidia calls the Neural Graphics Framework, or NGX. NGX provides an API to game developers that exposes several AI models, or "neural services," to game engines for use on client PCs. Nvidia pre-trains NGX models in its own data centers and provides them to end users as "neural services" by way of GeForce Experience.

https://www.tomshardware.com/reviews/nvidia-turing-gpu-architecture-explored,5801-5.html
While we hoped Nvidia's GeForce Experience (GFE) software wouldn't be a requisite of DLSS, we suspected it probably would be. Sure enough, the company confirmed that the features of NGX are tightly woven into GFE. If the software detects a Turing-based GPU, it downloads a package called NGX Core, which determines if games/apps are relevant to NGX. When there's a match, NGX Core retrieves any associated deep neural networks for later use.

https://www.pcper.com/reviews/Graphics-Cards/Architecture-NVIDIAs-RTX-GPUs-Turing-Explored/RTX-Features-Ray-Tracing-and-DL
This neural network model is then distributed via GeForce Experience to end users who have a GPU with tensor cores and have the given game installed. This distribution model is vital as it allows NVIDIA to silently update the model in the background as they come up with improvements as they get more experience and come up with better techniques.
 

Zen · 1,107 Posts
They speculate that the tensor cores are too slow to run on each frame if the framerate is high, but DLSS is marketed as a fast AA (SS) solution. If that's true, DLSS's goal and its implementation contradict each other. It's quite a blunder considering it also looks like garbage. Those RTX cards are an expensive trainwreck and their main features are pretty underwhelming.
That would defeat the purpose of DLSS, since its purpose is to improve performance in games at high resolution while minimizing the loss of quality vs native resolution. If you are already running at a high framerate at native resolution, you wouldn't need DLSS.
 

Registered · 3,313 Posts

Waiting for 7nm EUV · 11,515 Posts
That's odd since it works in BFV for me without GFE installed.

Maybe they quietly changed their minds; they are masters at that. Or maybe they intend to enforce it later on but eased up for these first few titles, given the general backlash against Turing's high prices and lackluster RTX performance, to avoid having people complain about one more walled-garden barrier to entry for the new features, even more so after having waited for so long to see them in action in actual games.

As I said back then, when the sites quoted above reported on it, there is no technical reason for Nvidia to require GFE to get the game-specific DLSS packages; it's a telemetry and marketing move for them above all. If the DLSS packages are small, they can be packed with the game-ready drivers like SLI profiles; if they are relatively big, like 100 MB or more (which seems more likely), they can be optional downloads from Nvidia's site next to the drivers or, much more practically and sensibly, Nvidia can ship the finished package to the game dev so it ships with the game at launch or as part of an update - people already download several GBs' worth of game updates, so a 100-200 MB DLSS package is a drop in the ocean.
 

Registered · 3,313 Posts
Yeah, I think maybe, initially, they were not totally sure how they would distribute the profiles.
 

professional curmudgeon · 10,391 Posts
Maybe they quietly changed their minds; they are masters at that. Or maybe they intend to enforce it later on but eased up for these first few titles, given the general backlash against Turing's high prices and lackluster RTX performance, to avoid having people complain about one more walled-garden barrier to entry for the new features, even more so after having waited for so long to see them in action in actual games.

As I said back then, when the sites quoted above reported on it, there is no technical reason for Nvidia to require GFE to get the game-specific DLSS packages; it's a telemetry and marketing move for them above all. If the DLSS packages are small, they can be packed with the game-ready drivers like SLI profiles; if they are relatively big, like 100 MB or more (which seems more likely), they can be optional downloads from Nvidia's site next to the drivers or, much more practically and sensibly, Nvidia can ship the finished package to the game dev so it ships with the game at launch or as part of an update - people already download several GBs' worth of game updates, so a 100-200 MB DLSS package is a drop in the ocean.
or how about the press was wrong? like i said they were back then?? :rolleyes:

yeah simple answers are best.
 

2,327 Posts
Nvidia is pushing console peasantry into PC space. Upscaling is blasphemy.
 

Waiting for 7nm EUV · 11,515 Posts
or how about the press was wrong? like i said they were back then?? :rolleyes:

yeah simple answers are best.

The press wasn't wrong; they didn't make it up on their own, and Tom's Hardware specifically confirmed it with Nvidia.

https://www.tomshardware.com/reviews/nvidia-turing-gpu-architecture-explored,5801-5.html
While we hoped Nvidia's GeForce Experience (GFE) software wouldn't be a requisite of DLSS, we suspected it probably would be. Sure enough, the company confirmed that the features of NGX are tightly woven into GFE. If the software detects a Turing-based GPU, it downloads a package called NGX Core, which determines if games/apps are relevant to NGX. When there's a match, NGX Core retrieves any associated deep neural networks for later use.


You know where these three sites got their info from to begin with? From none other than Nvidia themselves in their Turing architecture whitepaper (page 33):

https://www.nvidia.com/content/dam/en-zz/Solutions/design-visualization/technologies/turing-architecture/NVIDIA-Turing-Architecture-Whitepaper.pdf

NGX SOFTWARE ARCHITECTURE

The features of NGX tightly couple to the NVIDIA driver and hardware. The NGX API provides access to several AI features for games and applications. The features are pre-trained by NVIDIA and ready for integration. The API has been designed to be thin and easy for applications to integrate multiple AI features. NGX services run on the GPU, allowing it to support multiple features and applications.

NVIDIA NGX features are managed by the NVIDIA GeForce Experience™ (GFE) application or the tech preview version of the NVIDIA Quadro Experience™ (QXP) application. After GFE or QXP is installed or updated, it looks for the presence of a Turing GPU. Once detected, the NGX Core package is downloaded and installed. GFE/QXP communicates with NGX Core to determine the game and application IDs present and their relevance to NGX. Different DNN models that work with various installed games and applications are then downloaded for subsequent use.

NGX DNN models can interface with CUDA 10, the DirectX and Vulkan drivers, as well as take advantage of NVIDIA TensorRT™, the high-performance deep learning inference optimizer that delivers low latency and high-throughput for deep learning inference applications. NGX models and services are accelerated by Turing’s enhanced Tensor Cores.
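
Purely as an illustration of the client-side flow the whitepaper describes (Turing detection, NGX Core download, then per-title DNN model downloads), here is a hypothetical sketch; none of these names or functions come from Nvidia's actual NGX or GeForce Experience software, and the title list is just an example:

```python
# Hypothetical sketch of the distribution flow described in the whitepaper.
# None of these names correspond to real NGX / GeForce Experience APIs.

SUPPORTED_TITLES = {"Battlefield V", "Final Fantasy XV"}  # example DLSS titles

def detect_gpu_architecture() -> str:
    return "Turing"  # stand-in for a real GPU query

def download(package: str) -> str:
    print(f"downloading: {package}")
    return package

def ngx_client_flow(installed_games):
    """GFE/QXP side: Turing check -> NGX Core -> per-game DNN models."""
    if detect_gpu_architecture() != "Turing":
        return {}                               # NGX Core is only fetched for Turing GPUs
    download("NGX Core")                        # works out which installed apps are NGX-relevant
    return {
        game: download(f"DNN model ({game})")   # e.g. the game-specific DLSS network
        for game in installed_games
        if game in SUPPORTED_TITLES
    }

models = ngx_client_flow(["Battlefield V", "Some Unsupported Game"])
print("cached models:", list(models))
```

The per-title model download at the end is the step the quoted articles describe as going through GFE.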
 

professional curmudgeon · 10,391 Posts
The press wasn't wrong; they didn't make it up on their own, and Tom's Hardware specifically confirmed it with Nvidia.

https://www.tomshardware.com/reviews/nvidia-turing-gpu-architecture-explored,5801-5.html

You know where these three sites got their info from to begin with? From none other than Nvidia themselves in their Turing architecture whitepaper (page 33):

https://www.nvidia.com/content/dam/en-zz/Solutions/design-visualization/technologies/turing-architecture/NVIDIA-Turing-Architecture-Whitepaper.pdf
yeah sure, whatever. i told you before the driver package would be the same and GFE wouldn't be needed. i explained in detail that what you proposed, based on what you read from the press, was WRONG.

but yeah sure, whatever. keep the FUD train going. :thumb:
 

Waiting for 7nm EUV · 11,515 Posts
Are you trolling? I just linked you to Nvidia's whitepaper and provided the quotes proving that you are flat out wrong.
 

professional curmudgeon · 10,391 Posts
Are you trolling? I just linked you to Nvidia's whitepaper and provided the quotes proving that you are flat out wrong.
reality says different, doesn't it? you don't need GFE to use DLSS.

you presented a statement as fact when it is very much wrong. had you bothered to check it yourself instead of relying on the same sources you yourself have criticized, both the press and nvidia, you would have known it to be wrong. i guess its ok to use them as a shield when it fits, eh?

but no, you would rather not follow up on anything and instead spread some garbage.

no wonder this sub forum turns into toxic waste water.
 

Performance is the bible · 7,049 Posts
Are you trolling? I just linked you to Nvidia's whitepaper and provided the quotes proving that you are flat out wrong.
He has a conspiracy theory that all the media are wrong.
He showed that in other threads as well, basically claiming that he alone is the master of all truth.
 

What should be here ? · 5,570 Posts
reality says different, doesn't it? you don't need GFE to use DLSS.

you presented a statement as fact when it is very much wrong. had you bothered to check it yourself instead of relying on the same sources you yourself have criticized, both the press and nvidia, you would have known it to be wrong. i guess its ok to use them as a shield when it fits, eh?

but no, you would rather not follow up on anything and instead spread some garbage.

no wonder this sub forum turns into toxic waste water.
He isn't wrong; the entire shebang about DLSS was, from the outset, reported to work through GFE, and the details came from the horse's mouth. Unless you have a source who confirmed to you at the launch of the RTX series that GFE wouldn't be a prerequisite for DLSS while the media ran with articles claiming the contrary.
 