
WannaBeOCer 02-24-2019 01:39 PM

[nVidia] NVIDIA DLSS: Your Questions, Answered
 
Source: https://www.nvidia.com/en-us/geforce...ions-answered/

Quote:

Q: When’s the next DLSS update for Battlefield V and Metro Exodus?

A: We are constantly working to improve image quality. Recently we updated the core of DLSS so that you get the latest model updates the moment you launch your game. So make sure you have our latest Game Ready Driver (418.91 or higher) installed.

For Battlefield V, we think DLSS delivers a big improvement in 4K and 2560x1440 performance -- up to 40% -- for the corresponding quality, but we also hear the community. For the next push, we are focusing our testing and training on improving image quality at 1920x1080 and also for ultrawide monitors (e.g. 3440x1440). The current experience at these resolutions is not where we want it.

For Metro Exodus, we’ve got an update coming that improves DLSS sharpness and overall image quality across all resolutions, which didn’t make it in by launch day. We’re also training DLSS on a larger cross section of the game, and once these updates are ready you will see another increase in quality. Lastly, we are looking into a few other reported issues, such as with HDR, and will update as soon as we have fixes.
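
For a sense of scale, here is a quick worked example of the "up to 40%" figure quoted above, as a short Python sketch; the 45 fps baseline is a made-up number, and only the 40% uplift comes from the quote.

Code:

# Quick arithmetic on the "up to 40%" figure quoted above.
# The 45 fps baseline is a hypothetical example, not an NVIDIA number.
baseline_fps = 45.0
uplift = 0.40
dlss_fps = baseline_fps * (1 + uplift)       # 63.0 fps
baseline_ms = 1000.0 / baseline_fps          # ~22.2 ms per frame
dlss_ms = 1000.0 / dlss_fps                  # ~15.9 ms per frame
print(f"{baseline_fps:.0f} fps -> {dlss_fps:.0f} fps "
      f"({baseline_ms:.1f} ms -> {dlss_ms:.1f} ms per frame)")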

ILoveHighDPI 02-24-2019 03:12 PM

I can’t wait to see how Samsung’s AI works with gaming on the 4K QLED TVs.
Nvidia isn’t the only option for this tech, and running the upscaling on your display means you have a multi-chip solution, which is theoretically more cost-efficient.
Not that the 4K QLEDs are cheap, but as a precedent for the future, if a separate chip can do the same thing as DLSS, then that is clearly the best option.

ToTheSun! 02-24-2019 06:20 PM

Quote:

Originally Posted by ILoveHighDPI (Post 27867418)
I can’t wait to see how Samsung’s AI works with gaming on the 4K QLED TVs.
Nvidia isn’t the only option for this tech, and running the upscaling on your display means you have a multi-chip solution, which is theoretically more cost-efficient.
Not that the 4K QLEDs are cheap, but as a precedent for the future, if a separate chip can do the same thing as DLSS, then that is clearly the best option.

In theory, it can never be as good because the ground truth is not of any game you might be playing. It should be fine, though, for a lot of other content. In any case, it's hard to imagine the small chip inside TVs being as good as nVidia's tensor cores at their intended usage.

Buris 02-25-2019 11:36 AM

Quote:

Originally Posted by ToTheSun! (Post 27867650)
In theory, it can never be as good because the ground truth is not of any game you might be playing. It should be fine, though, for a lot of other content. In any case, it's hard to imagine the small chip inside TVs being as good as nVidia's tensor cores at their intended usage.

Nvidia's tensor cores are just compute units.

See the Marseille mCable: better results with an actual INCREASE in image quality.

ILoveHighDPI 02-25-2019 12:22 PM

Quote:

Originally Posted by ToTheSun! (Post 27867650)
In theory, it can never be as good because the ground truth is not of any game you might be playing. It should be fine, though, for a lot of other content. In any case, it's hard to imagine the small chip inside TVs being as good as nVidia's tensor cores at their intended usage.

Tensor cores are only a small part of RTX cards. The size of the Samsung chip would not need to be large at all.
And like I was saying, Samsung is only selling this in high-margin TVs right now; they can put as much processing power in there as they want.

I doubt Nvidia is doing anything overly complicated to get their “Ground Truth” for games either; all you have to do is bump up the game resolution and record some video, and images from the PC version would carry over to consoles just fine.
You could even use emulators to upscale Nintendo games.

Hopefully game companies make it standard practice to create an AI training playlist and send it to everyone doing this sort of thing.
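
As a rough illustration of the approach described in the post above, here is a minimal Python sketch that pairs high-resolution "ground truth" captures with downscaled versions for super-resolution training. This is not NVIDIA's actual pipeline; the folder names and resolutions are hypothetical, and a real setup would render the low-resolution input natively rather than downscaling the reference frame.

Code:

# Rough sketch of the "bump up the resolution and record some video" idea above.
# Folder names and resolutions are hypothetical; a real pipeline would render
# the low-res input natively instead of downscaling the ground-truth frame.
from pathlib import Path
from PIL import Image

GROUND_TRUTH_DIR = Path("captures_4k")     # hypothetical 3840x2160 captures
PAIRS_DIR = Path("training_pairs")
INPUT_RES = (1920, 1080)                   # resolution the network would upscale from

PAIRS_DIR.mkdir(exist_ok=True)

for frame in sorted(GROUND_TRUTH_DIR.glob("*.png")):
    target = Image.open(frame).convert("RGB")           # high-res ground truth
    low_res = target.resize(INPUT_RES, Image.LANCZOS)   # simulated low-res render
    low_res.save(PAIRS_DIR / f"{frame.stem}_input.png")
    target.save(PAIRS_DIR / f"{frame.stem}_target.png")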

DNMock 02-25-2019 01:56 PM

Quote:

Originally Posted by ToTheSun! (Post 27867650)
In theory, it can never be as good because the ground truth is not of any game you might be playing. It should be fine, though, for a lot of other content. In any case, it's hard to imagine the small chip inside TVs being as good as nVidia's tensor cores at their intended usage.

As was already mentioned, tensor cores are just compute cores. With DLSS they aren't doing the job they were designed for, as they are simply bleed-over from the enterprise Tesla and Quadro cards. DLSS is a way to give them something to do rather than just wasting space on the die. Those TV chips were designed from the ground up with their specific job in mind, so ya, those TV chips are probably way better at it.
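
For reference, a minimal PyTorch sketch of the kind of workload tensor cores were built for: a large half-precision matrix multiply. On Volta/Turing GPUs an FP16 matmul like this is dispatched to the tensor cores; the matrix sizes here are arbitrary examples.

Code:

# Minimal sketch of the workload tensor cores accelerate: a dense FP16 matrix
# multiply (the same kind of math a DLSS-style network runs at inference time).
# Requires PyTorch with CUDA and an NVIDIA GPU.
import torch

if torch.cuda.is_available():
    a = torch.randn(4096, 4096, device="cuda", dtype=torch.float16)
    b = torch.randn(4096, 4096, device="cuda", dtype=torch.float16)
    c = a @ b                  # FP16 GEMM, run on tensor cores on Volta/Turing
    torch.cuda.synchronize()   # wait for the GPU before inspecting the result
    print(c.shape)
else:
    print("No CUDA device available; this sketch needs an NVIDIA GPU.")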

ToTheSun! 02-25-2019 02:24 PM

Quote:

Originally Posted by DNMock (Post 27868868)
As was already mentioned, tensor cores are just compute cores. With DLSS they aren't doing the job they were designed for, as they are simply bleed-over from the enterprise Tesla and Quadro cards. DLSS is a way to give them something to do rather than just wasting space on the die. Those TV chips were designed from the ground up with their specific job in mind, so ya, those TV chips are probably way better at it.

Alright, but ground truth is still not game specific, unlike DLSS.

8051 02-25-2019 09:06 PM

Quote:

Originally Posted by Buris (Post 27868692)
Nvidia's tensor cores are just compute units.

See the Marseille mCable: better results with an actual INCREASE in image quality.

And what kind of latency does this introduce?

What are the odds that 8 years from now DLSS will be joining PhysX on the dust pile of obsolescence? AMD certainly won't be embracing it.

Silent Scone 02-26-2019 08:11 AM

Quote:

Originally Posted by 8051 (Post 27869416)
And what kind of latency does this introduce?

What are the odds that 8 years from now DLSS will be joining PhysX on the dust pile of obsolescence? AMD certainly won't be embracing it.

AMD don't tend to embrace anything; they don't have the resources to do so on a grandiose scale. The only reason we initially see an influx in uptake of NVIDIA technologies is investment.

DNMock 02-26-2019 09:13 AM

Quote:

Originally Posted by 8051 (Post 27869416)
And what kind of latency does this introduce?

What are the odds that 8 years from now DLSS will be joining PhysX on the dust pile of obsolescence? AMD certainly won't be embracing it.

DLSS specifically? About 99.9% sure it will be sitting next to PhysX in the bleachers.

Utilization of compute cores, though, is a different story. I wouldn't be too surprised to see the number of GPU/CUDA cores remain the same for a while and then slowly dwindle away while the number of compute- and ray-tracing-specific cores continues to grow.

