[nVidia] NVIDIA DLSS: Your Questions, Answered - Page 2 - Overclock.net - An Overclocking Community

post #11 of 33 (permalink) Old 02-26-2019, 03:05 PM
New to Overclock.net
 
8051's Avatar
 
Join Date: Apr 2014
Posts: 2,530
Rep: 20 (Unique: 14)
Quote: Originally Posted by DNMock View Post
DLSS particularly? About 99.9% sure it will be sitting next to PhysX in the bleachers.

Utilization of compute cores, though, that's a different story. I wouldn't be too surprised to see the number of GPU/CUDA cores remain the same for a while and slowly dwindle away while the number of compute- and ray-tracing-specific cores continues to grow.
If AMD begins increasing the number of TMUs/ROPs/shaders instead of going with ray tracing and compute cores, I'll buy their products, because I'm more interested in high resolutions, HDR, and high framerates than in compute cores or ray tracing. It could be said AMD drives what game developers do, because AMD controls the console market. Why put any development time into DLSS or ray tracing when consoles don't support it? Does Nvidia pay or otherwise remunerate game developers to incorporate Nvidia tech into their games?
8051 is offline  
post #12 of 33 (permalink) Old 02-27-2019, 07:05 AM
New to Overclock.net
 
DNMock's Avatar
 
Join Date: Jul 2014
Location: Dallas
Posts: 3,159
Rep: 158 (Unique: 117)
Quote: Originally Posted by 8051 View Post
If AMD begins increasing the number of TMUs/ROPs/shaders instead of going with ray tracing and compute cores, I'll buy their products, because I'm more interested in high resolutions, HDR, and high framerates than in compute cores or ray tracing. It could be said AMD drives what game developers do, because AMD controls the console market. Why put any development time into DLSS or ray tracing when consoles don't support it? Does Nvidia pay or otherwise remunerate game developers to incorporate Nvidia tech into their games?
Absolutely.


DNMock is offline  
post #13 of 33 (permalink) Old 02-27-2019, 11:06 AM
Overclocker
 
JackCY's Avatar
 
Join Date: Jun 2014
Posts: 8,720
Rep: 285 (Unique: 210)
Quote: Originally Posted by ToTheSun! View Post
In theory, it can never be as good, because the ground truth is not of any game you might be playing. It should be fine, though, for a lot of other content. In any case, it's hard to imagine the small chip inside TVs being as good as nVidia's tensor cores at their intended usage.
Well, there are quite a few high-quality upscalers; it's just that neither games, GPU drivers, nor monitors offer them. TVs, by contrast, are more competitive and do offer these sorts of image-altering algorithms, which don't have much use on monitors except "gaming" monitors used to play 1440p on a 4K panel. TVs nowadays probably have fairly sophisticated, even AI-driven, features in their purpose-built processors.
A reasonable scaler: Jinc? SuperXBR? NGU? Yeah, none of these are found in games, etc. Probably not even Lanczos. Often games either do a shader resolution change or, if it's a true target-resolution change, use some crappy bicubic scaling offered by whatever middleware they use.
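For what it's worth, the gap between those last two filters is easy to demonstrate with Pillow (a minimal sketch; the file path and 4K target resolution here are just placeholder assumptions):

```python
# Upscale the same frame with bicubic and Lanczos and compare.
# Lanczos is a windowed-sinc filter and keeps edges visibly
# sharper than bicubic at the same target resolution.
from PIL import Image

def upscale_both(path, target=(3840, 2160)):
    src = Image.open(path)
    bicubic = src.resize(target, Image.BICUBIC)
    lanczos = src.resize(target, Image.LANCZOS)
    return bicubic, lanczos
```

Swapping the resampling constant is the whole difference, which is why defaulting to bicubic in a game engine is hard to excuse.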

DLSS not being beneficial at high FPS is a major drawback.

---

Oh yes, Nvidia throws money/resources/developers at game studios to get them to use its middleware/crapworks, get its logo in the game, optimize the game for its hardware and not the competitor's, ...

Neither the idea behind DLSS nor the one behind ray tracing is bad, but the performance and quality are a result of the way Nvidia is implementing them right now. Even Q2VKPT... you may think some random dude made it, but as far as I remember, if you look it up, it's a researcher who works with Nvidia.

Last edited by JackCY; 02-27-2019 at 11:10 AM.
JackCY is offline  
post #14 of 33 (permalink) Old 02-27-2019, 11:22 AM
professional curmudgeon
 
looniam's Avatar
 
Join Date: Apr 2009
Posts: 9,051
Rep: 761 (Unique: 441)
Quote: Originally Posted by JackCY View Post
Neither the idea behind DLSS nor the one behind ray tracing is bad, but the performance and quality are a result of the way Nvidia is implementing them right now. Even Q2VKPT... you may think some random dude made it, but as far as I remember, if you look it up, it's a researcher who works with Nvidia.
He's a grad student who interned at NV:

https://cg.ivd.kit.edu/english/schied/index.php

Quote:
08/2016 - 01/2017 Internship at NVIDIA Research
01/2014 - 12/2015 Stipend Landesgraduiertenförderung
Since 11/2013 Researcher / Ph.D. student at Computer Graphics Lab, KIT
10/2013 Diploma with Honors in Computer Science at Ulm University
04/2012 - 08/2012, 10/2012 - 02/2013, 08/2013 - 09/2013 Research assistant, Ulm University - Institute of Embedded Systems/Real-Time Systems
03/2010 - 09/2011 Research assistant, Ulm University - Institute of Media Informatics
08/2008 - 02/2010 Research assistant, Ulm University
It only took 3 seconds to Google.

"Name as many uses for a brick as you can in one minute." - interview at graphics-chip maker Nvidia for a campaign-manager job
Fermi: it's better to burn out than fade away.
Remember the golden rule of statistics: A personal sample size of one is a sufficient basis upon which to draw universal conclusions.
"The more you buy, the more you save." - Jensen Huang GTC 2018
loon 3.2 (18 items)
CPU: i7-3770K
Motherboard: Asus P8Z77-V Pro
GPU: EVGA 980 Ti SC+
RAM: 16GB PNY DDR3 1866
Hard Drive: PNY 1311 240GB
Hard Drive: 1 TB Seagate
Hard Drive: 3 TB WD Blue
Optical Drive: DVD DVDRW+/-
Power Supply: EVGA SuperNova 750 G2
Cooling: EKWB P280 kit
Cooling: EK-VGA Supremacy
Case: Stryker M [hammered and drilled]
Operating System: Win X
Monitor: LG 24MC57HQ-P
Keyboard: Ducky Zero [blues]
Mouse: Corsair M65
Audio: SB Recon3D
Audio: Klipsch ProMedia 2.1


looniam is offline  
post #15 of 33 (permalink) Old 02-27-2019, 12:50 PM
New to Overclock.net
 
white owl's Avatar
 
Join Date: Apr 2015
Location: The land of Nod
Posts: 5,165
Rep: 125 (Unique: 95)
I have no doubt that they can make games look better, but when the GPU costs $800 to $1200 and it takes THIS long to optimize TWO games, what faith can we have in the future of this tech?
I expected the GPU to be able to do this on its own as you play the game, always learning and getting better (to some extent).

Quote: Originally Posted by SpeedyVT
If you're not doing extreme things to parts for the sake of extreme things regardless of the part you're not a real overclocker.
Quote: Originally Posted by doyll View Post
The key is generally not which brands are good but which specific products are. Motherboards and GPUs are perfect examples of companies having everything from golden to garbage function/quality.
Hot n Bothered (12 items)
CPU: 4790K 4.7GHz
Motherboard: Asus Sabertooth Z97 MkII 2
GPU: EVGA GTX 1080 SC
RAM: 16GB G.Skill Sniper 2400MHz
Hard Drive: 2x Kingston V300 120GB RAID 0
Hard Drive: WD Blue
Power Supply: Seasonic 620W M12 II EVO
Cooling: Cooler Master 212 Evo
Case: Corsair 450D
Operating System: Windows 10
Monitor: Nixeus EDG27
Other: I have pretty lights.
white owl is offline  
post #16 of 33 (permalink) Old 02-27-2019, 01:52 PM
New to Overclock.net
 
8051's Avatar
 
Join Date: Apr 2014
Posts: 2,530
Rep: 20 (Unique: 14)
Quote: Originally Posted by white owl View Post
I have no doubt that they can make games look better, but when the GPU costs $800 to $1200 and it takes THIS long to optimize TWO games, what faith can we have in the future of this tech?
I expected the GPU to be able to do this on its own as you play the game, always learning and getting better (to some extent).
I'm hoping Nvidia continues wasting GPU die space on ray tracing and compute cores for its gaming segment; maybe AMD can leverage this to out-perform Nvidia where gaming performance really matters (HDR and high-FPS gaming on 144 Hz+ monitors).
8051 is offline  
post #17 of 33 (permalink) Old 02-27-2019, 02:46 PM
New to Overclock.net
 
DNMock's Avatar
 
Join Date: Jul 2014
Location: Dallas
Posts: 3,159
Rep: 158 (Unique: 117)
Quote: Originally Posted by 8051 View Post
I'm hoping Nvidia continues wasting GPU die space on ray tracing and compute cores for its gaming segment; maybe AMD can leverage this to out-perform Nvidia where gaming performance really matters (HDR and high-FPS gaming on 144 Hz+ monitors).
I hope Nvidia does whatever is needed to improve the visual quality and frame rates of games to the best of its ability, and that AMD catches up in performance, forcing competitive pricing and continuing to drive innovation.





If you are gonna dream, dream big.


DNMock is offline  
post #18 of 33 (permalink) Old 02-27-2019, 03:03 PM
New to Overclock.net
 
white owl's Avatar
 
Join Date: Apr 2015
Location: The land of Nod
Posts: 5,165
Rep: 125 (Unique: 95)
I'm not sure why this is so hard for so many to understand. These components weren't placed there for ray tracing or DLSS; they were simply repurposed to do those things. If they wanted to sell four-figure cards to researchers, they needed special hardware to do it. We're the people who don't give a damn if a few bits don't work, who don't need most of those features, who don't care about power consumption as long as the performance is there.
The only place they messed up was charging more for features that were already there and that no one asked for. Sales reflect that: if you're going to sell someone a broken/downgraded part, you need to charge less for it.

white owl is offline  
post #19 of 33 (permalink) Old 02-27-2019, 03:04 PM - Thread Starter
sudo apt install sl
 
WannaBeOCer's Avatar
 
Join Date: Dec 2009
Posts: 4,020
Rep: 140 (Unique: 101)
Quote: Originally Posted by white owl View Post
I have no doubt that they can make games look better, but when the GPU costs $800 to $1200 and it takes THIS long to optimize TWO games, what faith can we have in the future of this tech?
I expected the GPU to be able to do this on its own as you play the game, always learning and getting better (to some extent).
It only took them 7 days to fix DLSS image quality in Metro Exodus, and it will continue to improve. Battlefield V is just an issue with EA DICE's garbage Frostbite DX12 engine; we've seen the same broken engine in Battlefield 1, and it's still not fixed. AMD/nVidia provide the hardware; it's up to the developers to properly utilize it, or to contact nVidia/AMD to learn how to use the hardware features. AMD/ATi and nVidia usually release new cards that support the latest DirectX API around the same time. This time around, AMD is focusing on actually making money with the continued growth of its datacenter GPUs.

Quote: Originally Posted by white owl View Post
I'm not sure why this is so hard for so many to understand. These components weren't placed there for ray tracing or DLSS; they were simply repurposed to do those things. If they wanted to sell four-figure cards to researchers, they needed special hardware to do it. We're the people who don't give a damn if a few bits don't work, who don't need most of those features, who don't care about power consumption as long as the performance is there.
The only place they messed up was charging more for features that were already there and that no one asked for. Sales reflect that: if you're going to sell someone a broken/downgraded part, you need to charge less for it.
RT cores were placed on the cards for ray tracing in games. They've been working on hybrid ray tracing for gaming since 2008 with their OptiX API. It's just another step, like tessellation: when DirectX 11 cards were released, they were barely touching 60 FPS at 1080p.

You can see in this video that he mentions it's a step toward gaming:

https://www.youtube.com/watch?v=oK4UGnwwuEM


Maximus (21 items)
CPU: Core i7 6700K 4.8GHz @ 1.4V
Motherboard: Maximus VIII Formula
GPU: Radeon VII @ 1950MHz/1200MHz w/ 1070mV
RAM: G.Skill 32GB 3200MHz
Hard Drive: Samsung 850 Evo 1TB
Hard Drive: Samsung 850 Evo 1TB
Hard Drive: Samsung 850 Evo 500GB
Power Supply: EVGA SuperNova 1200W P2
Cooling: EK Supremacy Full Copper Clean
Cooling: XSPC D5 Photon v2
Cooling: Black Ice Gen 2 GTX360 x2
Case: Thermaltake Core X5 Tempered Glass Edition
Operating System: Kubuntu 18.04.1
Operating System: Hackintosh macOS 10.14.3
Operating System: Windows 10 Pro
Monitor: Acer XF270HUA
Keyboard: Cherry MX Board 6.0
Mouse: Logitech G600
Mouse: Alugraphics GamerArt
Audio: Definitive Technology Incline
Audio: SMSL M8A

Last edited by WannaBeOCer; 02-27-2019 at 03:08 PM.
WannaBeOCer is online now  
post #20 of 33 (permalink) Old 02-27-2019, 03:16 PM
New to Overclock.net
 
white owl's Avatar
 
Join Date: Apr 2015
Location: The land of Nod
Posts: 5,165
Rep: 125 (Unique: 95)
Quote: Originally Posted by WannaBeOCer View Post
It only took them 7 days to fix DLSS image quality in Metro Exodus, and it will continue to improve. Battlefield V is just an issue with EA DICE's garbage Frostbite DX12 engine; we've seen the same broken engine in Battlefield 1, and it's still not fixed. AMD/nVidia provide the hardware; it's up to the developers to properly utilize it, or to contact nVidia/AMD to learn how to use the hardware features. AMD/ATi and nVidia usually release new cards that support the latest DirectX API around the same time. This time around, AMD is focusing on actually making money with the continued growth of its datacenter GPUs.



RT cores were placed on the cards for ray tracing in games. They've been working on hybrid ray tracing for gaming since 2008 with their OptiX API. It's just another step, like tessellation: when DirectX 11 cards were released, they were barely touching 60 FPS at 1080p.

You can see in this video he mentions that it's a step toward gaming:

https://www.youtube.com/watch?v=oK4UGnwwuEM
Only 7 days? So Nvidia doesn't help with the coding of the game at all? They had no idea DLSS didn't work on launch day? They had no access to the beta, or even the alpha, to ensure that their customers would be satisfied with not only their hardware but their software? Didn't they say they'd be working with devs to help implement these features? If this is to be adopted in many games, how are they going to cope with that? Or is this just smoke and mirrors to sell hardware that they needed for something else entirely?
Oh yeah, and they only improved a single resolution. WTH? Is 1080p going to take another 7 days? There are so many different resolutions and aspect ratios for which they sold different tiers of cards. Maybe they helped the 2080 Ti guys here, but what about the 2060 owners? They wait another 7 days while Nvidia sorts out something that should have been sorted out on launch day?

white owl is offline  