[TechSpot] Nvidia DLSS in 2020: Stunning Results

post #31 of 49 (permalink) Old 02-27-2020, 07:31 PM
Vermin Supreme 2020
 
skupples's Avatar
 
Join Date: Apr 2012
Location: Bradentucky
Posts: 25,170
Rep: 732 (Unique: 385)
GTX is dead, at least as far as gamers are concerned. The entire line will be RTX next time around, except for stupidly low-end stuff that 99.99% of gamers aren't looking at using. Maybe a 2660 Ti, MAYBE.

Quote: Originally Posted by ZealotKi11er View Post
I got an RTX 2080 Ti for DXR and sold it because it was just a tech demo. I envy people that have fast GPUs and don't run 4K.
I had to retire my 4K60 screen... though half of it glitching out for a split second every so often helped nudge me in the 3440x1440 direction. I'm missing the space while flying around in Everspace, though. I always catch myself trying to look up in games.

I'm planning to get the 48-inch LG OLED, though.

Add me on Steam, same name
R.I.P. Zawarudo, may you OC angels' wings in heaven.
If something appears too good to be true, it probably is.
skupples is offline  
post #32 of 49 (permalink) Old 02-27-2020, 08:30 PM
New to Overclock.net
 
Join Date: Mar 2018
Posts: 336
Rep: 11 (Unique: 9)
Quote: Originally Posted by ILoveHighDPI View Post

DLSS also automatically comes with the added baggage of RTX. If we see DLSS in a top-of-the-line GTX card? That would have no competition.
For now RTX is probably enough of an anchor to keep the competition close.
Well, true: Turing has both RT cores and tensor cores, whereas the current GTX lineup has neither, so the two currently come as a package; it's all or nothing. (Volta did have tensor cores without RT cores, but that is splitting hairs, as Volta was not used on any mainstream cards.) In theory Nvidia could bring out a high-end GTX lineup with tensor cores and no RT cores, but it is hard to see where that would sit in their lineup or who the target market would be.

Of course, the DLSS implementation used in Control did not use the tensor cores at all, but that apparently was a one-off setup.

EDIT:
Another article - https://www.techradar.com/au/news/do...across-5-games

Last edited by clannagh; 02-27-2020 at 08:47 PM.
clannagh is offline  
post #33 of 49 (permalink) Old 02-28-2020, 12:41 AM
New to Overclock.net
 
Nizzen's Avatar
 
Join Date: Apr 2013
Location: Norway
Posts: 2,019
Rep: 71 (Unique: 57)
Quote: Originally Posted by ZealotKi11er View Post
I got RTX 2080 Ti for DXR and sold it because it was just a tech Demo. I envy people that have fast GPUs and dont run 4K.
LG 950F 3440x1440 144Hz for gaming here. I need a faster GPU for higher resolutions where SLI isn't working. Waiting for DP 2.0 for my next monitor.

CPU (24/7): 7980XE / 7900X | MOBO: ASUS Rampage Apex X299 / ASRock Taichi X299 | RAM: 4x 3600C15 / 4x 4266C19 / 4x 4000C17 | GPU: 2x MSI Tri X 2080 Ti / 1060ti | STORAGE: Areca 1883i HDD R6 + Samsung 850 PROs R0 - 3x SM961 NVMe 1TB - 3x Intel Optane 900p 480GB | CASE: LD Cooling V8 R/B / LD Cooling reverse R/B | MONITOR: Acer X34 / Asus 27" AQ 4K G-Sync | ASRock X570 Taichi / 3900X
Nizzen is offline  
post #34 of 49 (permalink) Old 02-28-2020, 04:40 AM
New to Overclock.net
 
Nineball_Seraph's Avatar
 
Join Date: Jan 2018
Posts: 138
Rep: 4 (Unique: 4)
Quote: Originally Posted by TK421 View Post
Per-game tuning would probably work if developers weren't lazy bastards
I've always felt this is why tech is always so damn slow in gaming. Devs are simply too damn lazy to incorporate it. How long did it take for devs to realize that quad-core and higher CPUs existed, or that HT existed, before they started taking advantage? Devs seem to only utilize tech once they are forced to, rather than being pioneers and implementing new tech even if it's not needed.

Maybe it's because of publishers, though, forcing devs to just churn out games without time to really experiment. Who knows.

Heatware: Nineball |

Main Rig: Lian Li Dynamic XL | [email protected] (1.26v) | EVGA XC Ultra Gaming 2080Ti 2160Mhz on stock BIOS | 16 G.Skill Trident RGB @ 4000MHZ | Asus Hero Max X | Heatkiller V CPU block, EKWB Full cover block, Aquacomputer D5 Next, 2x Alphacool nexos rads | Corsair AX1200i

Last edited by Nineball_Seraph; 02-28-2020 at 04:48 AM.
Nineball_Seraph is offline  
post #35 of 49 (permalink) Old 02-28-2020, 10:13 AM
New to Overclock.net
 
DNMock's Avatar
 
Join Date: Jul 2014
Location: Dallas
Posts: 3,738
Rep: 175 (Unique: 129)
Quote: Originally Posted by clannagh View Post
Well, true: Turing has both RT cores and tensor cores, whereas the current GTX lineup has neither, so the two currently come as a package; it's all or nothing. (Volta did have tensor cores without RT cores, but that is splitting hairs, as Volta was not used on any mainstream cards.) In theory Nvidia could bring out a high-end GTX lineup with tensor cores and no RT cores, but it is hard to see where that would sit in their lineup or who the target market would be.

Of course, the DLSS implementation used in Control did not use the tensor cores at all, but that apparently was a one-off setup.

EDIT:
Another article - https://www.techradar.com/au/news/do...across-5-games
It's a pipe dream, but I would dance like a schoolgirl if Nvidia announced that they would be moving all their tensor and RT cores onto a dedicated separate card, à la how PhysX used to work.

A GPU with nothing but tensor/RT cores as a supplemental second GPU alongside a standard GTX GPU would be amazing.


DNMock is offline  
post #36 of 49 (permalink) Old 02-28-2020, 10:15 AM
PC Evangelist
 
ZealotKi11er's Avatar
 
Join Date: May 2007
Location: Toronto, CA
Posts: 46,473
Rep: 1806 (Unique: 1179)
Quote: Originally Posted by DNMock View Post
It's a pipe dream, but I would dance like a schoolgirl if Nvidia announced that they would be moving all their tensor and RT cores onto a dedicated separate card, à la how PhysX used to work.

A GPU with nothing but tensor/RT cores as a supplemental second GPU alongside a standard GTX GPU would be amazing.
Or they could have one die for graphics and one die for tensor/RT cores.

Yamato (10 items)
CPU: AMD Ryzen 7 3700X
Motherboard: ASUS TUF Gaming X570-Plus (Wi-Fi)
GPU: AMD Radeon RX 5700 XT
RAM: G.SKILL Trident Z RGB (2x8GB) DDR4 3200MHz CL14
Hard Drive: Samsung SM961 512GB
Hard Drive: HGST DeskStar NAS 6TB
Power Supply: EVGA SuperNOVA 750 P2
Cooling: Gamdias Chione M1A-280R
Case: Fractal Design Meshify C TG
Operating System: Microsoft Windows 10 Pro 64 Bit

Ishimura (13 items)
CPU: Intel Core i7-3770K @ 4.8GHz
Motherboard: ASRock Z77E-ITX
GPU: AMD Radeon Vega Frontier Edition
RAM: AVEXIR Blitz 1.1 16GB DDR3-2400MHz CL10
Hard Drive: SanDisk Ultra II 960GB
Hard Drive: Toshiba X300 5TB
Power Supply: EVGA SuperNOVA 750 G3
Cooling: Corsair H100i GTX
Case: Fractal Design Define Nano S
Operating System: Microsoft Windows 10 Pro 64 Bit
Monitor: LG OLED55C7P
Keyboard: Cooler Master MasterKeys MK750
Mouse: Finalmouse Air58 Ninja


ZealotKi11er is offline  
post #37 of 49 (permalink) Old 02-28-2020, 02:14 PM
Overclocker
 
JackCY's Avatar
 
Join Date: Jun 2014
Posts: 10,932
Rep: 361 (Unique: 258)
That article is almost 2 weeks old now, and DLSS still isn't a turn-it-on-and-it-just-works thing. Each game gets a different implementation/level of support from NV, and only a couple of the big, newest games actually have the DLSS variant that is somewhat worth using. Want to turn it on for your favorite game? No chance yet. Support is minimal: even older DLSS-supported games are not going to be updated to the latest DLSS variants, no in-driver toggle is offered, and there is still no 2x mode to use it as AA without upscaling; they haven't delivered that mode since launch.
All in all, they have reworked the whole thing multiple times, with different models and even a shader implementation, to get it anywhere near usable compared to the initial solution that flopped hard.

Yes, the latest variant looks decent if one really can't run native resolution and refuses to reduce individual settings.

I don't even wanna know how much latency there would be if all these features were on a different die, let alone a different card. It's already bad enough.
JackCY is offline  
post #38 of 49 (permalink) Old 02-28-2020, 03:14 PM
New to Overclock.net
 
DNMock's Avatar
 
Join Date: Jul 2014
Location: Dallas
Posts: 3,738
Rep: 175 (Unique: 129)
Quote: Originally Posted by JackCY View Post
That article is almost 2 weeks old now, and DLSS still isn't a turn-it-on-and-it-just-works thing. Each game gets a different implementation/level of support from NV, and only a couple of the big, newest games actually have the DLSS variant that is somewhat worth using. Want to turn it on for your favorite game? No chance yet. Support is minimal: even older DLSS-supported games are not going to be updated to the latest DLSS variants, no in-driver toggle is offered, and there is still no 2x mode to use it as AA without upscaling; they haven't delivered that mode since launch.
All in all, they have reworked the whole thing multiple times, with different models and even a shader implementation, to get it anywhere near usable compared to the initial solution that flopped hard.

Yes, the latest variant looks decent if one really can't run native resolution and refuses to reduce individual settings.

I don't even wanna know how much latency there would be if all these features were on a different die, let alone a different card. It's already bad enough.
If it's between higher latency at 60 fps @ 4K and lower latency at 20 fps @ 4K, give me the higher-latency 60 fps.
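Rough back-of-the-envelope numbers for that trade-off (a sketch with assumed pipeline depths, not measurements of any actual setup):

Code:
/* Toy frame-time arithmetic for the 20 fps vs 60 fps trade-off above.
 * The extra "pipeline" frames of latency are assumptions, not measurements. */
#include <stdio.h>

int main(void)
{
    const double fps[]             = { 20.0, 60.0 };
    const double pipeline_frames[] = {  1.0,  3.0 };  /* assumed extra frames of delay */

    for (int i = 0; i < 2; i++) {
        double frame_ms = 1000.0 / fps[i];                       /* render time per frame */
        double total_ms = frame_ms * (1.0 + pipeline_frames[i]); /* render + pipeline */
        printf("%2.0f fps: %5.1f ms/frame, +%.0f frame(s) in flight -> ~%5.1f ms\n",
               fps[i], frame_ms, pipeline_frames[i], total_ms);
    }
    /* 20 fps is 50 ms/frame, so even one extra frame is ~100 ms end to end;
     * 60 fps is ~16.7 ms/frame, so three extra frames is still only ~67 ms. */
    return 0;
}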


DNMock is offline  
post #39 of 49 (permalink) Old 02-28-2020, 11:04 PM
New to Overclock.net
 
ILoveHighDPI's Avatar
 
Join Date: Oct 2011
Posts: 3,528
Rep: 138 (Unique: 88)
Quote: Originally Posted by DNMock View Post
The only problem with that is that AA itself is on its last legs, since the need for AA goes down as resolution increases. For me at least, at 4K I see either little to no improvement or a decrease in quality, with the image getting fuzzier.
There definitely are a lot of crappy AA methods out there.
Really you need about 300 pixels per degree before jaggies start to blur out; at a 27" screen size that still means 8K or higher resolution for the average person at the average view distance.
Yes, it is a scale of diminishing returns: basically your cost-vs-sharpness ratio is linear up to 1080p (~60 PPD), and then the relative cost of sharper imagery starts to rise. I'd say the cost-to-benefit of 4K is still practically the same as 1080p, but going to 8K is definitely a sharp drop in returns.
8K still isn't "no returns" for your effort, but it's getting close enough that I doubt anyone other than IMAX should legitimately consider going ahead to something like 16K.
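For anyone who wants to sanity-check those PPD figures against their own setup, here is a quick sketch; the 27-inch 16:9 panel and the 60 cm viewing distance are just assumptions, and the result scales with whatever distance you plug in:

Code:
/* Rough pixels-per-degree estimate for a flat 16:9 panel viewed head-on.
 * The 27" diagonal and 60 cm viewing distance are assumptions; edit to taste.
 * PPD here = horizontal pixels / horizontal field of view in degrees.
 * Build with: cc ppd.c -lm */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double pi          = acos(-1.0);
    const double diagonal_in = 27.0;            /* assumed screen diagonal */
    const double distance_in = 60.0 / 2.54;     /* assumed 60 cm viewing distance */
    const double width_in    = diagonal_in * 16.0 / sqrt(16.0 * 16.0 + 9.0 * 9.0);

    /* horizontal field of view subtended by the screen, in degrees */
    const double fov_deg = 2.0 * atan(width_in / (2.0 * distance_in)) * 180.0 / pi;

    const int h_pixels[] = { 1920, 3840, 7680 }; /* 1080p, 4K, 8K */
    for (int i = 0; i < 3; i++)
        printf("%4d px wide: ~%.0f pixels per degree at this distance\n",
               h_pixels[i], h_pixels[i] / fov_deg);
    return 0;
}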

The good news is that Variable Rate Shading should start to give us localized supersampling everywhere it matters. Ideally it would work practically the same as MSAA, just compatible with modern game engines now.
If it's possible to send the full pixel mapping to the display with VRS, then your 4K+AA becomes native 8K imagery.
If in the near future we see mass adoption of variable resolution to sharpen high-contrast lines, it should mean that 8K could be utilized at zero added cost.
ILoveHighDPI is offline  
post #40 of 49 (permalink) Old 02-29-2020, 01:10 PM
New to Overclock.net
 
8051's Avatar
 
Join Date: Apr 2014
Posts: 3,589
Rep: 30 (Unique: 21)
Quote: Originally Posted by Nineball_Seraph View Post
I've always felt this is why tech is always so damn slow in gaming. Devs are simply too damn lazy to incorporate it. How long did it take for devs to realize that quad-core and higher CPUs existed, or that HT existed, before they started taking advantage? Devs seem to only utilize tech once they are forced to, rather than being pioneers and implementing new tech even if it's not needed.

Maybe it's because of publishers, though, forcing devs to just churn out games without time to really experiment. Who knows.
I used to know a guy who worked as a games developer; 7-day work weeks during crunch time were the norm. I've also worked with the Linux C pthreads library, and threaded programming is a PITA.

The "churn out games without time to experiment" point sounds like a good explanation for why hyperthreading and SMT haven't been widely exploited. It seems like the pressure to shove games out the door has increased because SLI and CrossFire have all but died. Game developers seemed more willing to implement CrossFire and SLI in the past. Maybe the console-ification of the video game industry has something to do with the death of SLI/CrossFire.
8051 is offline  