[Nvidia] DLSS boosts Port Royal Benchmark performance by up to 50% - Page 5 - Overclock.net - An Overclocking Community

post #41 of 49 (permalink) Old 02-09-2019, 09:31 PM
Otherworlder
 
epic1337's Avatar
 
Join Date: Feb 2011
Posts: 7,070
Rep: 213 (Unique: 121)
Yes, the AI is trained on Nvidia's servers, learning by comparing native 4K renders against the upscaled output.
By doing this, the upscaled output ends up much closer to native 4K in quality.
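Conceptually, that training process is just supervised learning against the native-4K ground truth: upscale a low-res frame, measure the per-pixel error versus the native render, and nudge the weights to shrink it. A toy sketch of that loop (a single learned scale factor stands in for Nvidia's actual network, which is obviously far more complex):

```python
import numpy as np

rng = np.random.default_rng(0)

def upscale(frame, w):
    # naive 2x upscale: repeat pixels, then apply the learned correction w
    return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1) * w

def loss(w, low_res, native):
    # per-pixel squared error against the native "ground truth" render
    return np.mean((upscale(low_res, w) - native) ** 2)

# toy ground truth: the "native" frame happens to be 0.8x the repeated pixels
low = rng.random((8, 8))
native = np.repeat(np.repeat(low, 2, axis=0), 2, axis=1) * 0.8

w = 1.0
for _ in range(200):
    # numerical gradient of the error w.r.t. w, then a gradient-descent step
    grad = (loss(w + 1e-5, low, native) - loss(w - 1e-5, low, native)) / 2e-5
    w -= 0.5 * grad

print(round(w, 3))  # converges toward 0.8, matching the ground truth
```

The real network learns millions of weights from thousands of game frames, but the shape of the loop — render both versions, compare, update — is the same.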

trolling an adult is very dangerous, don't try it at home nor at work. you don't want to play tag with a rabid man.
epic1337 is online now  
post #42 of 49 (permalink) Old 02-10-2019, 07:12 AM
mfw
 
ToTheSun!'s Avatar
 
Join Date: Jul 2011
Location: Terra
Posts: 5,998
Rep: 359 (Unique: 188)
Quote: Originally Posted by guitarmageddon88 View Post
So what I'm confused about is what we are exactly training to "learn". Do my results get stored in a server? Is there a DLSS info file somewhere within the program directory that keeps learning? How does it continuously improve and is that data stored with the user?
https://www.reddit.com/r/nvidia/comm..._does_it_work/

What stunned me was that I got a 50% (!) performance increase at 1440p in Port Royal with DLSS, while the graphical result was subjectively better overall. It introduced some artifacts, but the quality of reflections was much better.

Now, this is a best case scenario, of course, but if we can get even just 20-30% performance increases for future games with (quasi) completely raytraced graphics (which 7nm nVidia cards might be able to realize), I really don't see how machine learning, with Tensor Cores and AMD analogues, can ever not be part of graphics cards again.
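The magnitude of that gain is plausible from pixel counts alone. DLSS renders internally at a lower resolution before upscaling; 1080p is the commonly cited internal resolution for 1440p output (an assumption here, not an official spec):

```python
# Rough plausibility check for the ~50% figure: shading workload
# scales roughly with pixel count, so compare output vs. internal
# resolution. The 1080p internal figure is an assumption.
native = 2560 * 1440    # 1440p output pixels
internal = 1920 * 1080  # assumed internal render resolution
ratio = native / internal
print(round(ratio, 2))  # ≈ 1.78x fewer pixels shaded per frame
```

With nearly 1.8x fewer pixels to shade per frame, a 50% frame-rate gain leaves room for the tensor-core upscaling pass itself.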

CPU
Intel 6700K
Motherboard
Asus Z170i
GPU
MSI 2080 Sea Hawk X
RAM
G.skill Trident Z 3200CL14 8+8
Hard Drive
Samsung 850 EVO 1TB
Hard Drive
Crucial M4 256GB
Power Supply
Corsair SF600
Cooling
Noctua NH C14S
Case
Fractal Design Core 500
Operating System
Windows 10 Education
Monitor
ViewSonic XG2703-GS
Keyboard
Cooler Master Quickfire TK
Mouse
Corepadded Logitech G703
Mousepad
Cooler Master MP510
Audio
Fiio E17K v1.0 + Beyerdynamic DT 1990 PRO (B pads)
ToTheSun! is offline  
post #43 of 49 (permalink) Old 02-10-2019, 03:30 PM
New to Overclock.net
 
guitarmageddon88's Avatar
 
Join Date: May 2010
Location: Probably racing....
Posts: 1,560
Rep: 70 (Unique: 62)
Quote: Originally Posted by ToTheSun! View Post
https://www.reddit.com/r/nvidia/comm..._does_it_work/

What stunned me was that I got a 50% (!) performance increase at 1440p in Port Royal with DLSS, while the graphical result was subjectively better overall. It introduced some artifacts, but the quality of reflections was much better.

Now, this is a best case scenario, of course, but if we can get even just 20-30% performance increases for future games with (quasi) completely raytraced graphics (which 7nm nVidia cards might be able to realize), I really don't see how machine learning, with Tensor Cores and AMD analogues, can ever not be part of graphics cards again.
Yes, I agree. You can tell there was some anti-aliasing happening, but it did look superior in my opinion. Maybe the smoothness of the rendering has to do with that just as much as literal image quality (which some have said is not as good, as in the Final Fantasy examples).

Sandy-Capable
(14 items)
CPU
i7-8700k
Motherboard
ASUS Maximus X Code
GPU
MSI RTX 2080 Gaming Trio
RAM
GSkill Trident Z Rgb
Hard Drive
970 evo
Power Supply
EVGA Supernova G3
Cooling
H150i Pro
Case
LianLi PC011 Air
Operating System
Windows 10
Monitor
HP Omen 27"
Keyboard
Corsair K70
Mouse
Razer Abyssus
CPU
i7 2600k @ 4.5ghz
Motherboard
ASUS P8Z68 Deluxe Gen 3
GPU
MSI GTX 580 Lightning Xtreme
GPU
MSI GTX 580 Lightning Xtreme
RAM
Gskill Sniper 8gb 1866mHz @ 1.35v
Hard Drive
Samsung 840 pro
Optical Drive
Lite-On
Power Supply
Corsair hx 750
Cooling
rs 240
Case
OBSIDIAN 800D
Operating System
WINDOWS 7 64
Monitor
Alienware aw2310
Keyboard
corsair k70
Audio
Logitech Z906
guitarmageddon88 is offline  
post #44 of 49 (permalink) Old 02-10-2019, 03:48 PM
New to Overclock.net
 
senileoldman's Avatar
 
Join Date: Apr 2017
Posts: 287
Rep: 11 (Unique: 11)
It would be interesting to see if the next generation of consoles starts using Nvidia hardware, or if AMD puts AI cores on their cards.
senileoldman is offline  
post #45 of 49 (permalink) Old 02-10-2019, 05:19 PM
Graphics Junkie
 
UltraMega's Avatar
 
Join Date: Feb 2017
Posts: 459
Rep: 7 (Unique: 7)
Quote: Originally Posted by epic1337 View Post
erm, you make it sound like community contribution is a waste of resources.
but sure, only Nvidia users that are interested in obscure games would benefit from this, the others could just simply ignore DLSS altogether.
This wouldn't be "community contribution"; it would be a free supercompute cluster for Nvidia. You might be interested in using your hardware as a node in an Nvidia cluster, but it would be a bad move for Nvidia for obvious reasons, starting with the sheer desperation of the idea. Nvidia releasing $1200 GPUs and then asking users to help compile the data needed to properly utilize those GPUs would be laughable from a marketing standpoint.

i7 7700K @ 4.2GHz
16GB DDR4 3200MHz
GeForce 1080 Ti
UltraMega is offline  
post #46 of 49 (permalink) Old 02-10-2019, 08:06 PM
PC Evangelist
 
ZealotKi11er's Avatar
 
Join Date: May 2007
Location: Toronto, CA
Posts: 45,306
Rep: 1789 (Unique: 1170)
Quote: Originally Posted by senileoldman View Post
Would be interesting to see if the next generation of consoles start using Nvidia hardware, or if AMD puts AI cores on their cards.
1) Next gen, as far as we know, uses Navi, which does not have DLSS/ray tracing.
2) This technology makes zero sense for consoles; you are allocating die space for it. Nvidia is just trying to make use of the idle tensor cores and ray tracing cores.

Ishimura
(13 items)
Yamato
(10 items)
CPU
Intel Core i7-3770K @ 4.8GHz
Motherboard
ASRock Z77E-ITX
GPU
AMD Radeon Vega Frontier Edition
RAM
AVEXIR Blitz 1.1 16GB DDR3-2400MHz CL10
Hard Drive
SanDisk Ultra II 960GB
Hard Drive
Toshiba X300 5TB
Power Supply
EVGA SuperNOVA 750 G3
Cooling
Corsair H100i GTX
Case
Fractal Design Define Nano S
Operating System
Microsoft Windows 10 Pro 64 Bit
Monitor
LG OLED55C7P
Keyboard
Cooler Master MasterKeys MK750
Mouse
Finalmouse Air58 Ninja
CPU
Intel Core i7-6700K @ 4.6GHz
Motherboard
ASUS Z170 Deluxe
GPU
EVGA GeForce GTX 1080 Ti Hybrid
RAM
Corsair Vengeance LPX 16GB DDR4-3000 CL15
Hard Drive
Samsung SM961 512GB
Hard Drive
HGST DeskStar NAS 6TB
Power Supply
EVGA SuperNOVA 750 P2
Cooling
Cooler Master Nepton 280L
Case
Fractal Design Meshify C TG
Operating System
Microsoft Windows 10 Pro 64 Bit
ZealotKi11er is offline  
post #47 of 49 (permalink) Old 02-11-2019, 12:25 AM
New to Overclock.net
 
ILoveHighDPI's Avatar
 
Join Date: Oct 2011
Posts: 3,146
Rep: 132 (Unique: 84)
If anyone would use "AI cores" for actually running NPC AI, that sounds awesome; heck yeah, I'll pay extra for that. I keep thinking that Tensor Cores have a much brighter future in gaming than RT cores do.
ILoveHighDPI is offline  
post #48 of 49 (permalink) Old 02-11-2019, 01:46 AM
Otherworlder
 
epic1337's Avatar
 
Join Date: Feb 2011
Posts: 7,070
Rep: 213 (Unique: 121)
Quote: Originally Posted by ILoveHighDPI View Post
If anyone would use "AI cores" for actually running NPC AI, that sounds awesome; heck yeah, I'll pay extra for that. I keep thinking that Tensor Cores have a much brighter future in gaming than RT cores do.
Or a full-blown AI bot.
https://www.engadget.com/2019/01/24/...tion-tlo-mana/

Though self-learning features aren't limited to NPC AI; they could be used for map creation, obstacle placement, or unit placement.
E.g. imagine a map or scenario that gets progressively harder, instead of relying on randomized or predefined results.
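That progressive-difficulty idea can be sketched without any ML at all: a simple controller that ratchets, say, obstacle density up while the player keeps winning (all names here are made up for illustration; a learned model could replace the rule):

```python
def next_difficulty(current, player_won, step=0.1, lo=0.0, hi=1.0):
    """Ratchet difficulty (e.g. obstacle density) up after a win,
    down after a loss, clamped to [lo, hi]."""
    current += step if player_won else -step
    return max(lo, min(hi, current))

# a player mostly winning pushes the map harder each round
d = 0.5
for won in [True, True, True, False, True]:
    d = next_difficulty(d, won)
print(round(d, 2))  # 0.8
```

A self-learning version would replace the fixed `step` rule with a model trained on player outcomes, but the feedback loop is the same.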

epic1337 is online now  
post #49 of 49 (permalink) Old 02-11-2019, 02:12 AM
mfw
 
ToTheSun!'s Avatar
 
Join Date: Jul 2011
Location: Terra
Posts: 5,998
Rep: 359 (Unique: 188)
I wouldn't mind running simulations for nVidia during the winter.

CPU
Intel 6700K
Motherboard
Asus Z170i
GPU
MSI 2080 Sea Hawk X
RAM
G.skill Trident Z 3200CL14 8+8
Hard Drive
Samsung 850 EVO 1TB
Hard Drive
Crucial M4 256GB
Power Supply
Corsair SF600
Cooling
Noctua NH C14S
Case
Fractal Design Core 500
Operating System
Windows 10 Education
Monitor
ViewSonic XG2703-GS
Keyboard
Cooler Master Quickfire TK
Mouse
Corepadded Logitech G703
Mousepad
Cooler Master MP510
Audio
Fiio E17K v1.0 + Beyerdynamic DT 1990 PRO (B pads)
ToTheSun! is offline  