[Digital Trends] Nvidia says it’s better than AMD for low-lag gaming, and has the data to prove it - Page 3 - Overclock.net - An Overclocking Community

post #21 of 87 (permalink) Old 08-24-2019, 06:50 AM
mfw
 
ToTheSun!'s Avatar
 
Join Date: Jul 2011
Location: Terra
Posts: 6,896
Rep: 382 (Unique: 200)
Quote: Originally Posted by Omega X View Post
Interesting. The "market leader" is very bothered by AMD. They must be losing market share again.
Did the fact that AMD is more competitive now, on all fronts, than it's ever been in the last 10 years make you conclude that?

CPU
Intel 6700K
Motherboard
Asus Z170i
GPU
MSI 2080 Sea Hawk X
RAM
G.skill Trident Z 3200CL14 8+8
Hard Drive
Samsung 850 EVO 1TB
Hard Drive
Crucial M4 256GB
Power Supply
Corsair SF600
Cooling
Noctua NH C14S
Case
Fractal Design Core 500
Operating System
Windows 10 Education
Monitor
ViewSonic XG2703-GS
Keyboard
Ducky One 2 Mini
Mouse
Glorious Odin
Mousepad
Asus Scabbard
Audio
Fiio E17K v1.0 + Beyerdynamic DT 1990 PRO (B pads)
post #22 of 87 (permalink) Old 08-24-2019, 07:29 AM
New to Overclock.net
 
umeng2002's Avatar
 
Join Date: Jul 2010
Location: Florida
Posts: 3,129
Rep: 170 (Unique: 106)
History will call the stretch from 2012 to around 2017 or 2018 "The Great nVidia™ Intel™ Innovation Dearth."

Kind of like Winter in Game of Thrones.

CPU
AMD Ryzen 2700X
Motherboard
Asus Prime X470-Pro
GPU
EVGA GeForce RTX 2070 XC Ultra
RAM
TeamGroup T-Force 16 GB (2x8) Pro Dark (B-die TDPGD416G3200HC14ADC01)
Hard Drive
Samsung 840 EVO 250 GB
Power Supply
Seasonic Focus Plus Platinum SSR-750PX
Cooling
Corsair H80i (not V2 or GT)
Monitor
LG 34UC80-B
Keyboard
Logitech G413
Mouse
Logitech G503 RGB
Audio
Creative SoundBlaster Z (OEM)
post #23 of 87 (permalink) Old 08-24-2019, 08:09 AM
Just another nerd
 
Sheyster's Avatar
 
Join Date: Aug 2009
Location: So-Cal, U.S.A
Posts: 5,343
Rep: 336 (Unique: 241)
Quote: Originally Posted by ZealotKi11er View Post
Improve for every gamer, or on a per-game basis? I tried it in Anthem and it was beyond garbage. If DLSS is not ready for a AAA game on day 1, it's pointless, even if it might reach a decent-looking level in 3-4 months.
I personally think the AI approach is the wrong one. nVidia should put resources into something that works like Dolby Vision, intelligently making image tweaks on the fly as the game is played and the environment changes, rather than something like DLSS.

CPU: intel Core i9 9900K - 5 GHz all core HT on - 1.26v - Air cooled
MOBO: Gigabyte Z390 Aorus Pro
GPU: MSI Duke OC RTX 2080 Ti - 380w PL BIOS
RAM: G.Skill 32GB DDR4 3200 CL14 @ 3600 CL15 (Samsung B-die)
SSD: Samsung 970 EVO 1TB M.2 NVMe
PSU: EVGA SuperNOVA 1000 G3
CASE: "Open bench" - Stripped down Tt Core X9
post #24 of 87 (permalink) Old 08-24-2019, 08:45 AM
New to Overclock.net
 
EniGma1987's Avatar
 
Join Date: Sep 2011
Posts: 6,283
Rep: 337 (Unique: 247)
Quote: Originally Posted by knightriot View Post
lol, I'm using a 2080 Ti under water + OC. In BF5, ray tracing is trash, and DLSS is trash too, blurry as ****.
You guys can see it; I uploaded images with RTX OFF and RTX ON+DLSS below:
Your second screenshot with RTX on doesn't show anything using RTX, though. Battlefield has removed more and more ray tracing to make the game playable since launch, and only a handful of the windows and water in a map get any RTX on them anymore; that's it.



Quote: Originally Posted by Sheyster View Post
I personally think the AI approach is the wrong approach. nVidia should put resources into something that works like Dolby Vision, intelligently making image tweaks on the fly as the game is played and the environment changes, as opposed to something like DLSS.
Dolby Vision requires knowledge of the end display device capabilities to make such changes. That is why hardware chips are required in the end devices to report such info.
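As an aside, the reason the capability info matters is that the source has to tone-map each frame to the panel's actual peak brightness. A toy sketch of that idea (a simple Reinhard-style curve with made-up numbers; real Dolby Vision uses standardized per-scene metadata and far more elaborate curves):

```python
def tone_map(scene_nits: float, display_peak_nits: float) -> float:
    """Map a scene luminance value onto a display that can only reach
    display_peak_nits. Reinhard-style rolloff: bright values compress
    smoothly instead of clipping at the panel's limit."""
    x = scene_nits / display_peak_nits
    return display_peak_nits * (x / (1.0 + x))

# The same 4000-nit highlight lands very differently on a 600-nit panel
# than on a 1500-nit one, which is why the source needs to know the panel.
for peak in (600.0, 1500.0):
    print(peak, round(tone_map(4000.0, peak), 1))
```

Without that handshake, the source is tone-mapping blind, which is exactly why the end device has to report its capabilities.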

post #25 of 87 (permalink) Old 08-24-2019, 09:05 AM
Just another nerd
 
Sheyster's Avatar
 
Join Date: Aug 2009
Location: So-Cal, U.S.A
Posts: 5,343
Rep: 336 (Unique: 241)
Quote: Originally Posted by EniGma1987 View Post
Dolby Vision requires knowledge of the end display device capabilities to make such changes. That is why hardware chips are required in the end devices to report such info.
Could be part of the spec for G-Sync 3. All the displays are pre-"certified", and a chip could be added to the module to handle the transitions. Of course, they'd probably raise the price by $500.

CPU: intel Core i9 9900K - 5 GHz all core HT on - 1.26v - Air cooled
MOBO: Gigabyte Z390 Aorus Pro
GPU: MSI Duke OC RTX 2080 Ti - 380w PL BIOS
RAM: G.Skill 32GB DDR4 3200 CL14 @ 3600 CL15 (Samsung B-die)
SSD: Samsung 970 EVO 1TB M.2 NVMe
PSU: EVGA SuperNOVA 1000 G3
CASE: "Open bench" - Stripped down Tt Core X9
post #26 of 87 (permalink) Old 08-24-2019, 09:38 AM
New to Overclock.net
 
knightriot's Avatar
 
Join Date: Jul 2016
Posts: 78
Rep: 5 (Unique: 4)
Quote: Originally Posted by EniGma1987 View Post
Your second screenshot with RTX on doesn't show anything using RTX, though. Battlefield has removed more and more ray tracing to make the game playable since launch, and only a handful of the windows and water in a map get any RTX on them anymore; that's it.

Dolby Vision requires knowledge of the end display device capabilities to make such changes. That is why hardware chips are required in the end devices to report such info.
This is RTX+DLSS. I'm using a 4K monitor; RTX is unplayable at 4K, so I turned on DLSS.

3900X +XSPC raystorm nero + EK PE 480 + XSPC 480 WHITE + XSPC D5 + Enthoo Primo
CRosshair VIII formula + Corsair [email protected] 3600c14 +
2080TI +HEATKILLER IV 380W + DELL UP316Q + Onkyo SE-U55sX + FSP 1KW
sm961 1tb + Intel 600P 1TB +Crucial MX300 750GB + WD BLACK 2TB
post #27 of 87 (permalink) Old 08-24-2019, 10:11 AM
Overclocker
 
JackCY's Avatar
 
Join Date: Jun 2014
Posts: 9,902
Rep: 337 (Unique: 240)
Quote:
Adds Beta support for GPU Integer Scaling
Adds Beta support for Ultra-Low Latency Mode
Adds support for new Freestyle Sharpen Filter
Adds support for new G-SYNC compatible monitors
No thanks. I will wait for the non-beta studio driver. All they did is turn the game driver into a beta for people to test, because they aren't willing to spend a little on hiring people to do the testing for them. What's next, beta OS kernels? But I guess we've already been there too, at least with Linux.

NV's so fierce against AMD that it's getting ridiculous, copying and replying to every single tiny feature or improvement.
post #28 of 87 (permalink) Old 08-24-2019, 10:58 AM
Nvidia Enthusiast
 
philhalo66's Avatar
 
Join Date: Apr 2009
Location: Maine
Posts: 9,723
Rep: 268 (Unique: 209)
HA! Good one, nvidia. Sounds like damage control for a lackluster RTX launch to me. Maybe don't charge $1,500 for a card that's barely faster than your last gen.

gaming rig
(18 items)
CPU
Intel Core i9 9900K
Motherboard
GIGABYTE Z390 AORUS PRO
GPU
EVGA GTX 1080 Ti SC Black Edition.
RAM
G.SKILL Ripjaws V DDR4 3600MHz 32GB
Hard Drive
HGST 2TB
Hard Drive
Seagate 2.5 Inch 2TB
Hard Drive
Samsung 860 EVO 1TB
Hard Drive
Intel 660p NVME 512GB
Optical Drive
LG Blu Ray drive
Power Supply
Corsair RM850i 850W
Cooling
Corsair H115i /W Modded backplate
Case
Corsair C70 Gunmetal
Operating System
windows 10 Pro
Monitor
MSI OPTIX MAG24C
Monitor
Acer XFA240 bmjdpr
Keyboard
G.SKILL RIPJAWS KM780 MX
Mouse
Razer Deathadder Elite
Audio
Sound Blaster Z
post #29 of 87 (permalink) Old 08-24-2019, 11:14 AM - Thread Starter
sudo apt install sl
 
WannaBeOCer's Avatar
 
Join Date: Dec 2009
Posts: 5,362
Rep: 176 (Unique: 123)
Quote: Originally Posted by JackCY View Post
No thanks. I will wait for the non-beta studio driver. All they did is turn the game driver into a beta for people to test, because they aren't willing to spend a little on hiring people to do the testing for them. What's next, beta OS kernels? But I guess we've already been there too, at least with Linux.

NV's so fierce against AMD that it's getting ridiculous, copying and replying to every single tiny feature or improvement.
Yes, they're copying AMD's canny marketing, which is focused on rebranding everything to make it seem new.

"Game boost"
"Game cache"

In nVidia's driver, the setting to reduce input lag was called "Pre-Rendered Frames" before the recent driver; now it's called "Low Latency Mode".

nVidia FreeStyle has always had options for sharpening, but now they've made a filter just for sharpening. We even had comparisons between Radeon Image Sharpening and FreeStyle sharpening before this new driver was released.

Silent
(20 items)
CPU
Core i9 9900K... CoffeeTime! @ 4.2Ghz w/ 1v
Motherboard
Maximus VIII Formula
GPU
Radeon VII @ 1900Mhz/1250Mhz w/ 1v
RAM
TeamGroup Xtreem 16GB 3866Mhz CL15
Hard Drive
Samsung 850 Evo 1TB
Hard Drive
Samsung 850 Evo 1TB
Hard Drive
Samsung 850 Evo 500GB
Power Supply
EVGA SuperNova 1200w P2
Cooling
EK Supremacy Full Copper Clean
Cooling
XSPC D5 Photon v2
Cooling
Black Ice Gen 2 GTX360 x2
Cooling
EK-Vector Radeon VII - Copper + Plexi
Case
Thermaltake Core X5 Tempered Glass Edition
Operating System
Clear Linux
Monitor
Acer XF270HUA
Keyboard
Cherry MX Board 6.0
Mouse
Logitech G600
Mouse
Alugraphics GamerArt
Audio
Definitive Technology Incline
Audio
SMSL M8A
post #30 of 87 (permalink) Old 08-24-2019, 05:10 PM
Overclocker
 
JackCY's Avatar
 
Join Date: Jun 2014
Posts: 9,902
Rep: 337 (Unique: 240)
Yes. I've tried the FreeStyle/GFE abomination once. The implementation of those effects, such as sharpening, was poor. Even CAS from AMD is, well, not that great, at least the ReShade port: even the minimum setting of 0.0, which is supposed to disable the sharpening, is still sharper than my LumaSharpen settings.
Pre-rendered frames have been around for ages, for DX9-11 at least: force it to 1 and forget it. Some apps/games allow changing it, but one can try to force it via the driver. Now they've confusingly renamed it to Low Latency Mode and removed the frame steps, making OFF probably application-controlled, On = 1, and Ultra = JIT (just in time, which is what AMD did, aka 0). Setting this below 1 is risky and can result in lower performance, maybe even stutter, depending on how well they can keep being just in time and not late. Not sure if this is even relevant for DX12/Vulkan; maybe for OpenGL it is.
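The queue-depth tradeoff described above can be sketched with a toy discrete-event model (the 4 ms CPU / 10 ms GPU frame times are made-up illustrative numbers, not measurements of any driver): in a GPU-bound game, letting the CPU queue 3 frames ahead roughly triples input-to-photon latency without improving the frame rate.

```python
def avg_latency_ms(max_queue: int, cpu_ms: float, gpu_ms: float,
                   frames: int = 2000) -> float:
    """Toy model of pre-rendered frames: the CPU samples input and submits
    frame i at time submit[i], but may not run more than max_queue frames
    ahead of the GPU. Latency for frame i is GPU-finish minus input-sample
    time; the average is taken after warm-up."""
    submit = [0.0]      # submit[i]: input sampled / frame i submitted
    finish = []         # finish[i]: GPU done rendering frame i
    lat = []
    for i in range(frames):
        s = submit[-1]
        f = max(s, finish[-1] if finish else 0.0) + gpu_ms
        finish.append(f)
        lat.append(f - s)
        nxt = s + cpu_ms                    # CPU ready for the next frame
        if i + 1 - max_queue >= 0:          # queue full: wait on old frame
            nxt = max(nxt, finish[i + 1 - max_queue])
        submit.append(nxt)
    steady = lat[100:]                      # skip warm-up frames
    return sum(steady) / len(steady)

# GPU-bound case: 4 ms of CPU work, 10 ms of GPU work per frame.
for q in (1, 2, 3):
    print(q, avg_latency_ms(q, 4.0, 10.0))
```

In this model the frame interval stays at the GPU's 10 ms regardless of queue depth; only the latency grows, which is why capping the queue at 1 (or going JIT) appeals for twitchy games even though a deeper queue absorbs CPU spikes better.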

Sharpening itself is simple, but any decently high-quality, complex sharpening filter blows up in computational cost to run.
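The "simple" end of sharpening is an unsharp mask: blur the image, take the difference from the original, and add a fraction of it back. A minimal pure-Python sketch on a grayscale image (the 3x3 box blur and 0.5 amount are arbitrary choices for illustration, nothing like what FreeStyle or CAS actually ship):

```python
def unsharp_mask(img, amount=0.5):
    """img: 2D list of floats in [0, 1]. Returns img + amount * (img - blur),
    where blur is a 3x3 box blur with clamp-to-edge sampling."""
    h, w = len(img), len(img[0])

    def px(y, x):  # clamp sampling at the borders
        return img[min(max(y, 0), h - 1)][min(max(x, 0), w - 1)]

    out = []
    for y in range(h):
        row = []
        for x in range(w):
            blur = sum(px(y + dy, x + dx)
                       for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
            v = img[y][x] + amount * (img[y][x] - blur)
            row.append(min(1.0, max(0.0, v)))  # clamp to valid range
        out.append(row)
    return out
```

Flat regions pass through unchanged; pixels on either side of an edge get pushed apart. That overshoot is exactly the halo that makes aggressive sharpening look crunchy, and avoiding it (as CAS tries to, adaptively per pixel) is where the cost comes in.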

I definitely don't feel like using GFE to have yet another app injected into every application/game when there is ReShade.