
[computerbase.de] DOOM + Vulkan Benchmarked. - Page 45

post #441 of 632
Quote:
Originally Posted by Remij View Post

There is no Test A and Test B though, and there doesn't need to be. This isn't an image-quality-affecting optimization. There's Async = On, which benefits AMD, and there's Async = Disabled on Nvidia, which still benefits AMD while also benefiting Nvidia's Maxwell GPUs.

I doubt many games in the future will come with an Async on/off option, because it's just something you'd naturally want to use on AMD, and something that Nvidia will disable in drivers to maintain better performance on their older hardware.
I might catch hell for this, but I'm a fan of driver cheats. I'm a fan of anything they can do to improve performance as long as image quality isn't affected.
http://www.dsogaming.com/news/oxide-developer-nvidia-was-putting-pressure-on-us-to-disable-certain-settings-in-the-benchmark/
Edited by PontiacGTX - 7/17/16 at 4:36pm
post #442 of 632
Quote:
Originally Posted by Remij View Post

There is no Test A and Test B though, and there doesn't need to be. This isn't an image-quality-affecting optimization. There's Async = On, which benefits AMD, and there's Async = Disabled, which still benefits AMD while also benefiting Nvidia's Maxwell GPUs.

I doubt many games in the future will come with an Async on/off option, because it's just something you'd naturally want to use on AMD, and something that Nvidia will disable in drivers to maintain better performance on their older hardware.

I might catch hell for this, but I'm a fan of driver cheats. I'm a fan of anything they can do to improve performance as long as image quality isn't affected.

 

It's a benchmark; we're talking about valid results. People are using this benchmark to back up their arguments, so how can I trust it?

Read this post.

post #443 of 632
Quote:
Originally Posted by comprodigy View Post

The test in question actually does do a good bit of async; it was altered by AMD to use a good amount. You say we see something much different in games, but so far in games that hasn't really been true. In fact, the difference in Ashes between on and off isn't much at all. So where exactly are you going with this? Are you saying that it's impossible for async to be turned off in the driver by Nvidia? That no matter what, you're going to have async running on Maxwell if the programmer doesn't explicitly disable it in code? That's easy enough to prove. Since AMD's own demonstration of async isn't enough, do you care to write your own program? I'll gladly test.

Maxwell drops in performance once async compute + graphics is enabled under AotS, as seen here...

In Time Spy we see this odd behavior where the performance stays the same...

That is the point of contention here.
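
On the "write your own program" challenge quoted above: a minimal first step (my own sketch, not AMD's demo or any benchmark) is just to check what the driver exposes. The C++ below uses the Vulkan API to list each GPU's queue families and whether a compute-capable family exists apart from the graphics one. It only shows what is advertised, not whether submissions actually overlap, so treat it as a starting point for a real timing test, not proof either way.

```cpp
// Minimal Vulkan queue-family probe: lists whether each GPU exposes a
// compute-capable queue family separate from the graphics family.
// Requires the Vulkan SDK headers and loader to be installed.
#include <vulkan/vulkan.h>
#include <cstdint>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo ici{};
    ici.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    ici.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) {
        std::fprintf(stderr, "vkCreateInstance failed\n");
        return 1;
    }

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);
        std::printf("%s\n", props.deviceName);

        uint32_t famCount = 0;
        vkGetPhysicalDeviceQueueFamilyProperties(gpu, &famCount, nullptr);
        std::vector<VkQueueFamilyProperties> fams(famCount);
        vkGetPhysicalDeviceQueueFamilyProperties(gpu, &famCount, fams.data());

        for (uint32_t i = 0; i < famCount; ++i) {
            bool gfx  = fams[i].queueFlags & VK_QUEUE_GRAPHICS_BIT;
            bool comp = fams[i].queueFlags & VK_QUEUE_COMPUTE_BIT;
            std::printf("  family %u: graphics=%d compute=%d queues=%u\n",
                        i, gfx ? 1 : 0, comp ? 1 : 0, fams[i].queueCount);
        }
    }
    vkDestroyInstance(instance, nullptr);
    return 0;
}
```

A family that reports compute without graphics is the usual candidate for an "async compute" queue; whether the hardware actually runs it concurrently with the graphics queue is exactly what a follow-up timing test would have to measure.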
Edited by Mahigan - 7/17/16 at 4:40pm
post #444 of 632
Yep, no sense in enabling things that adversely affect performance and make no difference to image quality.


Speaking of that though, what's the end result of the whole 'Nvidia doesn't render the terrain properly' issue in Ashes? Did they fix it or something? Because completely maxed out, they look exactly the same these days.
post #445 of 632
Quote:
Originally Posted by GorillaSceptre View Post

Put the reason in my edit. One word: refunds.

It helps level the playing field, but it's by no means perfect. I mean, with a GPU, as long as you don't buy from a certain vendor *cough*Newegg*cough*, you can always return it for a full refund within 14 or 30 days.

To get a Steam refund you have 2 hours to try out the game. Who's to say devs won't make it so that the first 3 hours of the game are excellent and bug-free, but after that it's just a steaming pile of dog turd on the fast track to trainwreck town?

And yes I'm extremely cynical (as if that wasn't obvious).
Edited by magnek - 7/17/16 at 4:49pm
post #446 of 632
Quote:
Originally Posted by Remij View Post

Yep, no sense in enabling things that adversely affect performance and make no difference to image quality.


Speaking of that though, what's the end result of the whole 'Nvidia doesn't render the terrain properly' issue in Ashes? Did they fix it or something? Because completely maxed out, they look exactly the same these days.

Speaking of image quality... this has been popping up around the web.

Maybe we should take a closer look at what kind of image quality differences there are in Time Spy. Nvidia's drivers have some tricks up their sleeves.
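
If anyone wants to take that closer look, a rough way to quantify it is a per-pixel diff of two screenshots captured at the same frame. Here's a minimal sketch assuming the single-header stb_image library is in your include path and the two captures are the same resolution; it reports the mean absolute per-channel difference, which is crude (no alignment or gamma handling) but enough to flag obvious discrepancies.

```cpp
// Crude screenshot comparison: mean absolute per-channel difference
// between two same-size images. Assumes stb_image.h is available.
#define STB_IMAGE_IMPLEMENTATION
#include "stb_image.h"

#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    if (argc != 3) {
        std::fprintf(stderr, "usage: %s a.png b.png\n", argv[0]);
        return 1;
    }
    int w1, h1, w2, h2, n;
    // Force 3 channels (RGB) so both images have the same layout.
    unsigned char* a = stbi_load(argv[1], &w1, &h1, &n, 3);
    unsigned char* b = stbi_load(argv[2], &w2, &h2, &n, 3);
    if (!a || !b || w1 != w2 || h1 != h2) {
        std::fprintf(stderr, "failed to load images or sizes differ\n");
        return 1;
    }
    double sum = 0.0;
    const size_t count = static_cast<size_t>(w1) * h1 * 3;
    for (size_t i = 0; i < count; ++i)
        sum += std::abs(static_cast<int>(a[i]) - static_cast<int>(b[i]));
    std::printf("mean abs per-channel difference: %.4f (0-255 scale)\n",
                sum / count);
    stbi_image_free(a);
    stbi_image_free(b);
    return 0;
}
```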
post #447 of 632
Quote:
Originally Posted by Remij View Post

Yep, no sense in enabling things that adversely affect performance and make no difference to image quality.


Speaking of that though, what's the end result of the whole 'Nvidia doesn't render the terrain properly' issue in Ashes? Did they fix it or something? Because completely maxed out, they look exactly the same these days.
I think that issue should have been fixed; the game was patched recently, and it was only happening on Nvidia.
post #448 of 632
Holy, and I just got a GTX 1070 instead of a $300 Fury.

It's also impressive that an RX 480 at 232 mm² is reaching the same level of performance as a Pascal 1070 at 312 mm².

Guess I'll keep it until Battlefield 1 comes out with DirectX 12.

The Doom thing is also unnerving; I'm really starting to doubt my purchase. Hope this new driver fixes it.
Edited by Dargonplay - 7/17/16 at 5:01pm
post #449 of 632
I read it.

At the end of the day, I understand it completely. I just don't think one result is any less valid than the other, because they are still doing the same amount of work, whether it's done consecutively or concurrently.

However, I honestly wish all this async business were entirely invisible to the consumer. I wish it were just 'hey look, we're really well optimized and handle parallel workloads, our performance is better than before' and, on the other hand, 'hey look, we're really fast at serial workloads, our performance is as good as it's ever been', and the drivers would just do what they do to optimize performance.
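
To illustrate the "same amount of work, consecutively or concurrently" point, here's a toy CPU-side sketch of my own (purely illustrative; real async compute happens on GPU queues): two identical workloads run back-to-back and then on two threads. The total work is identical either way; only the wall-clock time changes.

```cpp
// Same total work, serial vs concurrent: only the wall-clock time differs.
#include <chrono>
#include <cstdio>
#include <thread>

static volatile double sink = 0.0;

// Fixed amount of busy work standing in for a "graphics" or "compute" job.
static void busyWork(int iterations) {
    double acc = 0.0;
    for (int i = 0; i < iterations; ++i) acc += i * 0.5;
    sink = acc;  // keep the optimizer from removing the loop
}

int main() {
    const int work = 50'000'000;
    using clock = std::chrono::steady_clock;

    // Serial: "graphics" then "compute", one after the other.
    auto t0 = clock::now();
    busyWork(work);
    busyWork(work);
    double serial = std::chrono::duration<double>(clock::now() - t0).count();

    // Concurrent: both submitted at once on separate threads.
    auto t1 = clock::now();
    std::thread a(busyWork, work), b(busyWork, work);
    a.join();
    b.join();
    double concurrent = std::chrono::duration<double>(clock::now() - t1).count();

    std::printf("serial:     %.3f s\nconcurrent: %.3f s\n", serial, concurrent);
    return 0;
}
```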
post #450 of 632
Quote:
Originally Posted by Kpjoslee View Post

That would be a major dilemma. The last thing a benchmark program should do is create a separate optimized path for each GPU.

This is exactly the dilemma I expected from them. There's no way to have a single render path in a DX12 benchmark without optimizing for the lowest common denominator and punishing the silicon that has extra features.

"Impartial" benchmarking has become an oxymoron with DX12. You have to optimize for each vendor or you're unfairly punishing one of them. It just about makes the whole concept of "benchmark" meaningless.

They had no problem doing this with tessellation. Now suddenly they've got morals?
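
For what it's worth, the per-vendor path everyone is arguing about is conceptually trivial. The sketch below is hypothetical (the path names and selection policy are mine, not taken from 3DMark or any actual benchmark); it just keys off the PCI vendor ID, which in a real engine would come from VkPhysicalDeviceProperties::vendorID or DXGI_ADAPTER_DESC::VendorId.

```cpp
// Hypothetical per-vendor render path selection keyed off the PCI vendor ID.
#include <cstdint>
#include <cstdio>

enum class RenderPath { Generic, HeavyAsyncCompute, SerializedCompute };

// Well-known PCI vendor IDs.
constexpr uint32_t kVendorAMD    = 0x1002;
constexpr uint32_t kVendorNvidia = 0x10DE;

const char* name(RenderPath p) {
    switch (p) {
        case RenderPath::HeavyAsyncCompute: return "heavy async compute";
        case RenderPath::SerializedCompute: return "serialized compute";
        default:                            return "generic";
    }
}

RenderPath pickPath(uint32_t vendorId) {
    switch (vendorId) {
        case kVendorAMD:    return RenderPath::HeavyAsyncCompute;  // lean on extra queues
        case kVendorNvidia: return RenderPath::SerializedCompute;  // keep compute in-line
        default:            return RenderPath::Generic;
    }
}

int main() {
    const uint32_t ids[] = { kVendorAMD, kVendorNvidia, 0x8086 /* Intel */ };
    for (uint32_t id : ids)
        std::printf("vendor 0x%04X -> %s path\n", id, name(pickPath(id)));
    return 0;
}
```

Whether a benchmark should do this is the whole argument, of course; the code only shows that the mechanism itself is not the hard part.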
Edited by infranoia - 7/17/16 at 5:11pm