[computerbase.de] DOOM + Vulkan Benchmarked. - Page 36

post #351 of 632
Quote:
Originally Posted by Defoler View Post

It could also be that AMD paid id Software to specifically tailor DOOM's Vulkan path to fit GCN better than Pascal, which is why you see such a gain in Vulkan while 3DMark shows only a small increase.
After all, AotS, Hitman, RotTR, and others have been heavily AMD-sponsored games. Especially since DOOM's Vulkan support was co-announced with AMD, with AMD claiming that only its cards can do async at all.
Again, it is an Nvidia-sponsored game, so stop with that BS. And it doesn't support async on Nvidia anyway. How about an async driver for Maxwell? They enabled it for Pascal in a useless benchmark but not in actual games. It's time to ask Nvidia some questions instead of spreading BS.

Where is Nvidia's army of engineers? A year on and still no async on Maxwell? Even in the so-called "unbiased" 3DMark benchmark it isn't enabled. I think they switched sides or something.
Edited by EightDee8D - 7/16/16 at 9:53pm
post #352 of 632
Quote:
Originally Posted by Mahigan View Post

Well, as Kollock explained... this is outside driver or hardware control. This is OS-based. DX12 requires these fences in order for Asynchronous Compute + Graphics to operate.
Here is an example of Asynchronous Compute + Graphics code provided by Microsoft...
Source https://msdn.microsoft.com/en-us/library/windows/desktop/dn899217(v=vs.85).aspx
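(The sample itself did not survive the copy here. As a stand-in, here is a minimal D3D12 multi-engine sketch in the spirit of that MSDN page: a compute queue signals a fence and the graphics queue waits on it on the GPU timeline. Assume the device and recorded command lists come from elsewhere; the function name and structure are illustrative, not Microsoft's exact sample.)

Code:
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Submit compute work on its own queue, then make the graphics queue
// wait for it via a fence (the DX12 synchronization point).
void SubmitAsyncComputeFrame(ID3D12Device* device,
                             ID3D12GraphicsCommandList* computeList,
                             ID3D12GraphicsCommandList* gfxList)
{
    // One queue per engine: DIRECT (graphics) and COMPUTE.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> gfxQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

    D3D12_COMMAND_QUEUE_DESC compDesc = {};
    compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(&computeQueue));

    // The fence is the API-level sync point the post is talking about.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));

    // Kick off compute; it may overlap with other graphics work.
    ID3D12CommandList* compute[] = { computeList };
    computeQueue->ExecuteCommandLists(1, compute);
    computeQueue->Signal(fence.Get(), 1);   // signal when compute is done

    // GPU-side wait: the graphics queue stalls until the fence reaches 1
    // before consuming the compute results. No CPU involvement.
    gfxQueue->Wait(fence.Get(), 1);
    ID3D12CommandList* gfx[] = { gfxList };
    gfxQueue->ExecuteCommandLists(1, gfx);
}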

Even if the driver does not have Asynchronous Compute + Graphics support and even if the feature is disabled in the driver... the game code executes the Fence/Synchronization point at the API (OS) level before the driver even has a say in the matter.

Would have been nice if AMD had released their 1070/1080 counterparts instead of holding off. I would gladly send back my 1080 for one.
post #353 of 632
Quote:
Originally Posted by EightDee8D View Post

Again, it is an Nvidia-sponsored game, so stop with that BS. And it doesn't support async on Nvidia anyway. How about an async driver for Maxwell? They enabled it for Pascal in a useless benchmark but not in actual games. It's time to ask Nvidia some questions instead of spreading BS.

Where is Nvidia's army of engineers? A year on and still no async on Maxwell? Even in the so-called "unbiased" 3DMark benchmark it isn't enabled. I think they switched sides or something.

Crystal Dynamics had been developing TressFX, and the game, with AMD before Square Enix even had dealings with Nvidia. The game was AMD's main showcase for its TressFX tech.
And the deal with Nvidia was about streaming support and the new GFE features, as well as boosting GPU sales, not necessarily development.
So before calling out others for spreading BS... you should take your own advice sometimes.

And I guess it is now a "useless benchmark" because Nvidia shows it can run async there? If a game showed it but a benchmark didn't, then Nvidia must have paid the developers, right?
Of course it can't run in a game if the developer disables it deliberately.

Where are the Nvidia engineers? According to id, they are only working with them now. Why hadn't they worked with them before? Maybe because id were working with AMD, and AMD didn't want Nvidia included?
post #354 of 632
Quote:
Originally Posted by Defoler View Post

Crystal dynamics blablabla my maxwell is obsolete mimimi


Should I also post the 1080 launch stream where they showed the 1080 running DOOM on Vulkan? Or will you stop spreading BS? A dev having worked with AMD in the past doesn't mean they won't get sponsored by Nvidia later.

And if 3DMark can run async on Pascal, why can't it on Maxwell? Who is stopping them from enabling it in the driver? Oh, no conspiracy for that one? Aww.

And AotS was a game, but it got dismissed as just a benchmark; 3DMark is an actual benchmark, so why does it count now? Because hypocrisy? lol
post #355 of 632
Quote:
Originally Posted by Mahigan View Post

Actually... 3DMark are wrong, and we should all know this by now from our interactions with Kollock. Async Compute + Graphics is not enabled in the Nvidia driver for their Maxwell cards... yet if you enable it in AotS, what happens? You get a performance loss, attributable to the synchronization points implemented in the AotS path. Try it yourself. This is because AotS makes use of Asynchronous Compute + Graphics that is both parallel and concurrent. Each time a parallel workload is requested... there is a sync point between the Graphics and Compute contexts involved in that workload.

Since we do not see this loss in performance for a GTX 980 Ti under 3DMark Time Spy, it is obviously not doing parallel execution, because if it were, there would be sync points adversely affecting Maxwell performance when async is turned on (even if the driver does not support the feature).

This was my argument. It stands.

There are two other possibilities. One is that the Graphics and Compute workloads were specifically tailored to take the same amount of time to complete, which would make 3DMark unreliable as a gaming performance metric, because games do not behave in this manner.

The other is that a special Maxwell path was created within the benchmark which did not include the sync points, but this would be misleading, since the benchmark lets you "turn on" async compute for Maxwell GPUs. In any other game where we can turn async compute on or off... Maxwell loses performance due to these sync points. Kollock had a post explaining it all.
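(To make the "sync point" concrete: in Vulkan terms, the cross-context dependency described above is a semaphore between a compute submission and a graphics submission. A minimal sketch, assuming the device, queues, and recorded command buffers already exist; all names are illustrative.)

Code:
#include <vulkan/vulkan.h>

// Submit compute on its own queue, then make the graphics submission
// wait on a semaphore before it consumes the compute results.
void SubmitWithSyncPoint(VkDevice device,
                         VkQueue computeQueue, VkQueue graphicsQueue,
                         VkCommandBuffer computeCmd, VkCommandBuffer gfxCmd)
{
    // The semaphore is the GPU-side sync point between the two contexts.
    VkSemaphoreCreateInfo semInfo = {};
    semInfo.sType = VK_STRUCTURE_TYPE_SEMAPHORE_CREATE_INFO;
    VkSemaphore computeDone;
    vkCreateSemaphore(device, &semInfo, nullptr, &computeDone);

    // Compute work goes to its own queue and signals when finished.
    VkSubmitInfo computeSubmit = {};
    computeSubmit.sType = VK_STRUCTURE_TYPE_SUBMIT_INFO;
    computeSubmit.commandBufferCount = 1;
    computeSubmit.pCommandBuffers = &computeCmd;
    computeSubmit.signalSemaphoreCount = 1;
    computeSubmit.pSignalSemaphores = &computeDone;
    vkQueueSubmit(computeQueue, 1, &computeSubmit, VK_NULL_HANDLE);

    // Graphics waits on the semaphore at the stage that consumes the
    // compute output. On hardware that truly runs the queues
    // concurrently, the two overlap; on hardware that serializes them,
    // this wait is pure added latency, which is the performance loss
    // being described above.
    VkPipelineStageFlags waitStage = VK_PIPELINE_STAGE_FRAGMENT_SHADER_BIT;
    VkSubmitInfo gfxSubmit = {};
    gfxSubmit.sType = VK_STRUCTURE_TYPE_SUBMIT_INFO;
    gfxSubmit.waitSemaphoreCount = 1;
    gfxSubmit.pWaitSemaphores = &computeDone;
    gfxSubmit.pWaitDstStageMask = &waitStage;
    gfxSubmit.commandBufferCount = 1;
    gfxSubmit.pCommandBuffers = &gfxCmd;
    vkQueueSubmit(graphicsQueue, 1, &gfxSubmit, VK_NULL_HANDLE);
}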

Seems obvious that Futuremark are PBN and it's now a worthless test of DX12.
post #356 of 632
Quote:
Originally Posted by Mahigan View Post

It does not mean that 3DMark cheated. Some people love to take what I say out of context, and that includes AMD fans. What it means is that 3DMark chose Nvidia's implementation over AMD's. This will likely not be the case for games, because games are using AMD's implementation for the consoles.

And confirmed: PBN. (PBN = Paid By Nvidia.)

If a test uses the wrong approach for the real-world application (games), then the test is worthless for showing what the games themselves do with that same API, i.e. DX12.

The only choice Futuremark have now is to remove Time Spy.
post #357 of 632
So here is the trick question:

How difficult is it to support both the Nvidia way and the AMD way in games? (Because rest assured, Nvidia will pay a lot to have their way used.)
post #358 of 632
Quote:
Originally Posted by airfathaaaaa View Post

So here is the trick question:

How difficult is it to support both the Nvidia way and the AMD way in games? (Because rest assured, Nvidia will pay a lot to have their way used.)


Making it more parallel is probably slightly more difficult. But if the game is also on console (which most DX12/Vulkan games are), the devs already know how to implement it, or else the consoles would get really bad performance (e.g. Lichdom: Battlemage on console, an example of making a game on PC and then doing a poor console port). So complexity isn't as big a factor.
post #359 of 632
Quote:
Originally Posted by Mahigan View Post

This will likely not be the case for games, because games are using AMD's implementation for the consoles.

Nvidia has tons of money. Make no mistake, they are going to do everything to ensure their win. Like in the case of Crapworks.
post #360 of 632
Quote:
Originally Posted by ZealotKi11er View Post

The way I see it, in DX11 3DMark AMD was not suffering from CPU overhead. The only difference AMD has over Nvidia in DX12 is the async advantage. That makes up the difference between the 980 Ti and Fury X. The only anomaly is the Pascal cards.

It is a synthetic benchmark, so it is quite different from DX11 games. Even games that have overhead don't show it on all levels/maps, and some DX11 games don't have issues with the DX11 draw call limit at all.