Overclock.net › Forums › Industry News › Software News › [Anand] Ashes of the Singularity Revisited: A Beta Look at DirectX 12 & Asynchronous Shading

[Anand] Ashes of the Singularity Revisited: A Beta Look at DirectX 12 & Asynchronous Shading - Page 41

post #401 of 1175
Quote:
Originally Posted by Defoler View Post

Those two comparisons are unequal. If you actually look at all the numbers and not just the final number, you can see that on the Nvidia side they also added motherboard consumption, which for some reason is missing from the AMD numbers. Either they forgot or left it out on purpose; either way, they did not explain why it is missing, or why it appears in the Nvidia numbers.

Also, this is a bit strange: AMD seems to pull 428.80 W from a 300 W-rated connection during gaming, and Nvidia is pulling 358.80 W from 225 W-rated connections. Neither number really makes sense, unless both times they are also adding the 75 W from the PCIe lanes, in which case, why double it on the Nvidia side by putting it in an extra row under the motherboard total?

Motherboard total is just the 3.3V and 12V rails added together; if you do the math they are the same. As for the wattage exceeding the rated spec, look at it this way: power is voltage times current, so a brief spike in current produces a brief spike in wattage. Pull 400W for the first half of a second and 100W for the last half and you average 250W pulled, even though the instantaneous peak was well above the connector's rating.
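A toy illustration of that averaging argument (all numbers invented): the instantaneous draw can sit well above a connector's rating while the one-second average stays at a much lower figure.

```python
# Simulated power trace: 400 W for the first half second, 100 W for the
# second half, sampled every 10 ms. Peak exceeds a 300 W rating; the
# average over the full second does not.
samples_w = [400.0] * 50 + [100.0] * 50
average_w = sum(samples_w) / len(samples_w)
print(f"peak {max(samples_w):.0f} W, average {average_w:.0f} W")
# peak 400 W, average 250 W
```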
post #402 of 1175
Quote:
Originally Posted by Defoler View Post

Those two comparisons are unequal. If you actually look at all the numbers and not just the final number, you can see that on the Nvidia side they also added motherboard consumption, which for some reason is missing from the AMD numbers. Either they forgot or left it out on purpose; either way, they did not explain why it is missing, or why it appears in the Nvidia numbers.

Also, this is a bit strange: AMD seems to pull 428.80 W from a 300 W-rated connection during gaming, and Nvidia is pulling 358.80 W from 225 W-rated connections. Neither number really makes sense, unless both times they are also adding the 75 W from the PCIe lanes, in which case, why double it on the Nvidia side by putting it in an extra row under the motherboard total?
On the contrary, the 980 Ti review is missing some key charts. I'm curious what these results were in relation to the Fury X.

post #403 of 1175
Quote:
Originally Posted by Defoler View Post

Those two comparisons are unequal. If you actually look at all the numbers and not just the final number, you can see that on the Nvidia side they also added motherboard consumption, which for some reason is missing from the AMD numbers. Either they forgot or left it out on purpose; either way, they did not explain why it is missing, or why it appears in the Nvidia numbers.

Also, this is a bit strange: AMD seems to pull 428.80 W from a 300 W-rated connection during gaming, and Nvidia is pulling 358.80 W from 225 W-rated connections. Neither number really makes sense, unless both times they are also adding the 75 W from the PCIe lanes, in which case, why double it on the Nvidia side by putting it in an extra row under the motherboard total?

I tore this myth apart over at HardOCP.

Take this review of the Fury against a GTX 980:
http://m.hardocp.com/article/2015/07/10/asus_strix_r9_fury_dc3_video_card_review/1

And I ran the numbers...
Quote:
GTX980: 320 Watts, average FPS: 51.9
R9 Fury: 367 Watts, average FPS: 60.4

GTX980 = 0.162 frames/watt
Fury = 0.165 frames/watt

Or

GTX980 = 6.2 watts/frame
Fury = 6.1 watts/frame

So mathematically speaking, Kyle's conclusion was wrong.

Bahanime is correct. The R9 Fury delivers more performance per watt than the GTX 980. The R9 Fury is therefore more "efficient" in terms of power usage relative to delivered performance.

Of course, if you run FurMark or otherwise fully load the Fury, it will end up consuming more power, but in gaming? Nope.
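The arithmetic is easy to redo yourself. A quick sketch using only the HardOCP average-power and average-FPS figures quoted in this post:

```python
# Per-frame efficiency from HardOCP's averages: (average watts, average FPS).
cards = {
    "GTX 980": (320.0, 51.9),
    "R9 Fury": (367.0, 60.4),
}
for name, (watts, fps) in cards.items():
    # frames per watt and its reciprocal, watts per frame
    print(f"{name}: {fps / watts:.3f} frames/W, {watts / fps:.1f} W/frame")
```

Which gives roughly 0.162 vs 0.165 frames/W (6.2 vs 6.1 W/frame), so the gap is real but slim.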
post #404 of 1175
Quote:
Originally Posted by Mahigan View Post

Quote:
Originally Posted by Defoler View Post

Those two comparisons are unequal. If you actually look at all the numbers and not just the final number, you can see that on the Nvidia side they also added motherboard consumption, which for some reason is missing from the AMD numbers. Either they forgot or left it out on purpose; either way, they did not explain why it is missing, or why it appears in the Nvidia numbers.

Also, this is a bit strange: AMD seems to pull 428.80 W from a 300 W-rated connection during gaming, and Nvidia is pulling 358.80 W from 225 W-rated connections. Neither number really makes sense, unless both times they are also adding the 75 W from the PCIe lanes, in which case, why double it on the Nvidia side by putting it in an extra row under the motherboard total?

I tore this myth apart over at HardOCP.

Take this review of the Fury against a GTX 980:
http://m.hardocp.com/article/2015/07/10/asus_strix_r9_fury_dc3_video_card_review/1

And I ran the numbers...
Quote:
GTX980: 320 Watts, average FPS: 51.9
R9 Fury: 367 Watts, average FPS: 60.4

GTX980 = 0.162 frames/watt
Fury = 0.165 frames/watt

Or

GTX980 = 6.2 watts/frame
Fury = 6.1 watts/frame

So mathematically speaking, Kyle's conclusion was wrong.

Bahanime is correct. The R9 Fury delivers more performance per watt than the GTX 980. The R9 Fury is therefore more "efficient" in terms of power usage relative to delivered performance.

Of course, if you run FurMark or otherwise fully load the Fury, it will end up consuming more power, but in gaming? Nope.
Very, very interesting and eye-opening.
post #405 of 1175
Quote:
Originally Posted by Charcharo View Post

Well... the VAST majority of games these days do not work on day one, so it seems to me like the public has no idea what they want.
Not really; I was talking about product launch drivers. AMD seemed to suck at those back then. Had Tahiti (the 7970) launched with a polished driver, it would have destroyed GK104 and forced Nvidia to sell it as the GTX 660 Ti.

Another scenario: Hawaii. Had it had a polished launch driver, it would have been above the 780 Ti at launch instead of below it. You see, every time AMD has something new, Nvidia is able to counter it.

The 290X was also plagued by its crappy reference cooler at launch. Nobody was allowed to make AIB cards for the first few months; you either had to accept the cooler or wait. Fury X? It required closed-loop cooling and had no custom AIB cards at launch. I have no idea why AMD would intentionally restrict their flagship; the Fury X should have been completely open to AIBs from day one.

What about comparing AMD and Nvidia exclusive features? Nvidia adds extra effects to their PhysX games. Did AMD's Mantle titles get extra effects or particles? Nope. Despite Mantle's impressive draw-call throughput, AMD did not capitalize on that advantage. With the extra draw calls, developers could at least have thrown more things on screen without tanking the FPS. These are selling points for the public.

Now, back on topic: compare Ashes of the Singularity against an Nvidia GameWorks title, say Assassin's Creed. Clearly Assassin's Creed is the far more popular title.

Oh yeah, I almost forgot: let's talk about WHQL drivers. There was a period where AMD's newest WHQL driver was six months old. That wouldn't matter much to enthusiasts, since they'll use the newer beta driver. But if you are a casual user, AMD recommends you download the WHQL driver, and when it obviously isn't ready for the latest game, you'll experience the bugs or broken features on that driver and conclude that "AMD drivers suck".
Quote:
Originally Posted by Defoler View Post

Those two comparisons are unequal. If you actually look at all the numbers and not the final number, you can see that on the nvidia side they also added motherboard consumption, which for some reason is missing from the AMD numbers.
Either they forgot, left out on purpose, but either way, they did not explain why it is not there, or why it is on the nvidia numbers.

Also this is a bit strange, as AMD seems to pull 428.80W from the 300W rated connection during gaming, and nvidia are pulling 358.80W from 225W rated connections. Both of these numbers aren't really making sense, unless on both times they are also adding the 75W from the PCIE lanes, in which case, why double it on the nvidia side as putting it on an extra set under motherboard total?
The two products also share the same TDP rating, so it is no surprise they show similar power consumption. Fiji came out after AMD had already lost considerable market share.

He seems to conveniently ignore the 290X (Hawaii, as it was codenamed at launch) vs the GTX 980, the R9 285 vs the GTX 960, the GTX 950 vs the 7870/270X, and the 750 Ti vs the 265. That is the whole product stack being less efficient for the performance delivered.

This is the reason I said AMD has lackluster GPUs overall. They need a better "complete package" in everything.
Edited by Clocknut - 2/28/16 at 6:05pm
post #406 of 1175
Quote:
Originally Posted by Mahigan View Post

I tore this myth apart over at HardOCP.

Take this review of the Fury against a GTX 980:
http://m.hardocp.com/article/2015/07/10/asus_strix_r9_fury_dc3_video_card_review/1

And I ran the numbers...
Of course, if you run FurMark or otherwise fully load the Fury, it will end up consuming more power, but in gaming? Nope.

And yet, TechPowerUp's performance-per-watt charts say otherwise:

https://www.techpowerup.com/mobile/reviews/ASUS/R9_Fury_Strix/32.html
Edited by Forceman - 2/28/16 at 6:00pm
post #407 of 1175

It all comes down to how you tested. If all you did was max out the Fury and then compare its maxed-out power usage to its average frames per second, you'd get that result.

What HardOCP did was measure the average power usage while actually playing those games, so all I had to do was compare that to the average FPS numbers from HardOCP.

That gave me those results. Feel free to calculate it yourself.
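A toy example (all numbers invented, not from either review) of why the two methodologies can disagree: dividing average FPS by a peak power reading penalises a card with spiky draw far more than dividing by the average draw does.

```python
# One brief 450 W spike in an otherwise 250 W trace.
avg_fps = 60.0
power_trace_w = [450.0, 250.0, 250.0, 250.0]

peak_w = max(power_trace_w)                      # 450 W
avg_w = sum(power_trace_w) / len(power_trace_w)  # 300 W

# Efficiency looks ~50% worse if you divide by the peak instead of the average.
print(f"fps/peak-W = {avg_fps / peak_w:.3f}, fps/avg-W = {avg_fps / avg_w:.3f}")
```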
post #408 of 1175
Quote:
Originally Posted by Mahigan View Post

It all comes down to how you tested. If all you did was max out the Fury and then compare its maxed-out power usage to its average frames per second, you'd get that result.

What HardOCP did was measure the average power usage while actually playing those games, so all I had to do was compare that to the average FPS numbers from HardOCP.

That gave me those results. Feel free to calculate it yourself.

So like this:
Quote:
The following graphs show the efficiency of the cards in our test group. We used the relative performance scores and the typical gaming power consumption result. These numbers are based on the performance summary with all games included.
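The metric the quote describes is a relative performance score divided by typical gaming power draw, normalised to a reference card. A minimal sketch with invented example values (not TechPowerUp's measurements):

```python
def perf_per_watt(relative_perf, typical_power_w):
    """Efficiency score: relative performance points per watt of typical draw."""
    return relative_perf / typical_power_w

# Hypothetical numbers for illustration only.
reference = perf_per_watt(100.0, 180.0)  # baseline card
candidate = perf_per_watt(110.0, 260.0)  # faster card, much higher draw

# A ratio below 1.0 means the candidate is less efficient by this metric,
# even though it delivers more absolute performance.
print(f"efficiency relative to reference: {candidate / reference:.2f}")
```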
post #409 of 1175
The 1920x1080 performance summary numbers don't seem to add up correctly.

The 980 is consistently 7-30 FPS behind the Fury Strix aside from a few games (P Cars, WoW, CoD), but is rated at 100% performance, the same as the Fury Strix, at the end.

I fully admit I did not break out a calculator and add every number together, but the numbers do look off just from keeping a running tally of the FPS deviations as I clicked through the pages.

That said, there should also be some debate over how the benchmarks are done, since running a "timedemo"-style benchmark gives deceiving numbers compared with actual in-game performance.
Edited by STEvil - 2/28/16 at 6:37pm
post #410 of 1175
Quote:

Not saying I agree one way or the other on whether a Fury is more efficient than a 980, because to me 50 watts either way is a moot point for high-end hardware anyway. But you can throw that graph out the window thanks to the disgrace that is P Cars being used in that average.

Titles like that aren't going to throw off the results, are they?
Edited by GorillaSceptre - 2/28/16 at 6:36pm