Overclock.net › Forums › Industry News › Hardware News › [Various] Futuremark Releases 3DMark Time Spy DirectX 12 Benchmark

[Various] Futuremark Releases 3DMark Time Spy DirectX 12 Benchmark - Page 66

post #651 of 772
Quote:
Originally Posted by Forceman View Post

So Futuremark makes an AMD-specific path and an Nvidia-specific path. Wouldn't that kind of defeat the purpose of calling it a benchmark, since you wouldn't be able to compare the scores? Plus you'd have the inevitable "path X is not doing the same work/visuals/whatever". It's really a no-win situation for them.

You do compare the scores...

The point is that with Vulkan and DX12 we have entered uncharted territory. Some disagree with me, and I get that, but we are no longer in the old days when vendor-specific optimizations were frowned upon. We have entered a new reality in which vendor-specific optimizations will be used, and we need tools to gauge the objectivity of game developers.

I think we can all agree that Total War: Warhammer is not objective in its DX12 implementation. The problem is... we have no evidence to back this up other than seeing Pascal struggle in that title.

If we had a tool (a benchmark) that was optimized for both vendors, then we would know when someone is not playing fair (not playing fair meaning they have implemented a single path optimized for a particular IHV).

3DMark ought to be playing that role imo.

New game benchmarks will be based on optimized paths. Make no mistake about it.

Just the other day RemiJ was telling me that we cannot count Doom's Vulkan results because NV does not yet have its async path implemented (or maybe that was someone else). It got me thinking, and I see a larger issue.

The issue is that both the red and green teams have an interest in getting the most bang for their buck. This should be something that unifies everyone. We should be calling for synthetic benchmarks that are fully optimized for both IHVs, because games will be using GPUOpen and Nvidia's equivalent. Games are partnering with AMD and Nvidia, and games are being tailored for one architecture over the other.

We have no way of determining just how one-sided these games are because we lack the tools... so what we are left with are flame wars between team green and team red.

Something has got to give.
Edited by Mahigan - 7/19/16 at 8:27pm
Kn0wledge
(20 items)
 
Pati3nce
(14 items)
 
Wisd0m
(10 items)
 
post #652 of 772
Oh and for people claiming that concurrency and parallelism are the same thing... no.

http://www.linux-mag.com/id/7411/

Concurrency and parallelism are two related but distinct concepts:
Quote:
Let’s start with concurrency. A concurrent program or algorithm is one where operations can occur at the same time. For instance, a simple integration, where numbers are summed over an interval. The interval can be broken into many concurrent sums of smaller sub-intervals. As I like to say, concurrency is a property of the program. Parallel execution is when the concurrent parts are executed at the same time on separate processors. The distinction is subtle, but important. And, parallel execution is a property of the machine, not the program. If execution efficiency is important (i.e. you want things to go faster by adding more cores), then the question you need to ask is “If I run everything that is concurrent in parallel, will my code run faster?” If the answer were “yes” then we would not be having this discussion. And, since the answer, is “no”, then the question is “What should run in parallel?” which is obviously, the portions of code that lower execution time.

And that is what I was talking about when I mentioned the differences between the two concepts. GCN can handle both concurrent and parallel execution of tasks. Concurrent execution fills gaps in the execution pipeline, whereas parallel execution runs two things at once. In essence... you end up with very high execution efficiency.
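The distinction the quoted article draws can be sketched in a few lines of Python. This is an illustrative example of my own, using the article's integration-by-sub-intervals scenario; the names and numbers are made up:

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    # Sum one sub-interval; each sub-sum is independent of the others,
    # so the PROGRAM is concurrent by construction.
    lo, hi = bounds
    return sum(range(lo, hi))

# Break [0, 4_000_000) into four concurrent parts.
intervals = [(i * 1_000_000, (i + 1) * 1_000_000) for i in range(4)]

# Run the concurrent parts on a thread pool. In CPython the GIL makes
# these CPU-bound threads interleave rather than run simultaneously:
# concurrency without parallelism. Swapping in ProcessPoolExecutor would
# run the same concurrent parts in parallel on separate cores, because
# parallel execution is a property of the MACHINE/runtime, not the program.
with ThreadPoolExecutor(max_workers=4) as ex:
    total = sum(ex.map(partial_sum, intervals))

assert total == sum(range(4_000_000))
```

Note the code itself never changes between the two executors; only where the concurrent parts execute changes, which is exactly the article's point.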

Case in point..

What GCN can do...
Quote:
Command lists within a given queue must execute synchronously, while those in different queues can execute asynchronously (i.e. concurrently and in parallel). Overlapping tasks in multiple queues maximize the potential for performance improvement.


Just looking at the Graphics and Compute tasks (ignoring the Copy queue for now)...

Parallel
Shadow Maps - G-buffer - Transparent and UI running in parallel with Physics (particle sim) - Prepare SM - Tile Deferred

Concurrent
AA/AO and Tonemap

Those two terms are related but distinct. Anyone who tells you otherwise is likely getting their knowledge from a dictionary, not from the IT field.
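The payoff of the multi-queue rule quoted above ("command lists within a given queue must execute synchronously, while those in different queues can execute asynchronously") can be shown with a toy timing model. The task names echo the breakdown above, but every number here is hypothetical, and the "ideal overlap" math ignores that real GPUs share execution units between queues:

```python
# Hypothetical per-task costs in milliseconds (illustrative only).
graphics_queue = [("shadow maps", 2.0), ("g-buffer", 3.0),
                  ("transparent/UI", 1.0), ("tile deferred", 2.5)]
compute_queue = [("particle sim", 4.0), ("AA/AO", 1.5), ("tonemap", 0.5)]

def queue_time(queue):
    # Within one queue, command lists execute in submission order,
    # so a queue's total time is just the sum of its tasks.
    return sum(cost for _name, cost in queue)

# One queue for everything: all work serializes.
serial = queue_time(graphics_queue) + queue_time(compute_queue)

# Two independent queues executing asynchronously: with ideal overlap
# the frame is bound only by the longer queue.
overlapped = max(queue_time(graphics_queue), queue_time(compute_queue))

print(f"serial: {serial} ms, overlapped: {overlapped} ms")
```

With these made-up numbers the serial frame takes 14.5 ms while the overlapped one takes 8.5 ms, which is the "fill the gaps" benefit in miniature: the compute work hides behind the graphics work instead of queuing after it.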
Kn0wledge
(20 items)
 
Pati3nce
(14 items)
 
Wisd0m
(10 items)
 
post #653 of 772
If AMD has approved Time Spy and apparently has no problem with how it works, all this outrage is pretty much pointless.
My home PC
(16 items)
 
  
CPUMotherboardGraphicsRAM
AMD Threadripper 1950x Gigabyte Aorus X399 Gaming 7  MSI Geforce GTX 1080ti Gaming X G.Skill DDR4 3600 CL16 
Hard DriveHard DriveCoolingOS
Samsung Evo 840 500GB Samsung 960 Pro 500GB Noctua NH-U14S TR4 Windows 10 Pro 
MonitorMonitorKeyboardPower
Dell U2711 Samsung 55" 4k Corsair K70  EVGA SuperNova G2 1300W 
CaseMouseAudio
Corsair Carbide Air 540 Logitech G502 Denon AVR-X3300W 
post #654 of 772
"New game benchmarks will be based on optimized paths. Make no mistake about it."


So at that point, doesn't it just make sense to have Nvidia and AMD make their own benchmarks? Who is going to have better knowledge of how to optimize their own hardware than the companies who make the hardware?

The problem with DX12 is that we have all been shown its potential when things are optimized for hardware... but the tools necessary to do such optimizations are not part of the fundamental industry standard that is DX12. They are simply optional tools developers get to use if they wish. So when a game doesn't perform to the level the hardware is capable of... it's on the game developers, not necessarily the hardware. That, I think, was the whole point. AMD has shown an inability to play by Nvidia's and Intel's rules. They can't compete making the same tech. What they did instead was leverage the tech they were good at making... and then show the world that, hey, if you developers use the right tools, this tech is really good. Now developers are at a crossroads. They have to decide what tools to put into their games. Some will favor green tech, some red tech. Some will take a balanced approach and not get the most out of either. Developers have limited resources too; they have to choose. Luckily, in some cases support can be added after the fact, like in Doom.

The detail that I think seals the fate of the future market is this... consoles make up something like 90% of gaming revenue. All three next-gen major consoles will be built on AMD hardware, and if they get their way, Vulkan will be the primary API for all three. That means seamless porting between consoles and into the PC market. And guess what hardware is sitting on PCs right now, already optimized for that Vulkan code? That kind of influence can't be ignored. Nvidia will have to start playing AMD's game on some level if that happens; otherwise their tech will show missing performance on non-green-optimized titles. I have no doubt their next generation of architecture will be more similar to AMD's GCN, which will push for more and more of the tools in the toolbox to become industry standards, not optional features. That is my guess for the future, for what it's worth. Not much, I know.

[edited for spelling]
Edited by gapottberg - 7/19/16 at 9:15pm
post #655 of 772
I'm certainly not outraged at all. I would just like to see a DX12 benchmark that actually utilizes all of the features of DX12.
post #656 of 772
Quote:
Originally Posted by Mahigan View Post


Just the other day RemiJ was telling me that we cannot count Doom's Vulkan results because NV does not yet have its async path implemented (or maybe that was someone else). It got me thinking, and I see a larger issue.

I believe that was me, but I think it's fair to give a bit of context. It's not like I was simply saying that Doom's Vulkan results shouldn't be counted. I was responding, in kind, to the criticism that Time Spy shouldn't be counted because people were claiming it leans towards favoring Nvidia's architecture. My point was simply that if Nvidia's "async" path (in quotations because it's different from AMD's) is being worked on, as the developer stated, then those benchmarks are ultimately worthless because they would soon change, and the case could be made that Doom's Vulkan code path favors AMD's architecture, since they didn't wait to release the "async" code paths for both at the same time. If Nvidia and id never release a Vulkan "async" update, then the results stand as they are.

We all realize that synthetic benchmarks should serve the purpose of strictly pushing hardware to its fullest in a fair, unbiased way, while games inherently deal with a wider range of variables and markets to consider and thus may show more favorably for one side or the other. That's simply the business. I don't think what Time Spy is doing is disingenuous. I think it shows fair results for what it is. I just think the app itself could be a bit clearer about what level of DX12 feature set it's using.

For the record, I think all these benchmarks count. In retrospect, this predicament with 3DMark has actually been useful for something, right? Whether that's to gauge baseline performance in DX12 FL11, or to bring further insight and understanding into what DX12 actually does and how exactly it's defined, as well as how Futuremark conducts and designs its benchmarks.

I learned a lot of things I didn't know before because of this.
Edited by Remij - 7/19/16 at 9:05pm
My main PC
(8 items)
 
  
CPUMotherboardGraphicsRAM
Intel i7 6700k Asus ROG Maximus VIII Gene Nvidia GTX 1080Ti G.Skill Ripjaws 
Hard DriveOSKeyboardPower
Samsung 850 EVO  Windows 10 Razer Blackwidow Chroma EVGA Supernova 1300w 
post #657 of 772
Quote:
Originally Posted by gapottberg View Post

"New game benchmarks will be based on optimized paths. Make no mistake about it."


So at that point, doesn't it just make sense to have Nvidia and AMD make their own benchmarks? Who is going to have better knowledge of how to optimize their own hardware than the companies who make the hardware?

The problem with DX12 is that we have all been shown its potential when things are optimized for hardware... but the tools necessary to do such optimizations are not part of the fundamental industry standard that is DX12. They are simply optional tools developers get to use if they wish. So when a game doesn't perform to the level the hardware is capable of... it's on the game developers, not necessarily the hardware. That, I think, was the whole point. AMD has shown an inability to play by Nvidia's and Intel's rules. They can't compete making the same tech. What they did instead was leverage the tech they were good at making... and then show the world that, hey, if you developers use the right tools, this tech is really good. Now developers are at a crossroads. They have to decide what tools to put into their games. Some will favor green tech, some red tech. Some will take a balanced approach and not get the most out of either. Developers have limited resources too; they have to choose. Luckily, in some cases support can be added after the fact, like in Doom.

The detail that I think seals the fate of the future market is this... consoles make up something like 90% of gaming revenue. All three next-gen major consoles will be built on AMD hardware, and if they get their way, Vulkan will be the primary API for all three. That means seamless porting between consoles and into the PC market. And guess what hardware is sitting on PCs right now, already optimized for that Vulkan code? That kind of influence can't be ignored. Nvidia will have to start playing AMD's game on some level if that happens; otherwise their tech will show missing performance on non-green-optimized titles. I have no doubt their next generation of architecture will be more similar to AMD's GCN, which will push for more and more of the tools in the toolbox to become industry standards, not optional features. That is my guess for the future, for what it's worth. Not much, I know.

And believe it or not... I personally do not like that idea. This has the potential to hurt NV consumers badly. PC-only titles have the potential to hurt AMD consumers badly.
Kn0wledge
(20 items)
 
Pati3nce
(14 items)
 
Wisd0m
(10 items)
 
post #658 of 772
The thing is, they made it to show what they believe the difference between AMD and NV will look like in upcoming games. But that's not actually right, since all the DX12 games available right now show a totally different picture.
post #659 of 772
Quote:
Originally Posted by Mahigan View Post

And believe it or not... I personally do not like that idea. This has the potential to hurt NV consumers badly. PC-only titles have the potential to hurt AMD consumers badly.

Welcome to the DX12/Vulkan era. We are probably going to see a larger performance disparity between sponsored titles than ever before.
My home PC
(16 items)
 
  
CPUMotherboardGraphicsRAM
AMD Threadripper 1950x Gigabyte Aorus X399 Gaming 7  MSI Geforce GTX 1080ti Gaming X G.Skill DDR4 3600 CL16 
Hard DriveHard DriveCoolingOS
Samsung Evo 840 500GB Samsung 960 Pro 500GB Noctua NH-U14S TR4 Windows 10 Pro 
MonitorMonitorKeyboardPower
Dell U2711 Samsung 55" 4k Corsair K70  EVGA SuperNova G2 1300W 
CaseMouseAudio
Corsair Carbide Air 540 Logitech G502 Denon AVR-X3300W 
post #660 of 772
Quote:
Originally Posted by Mahigan View Post

And believe it or not... I personally do not like that idea. This has the potential to hurt NV consumers badly. PC-only titles have the potential to hurt AMD consumers badly.

It's been brought up before, but DX12 has the potential to really split the marketplace unless both AMD and Nvidia start to converge on features. If all AMD-supported titles run heavy async to the detriment of Nvidia, and all GameWorks titles feature even more Nvidia-specific features to the detriment of AMD, you will end up with games where performance could be massively one-sided toward one IHV or the other. Which is bad for everyone.