
[Various] Futuremark Releases 3DMark Time Spy DirectX 12 Benchmark - Page 68

post #671 of 772
If something is well understood, it would be extremely easy to dispel any FUD. The sad truth is that the muddier the understanding, the easier it is for FUD to spread like wildfire.
post #672 of 772
Quote:
Originally Posted by Mahigan View Post

And that might have something to do with those games having AMD-optimized paths more so than NV-optimized paths. That is sort of what I am getting out of all of this. NV consumers might be getting jerked around in the near future if this trend holds. Clearly... if optimized differently... NV hardware can perform better.

Is it fair? If we set aside any personal bias we may have, that is.

I am not asking whether it is fair to AMD or NV... I am asking... is it fair to RemiJ or other nV users?

And if games end up more like 3DMark... is it fair to AMD consumers?
That's why I believe two code paths, one for each vendor, are the way to go. Use both to their fullest and may the better hardware win. That means healthy competition, not this crap we are seeing nowadays.
post #673 of 772
Quote:
Originally Posted by Hueristic View Post

Well if you paid for this DX11 Benchmark then get your money back. It's called false advertising and they are culpable.

This is a false statement. Time Spy uses the DX12 API and the DX12 API only.
post #674 of 772
Quote:
Originally Posted by Mahigan View Post

The issue is that both the red and green teams have an interest in getting the most bang for their buck. This should be something that unifies everyone. We should be calling for synthetic benchmarks which are fully optimized for both IHVs, because games will be using GPUOpen and nVIDIA's equivalent. Games are partnering with AMD and nVIDIA, and games are being tailored for one architecture over the other.

So why do both AMD and NVIDIA specifically ask us (Futuremark) to *not* do vendor-specific paths, as it would make 3DMark less useful to them?

They don't seem to be calling for a synthetic benchmark that has multiple execution paths, but I assume you know better.
post #675 of 772
Quote:
Originally Posted by Majin SSJ Eric View Post

I'm certainly not outraged at all. I would just like to see a DX12 benchmark that actually utilizes all of the features of DX12.

The problem is that the graphics card market today cannot run such a benchmark, and Futuremark would be roasted alive because it wouldn't run on most cards that can run DX12 games (it would rule out Haswell, Broadwell, GCN 1.0 and Kepler).

The DX12 API is the main thing and brings the largest difference vs. older APIs. FL12 is nice and all, but if we went for a full FL12_1 requirement, no AMD card could run the test. Is that what you suggest we should've done?

...and yes, I'm fully aware of the various tiers of various features and how different architectures support them. It is frankly quite a tangle and will make benchmark development for FL12... interesting. We can talk about that when the *next* 3DMark ships, okay?
Edited by FMJarnis - 7/19/16 at 10:55pm
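
To put the feature-level point in concrete terms, here is a minimal sketch (not Futuremark's code; error handling stripped, first adapter only) of how a D3D12 application can probe which feature levels a card's driver exposes. A hard FL12_1 requirement would already fail at this step on hardware that only reports 11_0, 11_1 or 12_0, even though that hardware runs DX12 games fine.

Code:
#include <d3d12.h>
#include <dxgi1_4.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "dxgi.lib")

int main() {
    // Grab the first adapter only, for brevity; real code would enumerate them all.
    IDXGIFactory4* factory = nullptr;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));
    IDXGIAdapter1* adapter = nullptr;
    factory->EnumAdapters1(0, &adapter);

    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_12_1, D3D_FEATURE_LEVEL_12_0,
        D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0 };

    for (D3D_FEATURE_LEVEL fl : levels) {
        // Passing nullptr for the device pointer only asks whether creation at
        // this minimum feature level would succeed, without creating anything.
        HRESULT hr = D3D12CreateDevice(adapter, fl, __uuidof(ID3D12Device), nullptr);
        printf("FL 0x%x: %s\n", (unsigned)fl, SUCCEEDED(hr) ? "supported" : "not supported");
    }

    adapter->Release();
    factory->Release();
    return 0;
}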
post #676 of 772
Quote:
Originally Posted by FMJarnis View Post

So why do both AMD and NVIDIA specifically ask us (Futuremark) to *not* do vendor-specific paths, as it would make 3DMark less useful to them?

They don't seem to be calling for a synthetic benchmark that has multiple execution paths, but I assume you know better.

Because AMD and nVIDIA (as well as Microsoft) called on games to have multiple execution paths (IHV-specific paths) at the last GDC.

If 3DMark does not incorporate IHV-specific paths, then what is the use of 3DMark? If 3DMark does not mirror what games are doing, then what is it good for? What can it tell us about the hardware and how it will behave in DX12 titles?

You are siding with the corporations... and while many gamers may be red or green team, at the end of the day what they care about is getting to play their games without being shafted due to corporate agreements between some big game studio and a specific IHV.

If you look at the damage GameWorks has done to several consumers (Project CARS comes to mind), then I guess you can see why many folks are reluctant to accept this sort of behavior.

As for me *knowing better*... are we not the consumers? Are we not the ones purchasing these products? Are we not the ones who were shafted by the GTX 970 claiming 4GB of memory but coming equipped with 3.5GB, or the ones who bought an RX 480 thinking it was a 150W card only to find that it consumed considerably more?

Who buys 3DMark? AMD and nVIDIA, or gamers?
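
For what it's worth, the kind of IHV-specific branching being described here is not exotic. Below is a hypothetical sketch (this is not how Time Spy works, and the path names are made up) of an engine reading the adapter's PCI vendor ID and picking a vendor-tuned render path.

Code:
#include <dxgi1_4.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

enum class RenderPath { Generic, AmdTuned, NvidiaTuned, IntelTuned };

// Pick a vendor-tuned path based on the adapter's PCI vendor ID.
RenderPath PickRenderPath(IDXGIAdapter1* adapter) {
    DXGI_ADAPTER_DESC1 desc = {};
    adapter->GetDesc1(&desc);
    switch (desc.VendorId) {
        case 0x1002: return RenderPath::AmdTuned;     // AMD
        case 0x10DE: return RenderPath::NvidiaTuned;  // NVIDIA
        case 0x8086: return RenderPath::IntelTuned;   // Intel
        default:     return RenderPath::Generic;
    }
}

int main() {
    IDXGIFactory4* factory = nullptr;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));
    IDXGIAdapter1* adapter = nullptr;
    factory->EnumAdapters1(0, &adapter);

    printf("Render path: %d\n", (int)PickRenderPath(adapter));

    adapter->Release();
    factory->Release();
    return 0;
}

The catch, raised elsewhere in this thread, is comparability: once each vendor runs different code, a single score no longer measures the same workload.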
post #677 of 772
Quote:
That's why I believe two code paths, one for each vendor, are the way to go. Use both to their fullest and may the better hardware win. That means healthy competition, not this crap we are seeing nowadays.

This is a problem for several reasons, not the least of which being that developers may not have the time, resources, and funding to basically write a game's code twice. Look how long it already takes to get ports of console games; you're talking about similar workloads if you take optimization for each team to the extreme.

A lot of this becomes moot if AMD gets Vulkan onto all the consoles running on AMD hardware. Like it or not, the gaming market lives and dies by console sales as a whole; PC is just a small slice of the pie. If the consoles become unified in their API with Vulkan, which can near-seamlessly be ported to PC, then game developers have a huge incentive not only to learn Vulkan but to use it to decrease the workload on development teams. And whose hardware will they be optimizing for on consoles? AMD. And whose hardware will be 100% ready to take advantage of said optimizations with seamless ports to PC? AMD.

It's not like NV cards don't perform well on AMD-optimized code as it is. They just don't get the bragging rights for being the undisputed fastest, or the leads they once carried. It sucks when the market is stacked against you as far as what your tech is capable of vs. what developers are willing and able to code for. I can see a legitimate case for the tables being turned in the next five years, from an API landscape that held AMD's tech back to one that is incentivised to support it. Whether you are a fanboy of one camp or the other, the news is good for us as consumers, because as the pendulum swings we will reap the benefits of a highly contested market space with lower prices and better performance per dollar than ever before.

That's my hope. Make it a good fight, boys, and may the best team win. And by that I mean TEAM consumer!
post #678 of 772
Quote:
Originally Posted by FMJarnis View Post

The problem is that the graphics card market today cannot run such a benchmark, and Futuremark would be roasted alive because it wouldn't run on most cards that can run DX12 games (it would rule out Haswell, Broadwell, GCN 1.0 and Kepler).

The DX12 API is the main thing and brings the largest difference vs. older APIs. FL12 is nice and all, but if we went for a full FL12_1 requirement, no AMD card could run the test. Is that what you suggest we should've done?

...and yes, I'm fully aware of the various tiers of various features and how different architectures support them. It is frankly quite a tangle and will make benchmark development for FL12... interesting. We can talk about that when the *next* 3DMark ships, okay?

Why isn't it possible to offer a benchmark that supports all of the DX12 feature set and then allow features to be turned on or off depending on what hardware you have? You can still do apples-to-apples comparisons depending on which feature set was selected, right?
post #679 of 772
Quote:
Originally Posted by Mahigan View Post

Who buys 3DMark? AMD and nVIDIA? or Gamers?

Both.

AMD, NVIDIA, Intel and various other members pay a lot more for it. They want it to be an impartial and unbiased tool. Plenty of biased games out there to test with, I suppose.
Quote:
Originally Posted by Majin SSJ Eric View Post

Why isn't it possible to offer a benchmark that supports all of the DX12 feature set and then allow features to be turned on or off depending on what hardware you have? You can still do apples-to-apples comparisons depending on which feature set was selected, right?

Well, one obvious thing is that if you use effects that rely on some specific feature that is not available on all cards, the rendering output is different on different cards. It's not a very fair comparison if they don't all do the same work.

But hey, this definitely is something our engineering team will have to work out when the next 3DMark is developed. And we'll have AMD, NVIDIA and Intel engineers right there in the discussion giving their input. Time Spy is our first DX12 test; it won't be the only one.
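
For the curious, per-feature gating of the kind being discussed is straightforward to query in D3D12. A minimal sketch, assuming you already have an ID3D12Device (the effect names are just illustrative examples): check individual optional-feature tiers and switch effects on or off per card. The comparability problem described above remains, since cards that skip an effect are doing less work.

Code:
#include <d3d12.h>

struct OptionalEffects {
    bool useConservativeRaster = false;      // e.g. gate a voxelization-based effect
    bool useRasterizerOrderedViews = false;  // e.g. gate order-independent transparency
};

// Query optional-feature support from an existing device and decide which
// optional effects this card can run.
OptionalEffects QueryOptionalEffects(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));

    OptionalEffects fx;
    fx.useConservativeRaster =
        opts.ConservativeRasterizationTier != D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED;
    fx.useRasterizerOrderedViews = (opts.ROVsSupported == TRUE);
    return fx;
}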
post #680 of 772
Quote:
Originally Posted by Majin SSJ Eric View Post

Why isn't it possible to offer a benchmark that supports all of the DX12 feature set and then allow features to be turned on or off depending on what hardware you have? You can still do apples-to-apples comparisons depending on which feature set was selected, right?

Not sure why that's so hard to understand, especially since past 3DMarks did exactly that. A single render path for all vendors has only been the case for the past couple of releases; prior to that, different sections of the benchmark were opened up depending on the GPU feature set.