[Various] Ashes of the Singularity DX12 Benchmarks

 
post #1191 of 2682 (permalink) Old 08-29-2015, 09:23 AM
New to Overclock.net
 
Noufel
 
Join Date: Apr 2012
Location: Constantine, Algeria
Posts: 1,548
Rep: 54 (Unique: 40)
Quote:
Originally Posted by GorillaSceptre

Quote:
Originally Posted by Themisseble

It's very simple. As Mahigan explained, AMD has advantages in DX12:
- Lower API overhead (no more CPU bottleneck at 1080p)
- Better parallelism
- Async shaders

NVIDIA will always be better at tessellation, whether on DX9, DX11, or DX12. Tessellation leans heavily on rasterizer efficiency, and with better tessellation performance and 50% more ROPs, NVIDIA has a huge advantage there.

I think you quoted the wrong post?

As far as Mahigan's theory goes, well... I won't use the word debunked, but people on the more tech-savvy forums disagree with his reasoning. The consensus seems to be that the biggest differences will come from how the game is programmed; it's not as simple as X is better than Y.
Not good if true. I think the majority of devs will optimize their games for Nvidia GPUs (market share and all), and AMD will continue to have performance problems against Nvidia, though less severe than under DX11. :(
Noufel is offline  
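For context on what "async shaders" means at the API level: D3D12 lets an application create a dedicated compute queue alongside the usual graphics (direct) queue, and the debate in this thread is about whether the GPU can genuinely overlap work submitted to the two. A minimal C++ sketch, assuming an already-created ID3D12Device and omitting all error handling:

Code:
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Create one graphics ("direct") queue and one dedicated compute queue.
void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    // The direct queue accepts graphics, compute, and copy work.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    // Work submitted to a separate compute queue MAY overlap with graphics,
    // but whether it actually runs concurrently is decided by the driver and
    // hardware scheduler, not by the API.
    D3D12_COMMAND_QUEUE_DESC cmpDesc = {};
    cmpDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&cmpDesc, IID_PPV_ARGS(&computeQueue));
}

The same code runs on GCN and Maxwell; what differs is how much of the compute work the hardware can actually execute concurrently with graphics, which is why the two vendors can diverge on an identical DX12 workload.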
post #1192 of 2682 (permalink) Old 08-29-2015, 09:33 AM
Linux Lobbyist
 
semitope
 
Join Date: Jul 2013
Location: Florida/Jamaica
Posts: 536
Rep: 32 (Unique: 20)
Quote:
Originally Posted by Noufel

Not good if true. I think the majority of devs will optimize their games for Nvidia GPUs (market share and all), and AMD will continue to have performance problems against Nvidia, though less severe than under DX11. :(


Most AAA games are on consoles, so it's unlikely the programming will favor Nvidia when GameWorks is not involved. I also doubt it's just down to how things are programmed. Sure, you might reduce Nvidia's hardware issues by making things harder on the devs, but there are hardware aspects all the same. Doing things in a way that helps Nvidia in this case probably won't hurt AMD's performance if it's still standard DX12 usage.
semitope is offline  
post #1193 of 2682 (permalink) Old 08-29-2015, 09:37 AM
New to Overclock.net
 
Dudewitbow
 
Join Date: Oct 2012
Location: Pittsburg, California
Posts: 1,736
Rep: 87 (Unique: 71)
Quote:
Originally Posted by Noufel

Not good if true. I think the majority of devs will optimize their games for Nvidia GPUs (market share and all), and AMD will continue to have performance problems against Nvidia, though less severe than under DX11. :(
Just keep in mind that the AAA games in question tend to be on consoles too, which are GCN-powered. Market share is relatively equal once consoles are counted in AMD's favor. A dev would probably be more prone to optimizing for Nvidia GPUs (disregarding AMD or Nvidia influence) only if it's a standalone PC title, since then the developers don't necessarily have to work with GCN hardware directly at any point in development.

Dudewitbow is offline  
post #1194 of 2682 (permalink) Old 08-29-2015, 09:48 AM
New to Overclock.net
 
GorillaSceptre
 
Join Date: Oct 2014
Posts: 2,373
Rep: 334 (Unique: 132)
Quote:
Originally Posted by semitope

I suggested looking at other parts of the system many posts ago.

I want a copy of MGS5.
Most AAA games are on consoles, so it's unlikely the programming will favor Nvidia when GameWorks is not involved. I also doubt it's just down to how things are programmed. Sure, you might reduce Nvidia's hardware issues by making things harder on the devs, but there are hardware aspects all the same. Doing things in a way that helps Nvidia in this case probably won't hurt AMD's performance if it's still standard DX12 usage.

That's assuming there are hardware issues, and most disagree with that assessment.

We need more evidence than a single benchmark from an alpha game, from a studio that has been heavily involved in promoting Mantle. Nvidia also came out and said the results aren't representative.

I'm not saying anyone's lying or that Stardock is bought off, just that we need more testing. Not to mention it isn't exactly a landslide win for AMD; the two are within a couple of frames of each other, with Nvidia sometimes coming out on top.

With all the talk of DX12 and its performance benefits, Nvidia actually LOSING performance seems a bit strange.

CPU: 2600k @ 4.8 GHz
MB: Maximus IV Extreme-Z
GPU: MSI 390X
CASE: 650D
AUDIO: Xonar Essence One-AKG Q701
GorillaSceptre is offline  
post #1195 of 2682 (permalink) Old 08-29-2015, 09:50 AM
New to Overclock.net
 
KSIMP88
 
Join Date: Jan 2005
Location: Your moms
Posts: 9,826
Rep: 395 (Unique: 286)
Oh please let TES6 be DX12

Hodor, hodor hodor hodor hodor. Hodor hodor hodor hodor hodor hodor, hodor hodor. Hodor, hodor hodor, hodor hodor hodor hodor. -Hodor


KSIMP88 is offline  
post #1196 of 2682 (permalink) Old 08-29-2015, 09:58 AM
Linux Lobbyist
 
semitope
 
Join Date: Jul 2013
Location: Florida/Jamaica
Posts: 536
Rep: 32 (Unique: 20)
Quote:
Originally Posted by GorillaSceptre

That's assuming there are hardware issues, and most disagree with that assessment.

We need more evidence than a single benchmark from an alpha game, from a studio that has been heavily involved in promoting Mantle. Nvidia also came out and said the results aren't representative.

I'm not saying anyone's lying or that Stardock is bought off, just that we need more testing. Not to mention it isn't exactly a landslide win for AMD; the two are within a couple of frames of each other, with Nvidia sometimes coming out on top.

With all the talk of DX12 and its performance benefits, Nvidia actually LOSING performance seems a bit strange.

Losing performance is not strange when your hardware is more suited to another API. The actual framerate loss is probably bigger than it looks, since it wipes out the aspects of DX12 that do benefit Nvidia; their hardware limitation might be more significant than the results suggest. Struggling with an API feature is nothing new, and supporting a feature is not the same as doing it well. My guess is that a "representative" DX12 game will be one that does not use the features they're bad at.
semitope is offline  
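Semitope's "supporting is not the same as doing it well" point can be made concrete. D3D12 lets you query which optional features a device exposes, but nothing in the API reports how fast a feature runs; notably, there is no capability bit for async compute at all, since every D3D12 device must accept compute-queue submissions whether or not the hardware can overlap them with graphics. A minimal sketch, assuming 'device' was created elsewhere:

Code:
#include <d3d12.h>
#include <cstdio>

// Query optional-feature tiers. These report what the driver ACCEPTS,
// not how well the hardware performs it.
void PrintFeatureTiers(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                              &options, sizeof(options))))
    {
        std::printf("Resource binding tier: %d\n", options.ResourceBindingTier);
        std::printf("Tiled resources tier:  %d\n", options.TiledResourcesTier);
    }
}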
post #1197 of 2682 (permalink) Old 08-29-2015, 10:05 AM
New to Overclock.net
 
GorillaSceptre
 
Join Date: Oct 2014
Posts: 2,373
Rep: 334 (Unique: 132)
Quote:
Originally Posted by semitope

Losing performance is not strange when your hardware is more suited to another API. The actual framerate loss is probably bigger than it looks, since it wipes out the aspects of DX12 that do benefit Nvidia; their hardware limitation might be more significant than the results suggest. Struggling with an API feature is nothing new, and supporting a feature is not the same as doing it well. My guess is that a "representative" DX12 game will be one that does not use the features they're bad at.

You're obviously taking Mahigan's theory as fact. We may as well leave this discussion here, then.

CPU: 2600k @ 4.8 GHz
MB: Maximus IV Extreme-Z
GPU: MSI 390X
CASE: 650D
AUDIO: Xonar Essence One-AKG Q701
GorillaSceptre is offline  
post #1198 of 2682 (permalink) Old 08-29-2015, 10:15 AM
Linux Lobbyist
 
semitope
 
Join Date: Jul 2013
Location: Florida/Jamaica
Posts: 536
Rep: 32 (Unique: 20)
Quote:
Originally Posted by GorillaSceptre

You're obviously taking Mahigan's theory as fact. We may as well leave this discussion here, then.

He actually wasn't the first source for this information; I got it from a developer talking about VR, IIRC. Nvidia's hardware is not as robust currently.

It makes all kinds of sense, and I expect things might get worse. A Fury X only matching a 980 Ti means it's still not being fully utilized.
semitope is offline  
post #1199 of 2682 (permalink) Old 08-29-2015, 11:39 AM
New to Overclock.net
 
Join Date: Jul 2013
Location: Purgatory
Posts: 2,280
Rep: 125 (Unique: 82)
Quote:
Originally Posted by Noufel

Not good if true. I think the majority of devs will optimize their games for Nvidia GPUs (market share and all), and AMD will continue to have performance problems against Nvidia, though less severe than under DX11. :(

Plus rep, as you have hit the gist of the counterargument on the head.
Mahigan's theory appeals to me because he has gone to great lengths to research and share his opinions as to why AMD's architecture works better than Nvidia's if developers properly utilize the benefits of DX12 to reduce the overhead. All I have seen by way of counterargument is that his theory won't hold because of yet-to-be-seen optimizations for Nvidia, which I interpret as follows:

a) until Nvidia catches up with its Pascal architecture, or
b) until developers have been incentivized enough to code away from consumer-friendly DX12, putting PC gamers in the same position they were in with DX11. I.e., there ain't no such thing as free (lunch) performance; if you want more performance, you've got to pay for it. :P

But no one has proposed an alternative detailed theory that demystifies the DX12 performance riddle of the GPU makers. :P

Simplicity
provost is offline  
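The overhead reduction provost refers to has a concrete mechanism: D3D12 allows an engine to record command lists on several threads and submit them to the queue in one batch, instead of pushing every draw call through a single driver thread as D3D11 effectively does. A simplified sketch, assuming 'device' and 'queue' already exist, with the per-thread draw recording left as a comment:

Code:
#include <d3d12.h>
#include <thread>
#include <vector>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void RecordInParallel(ID3D12Device* device, ID3D12CommandQueue* queue)
{
    const int kThreads = 4;
    std::vector<ComPtr<ID3D12CommandAllocator>> allocators(kThreads);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kThreads);
    std::vector<std::thread> workers;

    for (int i = 0; i < kThreads; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        workers.emplace_back([&, i] {
            // Each thread would record its slice of the frame's draw calls here.
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    // One batched submission replaces many individual, serialized driver calls.
    ID3D12CommandList* raw[kThreads];
    for (int i = 0; i < kThreads; ++i) raw[i] = lists[i].Get();
    queue->ExecuteCommandLists(kThreads, raw);
}

This is the consumer-friendly part of DX12 that provost alludes to: the CPU-side win comes from the API design itself, independent of which GPU vendor is installed.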
post #1200 of 2682 (permalink) Old 08-29-2015, 12:21 PM
New to Overclock.net
 
PhantomTaco
 
Join Date: Apr 2012
Posts: 1,263
Rep: 88 (Unique: 76)
By all means correct me if I'm wrong, but there are a few things I don't understand. For starters, are these theories based on the single Ashes of the Singularity benchmark? IIRC the game was developed with AMD helping the dev out. Would it be crazy to assume choices were made that specifically improved performance for AMD? I'm not saying they necessarily made choices that hampered NVIDIA intentionally, or even directly, but if true, I'd assume some choices would specifically benefit AMD while not helping, or potentially hurting, NVIDIA hardware. And assuming this is all still based on Ashes alone, that's a single engine. There are at least half a dozen other engines out there that either have DX12 support or have it coming, and they won't necessarily behave the same way. So doesn't it seem a bit too early to draw conclusions from a sample size of 1?

PhantomTaco is offline  