Originally Posted by Serandur
That discrepancy caused by Gameworks is fairly consistent no matter how much additional GPU demand is placed on both cards by the resolution. Therefore using the discrepancy at 1080p as grounds for arguing that the 980 has excess CPU overhead is faulty. There's no supporting data from these tests for it.
That was your point; the fact that it extends to all resolutions doesn't count against mine.
As to your second point: considering the only thing that changes (390X positioning relative to the 980) in these charts across resolutions, any unexpected scaling issues seem inherent to the 390X; either with or without Gameworks. My whole point of showing the without Gameworks results, however, was to show something was up on AMD's end in this game regardless of what Gameworks does additionally.
No, your whole point was that there was 'higher CPU overhead on AMD'.
Also, I thought it was the common consensus that 390Xs were pretty much about on par with 980s at 1440p without Gameworks these days. Therefore a significant performance deficit for either card is unusual and indeed implies an issue. Ignoring the lack of evidence for a 980 CPU overhead problem and all the info floating around out there proving AMD to be the company with actual overhead issues and not Nvidia, why would a 980 with more severe overhead still perform so much better than the 390X at 1440p?
Such severe front-end limitations on an overclocked/aftermarket Hawaii (390X), even at 1440p, with a butt-ugly game like Fallout 4 sounds really unlikely to me, but sure it's a possibility. It's just that Fallout 4 has some horrendously low poly-counts as far as I can tell, Bethesda games have often had severe issues on the CPU side of things, and Hawaii's theoretical geometry throughput isn't usually considered an issue. They didn't test other cards than the 390X and 980 without Gameworks unfortunately (probably due to time constraints) so it's hard to get an idea, but I really don't see how the 390X could have a significant front-end problem with this game, of all games. CPU issues seem far more likely with Bethesda.
With high levels of tessellation, yes. Of course we haven't seen without, but regardless of whether there's a front-end limitation or CPU overhead one (or both), I've got a bad feeling about how the card would hold up without Gameworks.
The AMD CPU overhead issue came to prominence when Eurogamer published figures on AMD's draw-call deficit, especially with lower-end CPUs. Their recommendation for avoiding it was an i5, let alone the overclocked i7s used in these benchmark reviews. In fact, most of the brouhaha was about AMD cards performing worse for real users than in reviews, because reviewers don't test with the low-end CPUs that are common among the gaming population.
TPU's latest 980 Ti Lightning review shows the 390X basically on par with the 980 at 1080p, so just saying 'AMD CPU overhead' doesn't really cut it. Heck, even Eurogamer's analysis wasn't that clear-cut: it was AMD's low-end cards, not the price equivalents (which have substantially beefier hardware), that showed performance degradation with an i3 similar to the 750 Ti's — and of course AMD have improved since then.
As for why the 980 would do better when it's AMD who are said to have more CPU overhead: that depends on whether the game is GPU-limited to the same degree on both sides.
We've already had a recent game performing poorly on the Fury X, to the point where it's level with a 970 (as in this one).
So which side has the higher CPU overhead there?