Originally Posted by Mahigan
One thing, which I know will be controversial as a statement, is that everyone is looking at the Fury-X and scratching their heads... I'm not. I expected as much when I made my statements in the Ashes of the Singularity thread. I hate to repeat this line of reasoning, but I feel it is warranted.
The Fury-X will have a longer shelf life than a GTX 980 Ti. I don't care if you don't believe me, but it is nonetheless the truth.
Right now, around 20% of a game engine's work occurs in compute. This share has been growing, in small increments, for some time (which is why a 290X went from being 10% slower than a GTX 780 Ti, to 5% slower, and now, under DX12, far faster).
DX12 will accelerate this process, moving game development away from the traditional graphics pipeline and into the compute pipeline. The next iteration of the Nitrous engine will be 50/50 (compute/graphics). While the Fury-X holds its own as is right now, its front end (graphics) is its weakness. As game engines move towards compute, it will flex its muscle. This isn't an "if" statement but rather a matter of certainty (everyone who doubts me should listen to what the Vulkan and DX12 programmers and devs are saying, and read the white papers they're releasing).
The Fiji lineup gives us another 290X-style example: at launch it appears underwhelming, but with time it shows its merits.
As for Arctic Islands, I have no doubt that AMD will rework the front end of that architecture, but I also think they'll boost the compute efficiency of the GCN architecture. I wouldn't be surprised to see it contain fewer ALUs than Fiji, yet more efficient ones. I just don't see the viability of scaling to more and more Compute Units without improving the efficiency of the ALUs that make up those CUs.
We will find ourselves, two years from now, having the same conversation we're having about the 290/390 series (Hawaii/Grenada), but this time in relation to Fiji. By then nobody will be discussing the GTX 980 Ti, the same way nobody discusses the GTX 780 Ti today.
I don't think this strategy is paying off for AMD, though. Not because it isn't obvious to anyone who understands GPU architectures and the direction of the industry (devs), but rather because of the short-sighted fixation on benchmarks and bragging rights among PC enthusiasts. Most enthusiasts are looking for short-term gains rather than having the required knowledge to hedge their bets over the long term. Most enthusiasts also get "bored" with their GPUs and always want something newer (like an iPhone consumerist).
As for me, I am a recovering consumerist. I'm looking to spend a decent amount of money on something which will grant me the best return on my investment. As it stands... nVIDIA doesn't fit the bill. Having worked in the industry, however, I understand that I am in the minority.
It's all about your personality. The choices you make reflect it. I'm not about to hate on anyone for their own choices (provided those choices don't end up hurting consumers).