Originally Posted by Charcharo
The costs are much higher in Bulgaria. So I would be paying price premiums for cards with terrible price/performance metrics.
I give my old GTX 760 a lot of crap, but to be fair it still games REALLY well at 1080p. It's always above a console in either IQ or FPS (or both), while keeping the heavy-hitter pros of PC gaming.
And the old 5770 lasted me 6 years of playing even new AAA games as well (too bad most AAA games are bad, though).
Right, so like I said: if you can't afford to upgrade a top-of-the-line card each generation, it makes more sense to go with a mid-range card for the better price/performance ratio.
Originally Posted by superstition222
DX11 tests. The context of this topic is "DX12 and asynchronous shading".
Sorry, I forgot about all the DX12 games that AMD users are currently enjoying.
Point is that people here are comparing AMD cards like the 290X/390X to the 980 Ti in DX12, when the AMD cards were built to better utilize those features. AMD tries (and fails) to predict the market in both its CPU and GPU divisions. They are their own downfall and they don't even know how to play the market. As other posters have commented, "[Nvidia] get 5% or so advancement in performance and are all but abandoned when next gen releases on Nvidia side". That's how you play the game - like when Apple releases a new phone alongside a new OS that runs like crap on older phones, giving people more incentive to upgrade. I'm sure that's exactly what AMD wants: users keeping their old-ass cards instead of buying new ones from them.
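Since the thread topic is "DX12 and asynchronous shading", here's a minimal sketch of what the feature actually looks like at the API level (assuming you already have an ID3D12Device; error handling omitted). The app creates a separate compute command queue alongside the normal graphics queue, and whether work on the two queues actually overlaps on the GPU is up to the hardware and driver - which is exactly the GCN-vs-Maxwell difference being argued about in this thread.
[code]
// Minimal sketch: creating a dedicated async compute queue in D3D12.
// Assumes 'device' is an already-initialized ID3D12Device.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

ComPtr<ID3D12CommandQueue> CreateAsyncComputeQueue(ID3D12Device* device)
{
    // The graphics queue is D3D12_COMMAND_LIST_TYPE_DIRECT; a second queue of
    // type COMPUTE is where "async shading" work gets submitted. Whether that
    // compute work actually runs concurrently with graphics is up to the GPU.
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type     = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    desc.Priority = D3D12_COMMAND_QUEUE_PRIORITY_NORMAL;
    desc.Flags    = D3D12_COMMAND_QUEUE_FLAG_NONE;

    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));
    return computeQueue;
}
[/code]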
Originally Posted by f1LL
I don't think that better/worse support is the cause for this phenomenon. My interpretation is that it is due to the fact that AMD tries to build for future technology and Nvidia tries to max performance here and now. My guess would be that AMD cards are usually not utilized to their full potential at release because of hardware optimization for things to come, while Nvidia cards are already more or less min/maxed at launch.
Same with their CPUs being beasts at anything that can utilize 6-8 cores. Unfortunately, not a whole lot of applications use that many, so for real-world usage it ends up just being a slow CPU that gets eaten by Intel's offerings.
Edited by xxdarkreap3rxx - 2/26/16 at 10:01am