Originally Posted by jason387
Most games nowadays are never well optimized. A GPU is capable of a lot, but the raw power needs to be harnessed. Moreover, games should come out with some consistency. By that I mean a balance of CPU and GPU usage. Many games lean on the CPU or GPU more in certain levels, and fps fluctuates. DX11 at its best is still yet to be seen. The best I've seen it is in benchmarks, never in games.
You don't have a background in programming or game development, do you?
It is very hard to optimize a game when you have to account for 8+ GPU architectures, different core counts, different memory widths/sizes/speeds, orders of magnitude of performance scaling, 7+ x86 CPU microarchitectures, different memory controllers, etc.
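To put some rough numbers on that: even with small, purely illustrative counts for each hardware axis (the figures below are assumptions, not a survey of actual SKUs), the test matrix a PC developer would have to "balance" against explodes multiplicatively.

```python
# Back-of-the-envelope: how fast the PC hardware test matrix grows.
# All counts below are illustrative assumptions, not real market data.
gpu_architectures = 8       # "8+ GPU architectures"
cpu_microarchitectures = 7  # "7+ x86 CPU microarchitectures"
memory_configs = 4          # assumed width/size/speed combinations
core_counts = 5             # assumed: e.g. 2, 4, 6, 8, 12 cores

combinations = (gpu_architectures * cpu_microarchitectures
                * memory_configs * core_counts)
print(combinations)  # -> 1120 distinct hardware combinations
```

And that still ignores driver versions, OS versions, and overclocks, each of which multiplies the total again.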
Why would any PC developer attempt to balance the CPU and GPU usage when these vary? Besides, overclocked CPUs have not been the bottleneck in most games for years now.
Originally Posted by Dromihetes
Another useless version, like DX 10.1 or even DX 10, which only started seeing real use after 2-3 years had passed.
Just another way to promote the failed Windows 8.1 point release.
AMD will not promote it until they can actually support it, so almost no games will use it before then.
The benefits in these minor releases are mostly low-level implementation details, transformations and optimizations under the hood. Most users won't notice the difference.