Originally Posted by brettjv
1. The games you listed are all pretty CPU-intensive so upgrading the CPU will likely help on those particular games.
2. However, a consistent 90-99% GPU usage, esp. on 2 high-end cards, is actually quite good.
3. You likely won't always get 99% GPU usage no matter what your CPU is.
Which is to say, <99% GPU usage is often, but is by no means always, due to an 'external' bottleneck like the CPU. Remember that your video card is not a monolithic entity. It's actually nearly an entire other PC ... within your PC.
When you measure 'GPU usage', it is in many ways only an approximation. It's a measurement at one point of the rendering chain, the one most LIKELY to act as the 'limiting factor' to performance, but there are many OTHER points in the rendering chain (in the card as a whole) that can act as a bottleneck and lower your 'usage %': maxing out the capacity of your ROPs, your memory bandwidth, your memory amount (if not 'enough'), your shaders, your texture units, etc. IOW, there's no such thing as measuring the 'usage' of your entire card.
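You can actually see this "one point in the chain" effect on NVIDIA cards: `nvidia-smi --query-gpu=utilization.gpu,utilization.memory --format=csv` reports the GPU core and the memory controller as separate counters. A minimal sketch of reading that output (the sample values below are hypothetical, not from a real card — a run where the core looks 'under-used' while memory bandwidth is pegged):

```python
import csv
import io

# Hypothetical output from:
#   nvidia-smi --query-gpu=utilization.gpu,utilization.memory --format=csv
# Real values depend entirely on your card and workload.
sample = """utilization.gpu [%], utilization.memory [%]
62 %, 97 %"""

# Parse the CSV the same way you would parse nvidia-smi's stdout.
row = next(csv.DictReader(io.StringIO(sample), skipinitialspace=True))
gpu = int(row["utilization.gpu [%]"].rstrip(" %"))   # core utilization
mem = int(row["utilization.memory [%]"].rstrip(" %"))  # memory-controller utilization

# Core reads 62% (looks like headroom), but the memory controller is
# at 97% -- bandwidth is the real bottleneck here, not the 'GPU usage'.
print(gpu, mem)
```

Point being: a tool that only shows you the 62% would have you blaming the CPU, when the card itself is the limiter at a point that number doesn't capture.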
Oftentimes, <99% GPU usage just reflects poor coding (remember the poor optimization of Skyrim when it came out, causing a massive CPU bottleneck in certain spots like Whiterun?), or a bad driver interaction with that coding, or a bottleneck in a part of the card that is NOT the point at which you're taking the 'GPU usage' measurement.
Lastly, remember that CPU usage measurements are probably even worse in terms of accuracy. They're not a 'bare metal' real-time measurement; they're more like a software-based (i.e. Windows-level) approximation of usage.
The OVERALL usage is reasonably accurate, but what you (believe you) see 'happening' on each core means jack squat. And for even the overall usage to mean something, you have to know how well the game code scales across multiple threads before you can (maybe) reckon whether your CPU is acting as the bottleneck.
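The arithmetic behind that caveat is worth spelling out: a game with one saturated main thread on a multi-core CPU reports low overall usage even though it's completely CPU-bound. A quick sketch (the 8-core count is just an example):

```python
# One fully-loaded game thread on an 8-core CPU: the 'overall usage'
# number averages that 100% core across all cores, so a totally
# CPU-bottlenecked game can read as only ~12.5% CPU usage.
cores = 8
saturated_threads = 1
overall_usage = 100.0 * saturated_threads / cores
print(overall_usage)  # 12.5
```

So "CPU usage is low, therefore the CPU isn't the bottleneck" is exactly the wrong conclusion for poorly-threaded games.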
A good way to see if you're being CPU BN'd is to go to a particular area of a game where you consistently and reliably see <99% usage, and just stand there while running Fraps for a minute or so, then check your FPS afterwards. Then OC your CPU and do the exact same test. Does your FPS rise with the CPU clock? What's the ratio in terms of % change in FPS / % change in CPU clock?
The more that ratio approaches 1, the more heavily you're CPU BN'd. If there's no increase in FPS, then your CPU isn't the issue. But remember this test ONLY applies to that exact spot in that exact game, at those exact settings. It's not in any way 'universal'. Within the SAME game, though, it's somewhat reasonable to assume that all instances of <99% usage are being caused by the same thing.
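The ratio test above is simple enough to put in a few lines. A sketch with made-up numbers (function name and the 4.0 → 4.4 GHz / 60 → 65 fps figures are hypothetical, just to show the math):

```python
def cpu_bottleneck_ratio(fps_base, fps_oc, clock_base_ghz, clock_oc_ghz):
    """% change in FPS divided by % change in CPU clock.

    Near 1.0 -> heavily CPU-bottlenecked (FPS scales with clock).
    Near 0.0 -> the CPU isn't the limiting factor at this spot.
    """
    fps_gain = (fps_oc - fps_base) / fps_base
    clock_gain = (clock_oc_ghz - clock_base_ghz) / clock_base_ghz
    return fps_gain / clock_gain

# Hypothetical run: OC from 4.0 to 4.4 GHz (+10% clock),
# FPS in the test spot goes from 60 to 65 (+8.3%).
print(round(cpu_bottleneck_ratio(60, 65, 4.0, 4.4), 2))  # 0.83
```

A 0.83 ratio like that would say you're mostly CPU-bound in that spot; the same +10% OC yielding 60 → 60.5 fps (ratio ~0.08) would say the CPU is basically irrelevant there.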