Arni's explanation of a CPU bottleneck is spot-on. The reason that lowering the resolution increases the likelihood of a CPU BN is simply that the FPS is higher at lower resolutions (because the gpu has an easier time with a lower res).
Ergo, at lower res, the CPU is expected to process more information in a shorter amount of time, increasing the probability that it will be the limiting factor to performance.
Also, the more cards you add to a CrossFire setup, the more additional CPU overhead is required simply for managing the output of more cards. It's not an insignificant change in workload for the CPU. However, I'd guess that managing the cards is not a multi-threaded process, so it probably adds a bunch of workload to just one 'core'. And if any core gets maxed out, you can get BN'ing.
Also, his comment about single-threaded performance is generally very spot-on. There are VERY few games where you'd expect to ever see anywhere near 100% usage (esp. not on an 8 or 12 thread CPU) before a CPU BN kicks in. Most games only use two cores WELL, so in general you could easily start to get a CPU BN with only 25% usage on a quad with HT (two fully loaded cores out of eight threads = 25% overall). However, BF3 is one of those very few games that's coded to use a lot of cores pretty nicely, so it's a bit of an outlier.
Lastly, people put WAY too much stock in CPU usage in Task Manager, and not nearly enough in doing the simple test of increasing the CPU speed by a certain %, and then observing the % change in FPS that results from it, and comparing those two %'s. The closer they are to equal, the more CPU BN'd you are at the lower of the two CPU clocks.
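To make that clock-scaling test concrete, here's a quick sketch of the comparison I mean. All the numbers in it are hypothetical examples, not real benchmarks; plug in your own before/after measurements:

```python
def cpu_bound_ratio(clock_before_ghz, clock_after_ghz, fps_before, fps_after):
    """Compare the % gain in CPU clock against the % gain in FPS.

    A ratio near 1.0 means FPS scaled almost 1:1 with the clock,
    i.e. you were heavily CPU-bottlenecked at the lower clock.
    A ratio near 0.0 means the extra CPU speed bought you nothing,
    so something else (usually the GPU) is the limiter.
    """
    clock_gain = (clock_after_ghz - clock_before_ghz) / clock_before_ghz
    fps_gain = (fps_after - fps_before) / fps_before
    return fps_gain / clock_gain

# Hypothetical example: overclock 3.4 GHz -> 4.08 GHz (+20%), FPS 60 -> 71 (+18.3%)
print(round(cpu_bound_ratio(3.4, 4.08, 60, 71), 2))  # -> 0.92, heavily CPU-bound

# Hypothetical example: same overclock, FPS barely moves (60 -> 61)
print(round(cpu_bound_ratio(3.4, 4.08, 60, 61), 2))  # -> 0.08, not CPU-bound
```

Same idea as eyeballing the two percentages by hand, just written out so the comparison is explicit.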
It's also important to note that the Task Manager and similar tools do NOT provide a 'bare-metal' analysis of what's ACTUALLY going on at the individual core level. What you see in such graphs is what the operating system is TRYING to do with the load. The CPU, however, has its own microcode that makes threading decisions, and as such, it doesn't function as a perfect little slave of Windows. The chip can and will override the OS's 'suggestions' (i.e. what you see in TM), and this will be invisible to you.
This is why it's so common to see a game APPEARING to have activity on four cores, but then when you coalesce all the usage onto one graph, you discover that only around 25% of your 8-thread CPU is being used in total, and it never goes above that (save for OS usage spikes or the like).
Often in this case, proper testing reveals that you are actually CPU BN'd at a mere 25% of usage. That's because, at the actual CPU level, only two cores are being used. The CPU is simply overriding what Windows is 'asking it to do', based on its internal threading decision processes (and game code itself plays a big role in those decisions ... what Windows would 'like' gets disregarded).
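Here's the 'coalesce it onto one graph' arithmetic spelled out, with made-up per-thread numbers purely for illustration:

```python
# Hypothetical per-thread usage samples (%) as TM might show them on an
# 8-thread CPU. Four threads LOOK active, but the total tells the real story.
per_thread_usage = [55, 45, 48, 52, 0, 0, 0, 0]  # invented example numbers

# Overall usage is just the average across all hardware threads.
total_pct = sum(per_thread_usage) / len(per_thread_usage)
print(f"Overall CPU usage: {total_pct:.0f}%")  # -> 25%

# That 25% is exactly two cores' worth of work on an 8-thread chip,
# even though the per-thread graphs made it look like four were busy.
```

Which is why the overall % is the number worth watching, not the pretty per-core squiggles.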
So you should never place too much stock in the individual core usage you see in TM, as it really means next to nothing. The overall usage %, however, is relatively useful, once you determine how many cores/threads a given game is actually coded for.

Edited by brettjv - 3/14/12 at 11:11am