Originally Posted by guyladouche
You can never improve a "bottleneck" (I'll rant on that term later) by increasing the demand on the hardware.
I hate the term bottleneck. You'll always have a limiting component regardless of what you're doing. At low-resolution gaming (think 800x600), the requirements are usually such that for each rendered frame, your graphics card can finish its work at that resolution faster than your CPU can finish what it needs to do--so your CPU is the limiting component.
Switch that to the other end--game at 1920x1080--and generally it's your CPU that can accomplish what it needs to before your GPU can, so the GPU becomes the limiting component.
But you cannot alleviate a bottleneck by increasing graphical demands, only (potentially) change which component is the limiting factor.
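The "limiting component" idea can be sketched with a toy model: whichever of the CPU or GPU takes longer per frame sets the frame rate. The millisecond figures below are made up purely for illustration, not benchmarks of any real hardware.

```python
# Toy model of the 'limiting component' idea: each frame takes as long as
# the slower of the CPU's and GPU's per-frame work. Numbers are illustrative.

def fps(cpu_ms, gpu_ms):
    """Frames per second when the slower component sets the frame time."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Low resolution: the GPU's (small) job finishes quickly, so the CPU limits FPS.
print(fps(cpu_ms=8.0, gpu_ms=4.0))   # 125.0 -- CPU-limited

# Raise the resolution: GPU work grows, CPU work stays the same.
# The "bottleneck" moves to the GPU, but FPS goes DOWN, not up.
print(fps(cpu_ms=8.0, gpu_ms=16.0))  # 62.5 -- now GPU-limited, and slower
```

Note how cranking the graphical demand only changed which component is the limiter; it didn't buy any frames.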
Also, keep in mind that it's an archaic assumption that your CPU is "less important" at higher resolutions and higher detail settings. That may have been the case 5-8 years ago given the graphical implementations of the time, but it's no longer true--CPU usage generally scales with resolution and with various game-engine post-processing methods. At best you'll get the same FPS at a higher resolution as at a lower one; more than likely you'll get much lower.
Actually, whether or not increasing gfx settings to load up the GPU helps 'alleviate' a CPU BN kinda depends on your own definition of the word. Certainly if one expects 'alleviate' to mean you magically get higher FPS, then you are 100% right--they are going to be disappointed in that regard. It doesn't work that way.
AFA your last point goes, I'll offer a small clarification ... it really shouldn't be the case that the CPU load for a game 'scales' (i.e. increases) very much with resolution. Increased resolution should really not affect the CPU load OTHER than potentially lowering it, by causing the GPU to work harder and hence run 'slower' in terms of FPS.
However, the point that there are often 'graphics' settings that, in actuality, increase the CPU load as much, or even more than, the GPU load, is one worth making for sure.
Just for one example offhand, the 'physics' quality setting, if a game has one, is very likely to impact CPU usage more than GPU usage, and there are numerous other settings one may find in various games that work like this as well.
Crysis 1, for example, has a number of settings that increase CPU usage when turned up, such that Crysis on all Low requires minimal CPU effort (i.e. it's easy to get 100+ fps), but Crysis on Very High can become CPU BN'd at much LESS than 100fps on the same gear ... so the general rule of increased settings = decreased CPU BN is not always reliable; it's just a guideline.
So whilst, at least by one definition of the term, it's definitely possible to 'alleviate' one's CPU BN (or, if you prefer, the phenomenon of the CPU being the limiting factor on FPS for at least some percentage of the total frames rendered) via increasing certain game quality settings, you do have to have some understanding of how the various gfx options impact both CPU and GPU load in order to do this effectively.
If you want a general guideline: increasing resolution and AA/AF levels for sure should not directly increase CPU load. Just about any other setting, though, depending on the game, MIGHT.
And one should not expect that 'alleviating a CPU bottleneck' means 'moar FPS', because it most certainly doesn't mean that. In fact, it's the opposite ... the way you get rid of a CPU BN is by increasing the GPU load, which makes the game run SLOWER ... thus the CPU has an easier time keeping up with the demands for information.

Edited by brettjv - 4/16/12 at 8:05am