You can never fix a "bottleneck" (I'll rant on that term in a second) by increasing the demand on the hardware.
I hate the term bottleneck. You'll always have a limiting component regardless of what you're doing. At low-resolution gaming (think like 800x600), the requirements are usually such that for each rendered frame, your graphics card can finish its work for that resolution faster than your CPU can finish what it needs to do--so your CPU is the limiting component.
Switch that to the other end--game at 1920x1080, and generally it's your CPU that can accomplish what it needs to before your GPU can--so the GPU becomes the limiting component.
But you cannot alleviate a bottleneck by increasing graphical demands, only (potentially) change which component is the limiting factor.
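To make that concrete, here's a toy model of the idea (the millisecond figures are made-up illustration numbers, not measurements from any real game): each frame takes as long as the slower of the two components, so raising resolution only grows the GPU's share of the work--it can change which component is the limit, but it can never raise your FPS.

```python
def fps(cpu_ms, gpu_ms):
    # A frame isn't done until BOTH components finish their work,
    # so frame time is set by whichever one is slower.
    return 1000.0 / max(cpu_ms, gpu_ms)

# Low res (hypothetical numbers): GPU finishes in 5 ms, CPU needs 10 ms
# -> CPU-limited, 100 FPS.
low_res = fps(cpu_ms=10.0, gpu_ms=5.0)

# High res: GPU work grows to 20 ms, CPU work unchanged
# -> now GPU-limited, 50 FPS.
high_res = fps(cpu_ms=10.0, gpu_ms=20.0)

print(low_res, high_res)
```

Notice the increased demand only moved the limit from one component to the other--the frame rate went down, never up.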
Also, keep in mind that it's an archaic assumption that your CPU is "less important" at higher resolutions and higher detail settings. That may have been the case 5-8 years ago given how games were implemented graphically, but it's no longer true--CPU usage generally scales with resolution and with various game-engine post-processing methods. At best you'll get the same FPS at a higher resolution as at a lower one; more than likely you'll get much lower.

Edited by guyladouche - 4/13/12 at 3:55pm