The proper way to determine the degree to which the CPU is acting as a bottleneck is to run a benchmark at your current clocks, then OC the proc and run the benchmark again. Observe the % change in FPS that resulted from the CPU overclock, and divide this by the % change in clock speed.
If you observed a 10% FPS increase from a 10% increase in CPU clocks, then you have a large CPU bottleneck: probably all the frames rendered were CPU-limited, and/or those that were CPU-limited were severely so (keep in mind we're dealing with averages here).
If you got a 1% increase in FPS from a 10% increase in clocks, then you have a very small CPU bottleneck: one that was infrequent, and/or not severe, across all the frames rendered during the test.
Note that whatever level of bottleneck you measure, it describes the situation at the lower of the two CPU clocks.
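To make the arithmetic concrete, here's a minimal sketch in Python of the scaling calculation described above. The clock speeds and FPS figures are made-up examples, not real benchmark data:

```python
def cpu_bottleneck_ratio(base_clock_ghz, oc_clock_ghz, base_fps, oc_fps):
    """Return (% FPS gain) / (% clock gain): how fully the benchmark
    scaled with the CPU overclock. ~1.0 = heavily CPU-bound,
    ~0.0 = barely CPU-bound, measured at the lower clock."""
    clock_gain = (oc_clock_ghz - base_clock_ghz) / base_clock_ghz
    fps_gain = (oc_fps - base_fps) / base_fps
    return fps_gain / clock_gain

# Hypothetical: a 10% overclock (2.40 -> 2.64 GHz) that yields a 10%
# FPS increase (50 -> 55 FPS) -> ratio of 1.0, a large CPU bottleneck.
print(cpu_bottleneck_ratio(2.40, 2.64, 50, 55))    # ~1.0

# Hypothetical: the same 10% overclock yielding only a 1% FPS increase
# (50 -> 50.5 FPS) -> ratio of ~0.1, a very small CPU bottleneck.
print(cpu_bottleneck_ratio(2.40, 2.64, 50, 50.5))  # ~0.1
```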
But as has been observed upthread, overclocking the Q6600 does mostly nothing for the GPU score in 3DMark11, but does a lot to the GPU score in 3DMark06. The same sort of variance will apply to everything you ever run.
Ergo (there's that word again), whatever you observe will be entirely specific to the particular test at hand, i.e. which application (bench/game) you were running (which affects both the level of CPU dependency and the absolute workload on the GPU), and what settings you were running it at (which only affects the absolute workload on the GPU).
Does this all make sense?