If you game, the benefit will vary with your resolution and your GPU setup.
The lower the resolution, the more CPU-dependent any 3D application will be; the higher the resolution, the more stress is put on the GPUs.
If you have a tri-SLI GTX 580 setup, for example, it is a MUST to overclock your CPU as far as you can.
On normal systems at normal resolutions, you usually see the best gains not in average FPS but in minimum FPS (don't quote me here, I might be completely wrong, as something in the back of my head says it might be the high FPS numbers).
Apart from that, you can see the results in everyday Windows usage: converting an album from FLAC to OGG, for example, or converting a huge MKV movie into a format your iPod supports. If you zip and unzip files. If you have four virtual machines running, if you run simulations like I do all the time for college work, if you work with images in MATLAB, CAD, or Photoshop, etc., etc., etc.
So, you overclock because it doesn't really draw much more energy: the CPU will sit idle most of the time, and at idle there isn't a dramatic difference in power draw.
Because everything is going to perform faster, unless it's bottlenecked by something else.
Because you can do everything I mentioned above at the same time with better results, AKA multitasking will show huge improvements.
Because if you are somebody like me who likes overclocking: to get the performance of a stock 2600K, I would rather buy a cheaper CPU and overclock the hell out of it.
etc, etc, etc
This is my opinion; I hope everybody appreciates and respects it.