Originally Posted by se7en56
It takes a huge OC to affect a video card. Did for me anyways... they're fast anyways, so making them faster is hard... and it's not a linear improvement either... kind of like a logarithmic one
Not to be nit-picky, but technically "logarithmic" is the right word for what you're describing: a logarithmic curve grows ever more slowly than a linear one, so it captures diminishing returns exactly. (It wouldn't be better than linear; log(x) always lags behind x.)
I would also point out that whilst it's quite rare, there ARE conditions where you will literally see performance scale exactly linearly with an overclock. It all comes down to whether raw clock speed is the sole limit on performance in that particular scenario. As I say, this is rare, but entirely possible, and there's a quick way to check it on your own card, sketched below.
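Here's a rough back-of-the-envelope check (a minimal sketch; the clocks and frame rates below are invented for illustration, not measured): divide your FPS gain by your clock gain. A result near 1.0 means clock speed was the whole story; noticeably under 1.0 means something else is holding the card back.

def scaling_efficiency(clock_before, clock_after, fps_before, fps_after):
    # Ratio of performance gain to clock gain:
    # ~1.0 -> clock-bound (linear scaling), <1.0 -> some other bottleneck
    return (fps_after / fps_before) / (clock_after / clock_before)

# Hypothetical run: a 10% core overclock yields a 9.5% FPS gain
print(scaling_efficiency(600.0, 660.0, 60.0, 65.7))  # ~0.995, near-linear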
Whether you hit linear scaling isn't 100% card-dependent, but with certain cards it's far easier to achieve than with others. The 8800GTX makes a good example, since it has a very wide memory bus, lots of VRAM, lots of shaders, and lots of ROPs. IOW, the 'other things' that COULD be limiting factors are pretty damn powerful, and thus unlikely to be the actual bottleneck. That leaves clock speed as the primary lever on performance, and the 8800GTX does in fact exhibit very close to linear scaling in a lot of test scenarios.
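By way of illustration (the measured FPS here is made up; 575MHz is the stock 8800GTX core clock): push the core to 640MHz, an ~11% bump, and purely linear scaling predicts ~11% more frames.

expected = 50.0 * (640.0 / 575.0)  # linear prediction from a 50 FPS baseline: ~55.7 FPS
actual = 55.0                      # hypothetical measured result
print(actual / expected)           # ~0.99 -> very close to linear scaling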