Quote:
Originally Posted by Chunin

My point isn't whether the voltage I have to use for a given OC is better or worse than someone else's, but the jumps in voltage you see when trying to go for higher clocks. I mean, at which point does it stop being worth it to OC more? If you have to add another 0.015 V or more to get just a 100 MHz boost, you are probably at 1.4 V or so and your temps start getting extremely high. I'd say that for most people 4.5 GHz is the sweet spot between the performance gained and the heat generated by the added voltage.

If you can't cool the CPU well, OC'ing ISN'T worth it at all.
When you have sufficient cooling and can keep the voltage under control, you are golden.
The real-world gain comes down to the intended purpose of the overclock itself. If you are overclocking for 10 more frames in a game, the extra heat and voltage aren't worth it imo. When you overclock because you are rendering and the actual process can be sped up through higher clocks, then yes, the heat and extra voltage are worth it.
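To put some rough numbers on the diminishing returns: dynamic CPU power scales roughly with V² × f, so a small voltage bump costs heat faster than the clock bump gains performance. The figures below are purely illustrative (a hypothetical 4.5 GHz / 1.30 V baseline), not measurements from any real chip:

```python
# Rough CMOS dynamic-power approximation: P is proportional to V^2 * f.
# Baseline numbers are illustrative assumptions, not measured values.

def relative_power(volts, mhz, base_volts=1.30, base_mhz=4500):
    """Power relative to a 4.5 GHz / 1.30 V baseline (P ~ V^2 * f)."""
    return (volts / base_volts) ** 2 * (mhz / base_mhz)

# A 100 MHz bump that costs an extra 0.015 V:
perf_gain = 4600 / 4500 - 1                    # more clock
power_gain = relative_power(1.315, 4600) - 1   # more heat

print(f"clock gain: {perf_gain:.1%}, power gain: {power_gain:.1%}")
# → clock gain: 2.2%, power gain: 4.6%
```

So that last 100 MHz buys about 2% more clock for roughly twice that in extra heat, which is why the "sweet spot" argument holds.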

What it comes down to is your comfort level. When you aren't comfortable pushing an extra 0.030 V, that's when you shouldn't be overclocking.