Originally Posted by cdoublejj
I think this may be relevant, but I'm not entirely sure: http://en.wikipedia.org/wiki/Quantum_tunnelling
My friend was reading on some forums way back in the day and read about something called "burn in", which essentially was, or was related to, quantum tunneling. They used it to get unstable cards stable: they would overclock a video card (a specific model, I believe) until it started artifacting somewhat badly, then run heavy benchmarks and games non-stop for about a year, and eventually the artifacts would slowly go away as the pathways in the chip wore down.
The simplest way to describe it is breaking in a new engine, when the rings and bearings begin to wear down slightly and seat themselves.
Firstly, I too think that this is false. What happens when you run a card at high temperatures and voltage is degradation of all the pathways in the chip. That generally means the narrowing of some connections and the widening of others due to metal migration (electromigration). The result is the destruction of some connections, while new connections might be formed elsewhere (buildup of metal widening a trace until it touches a neighboring wire, for example), generally causing the chip to fail. I don't see how it could possibly cause the chip to stop artifacting. Artifacting is caused by incorrect calculations, resulting from individual transistors failing to switch fast enough and/or the entire operation failing to clock in time, "in time" meaning before the next instruction comes around.
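To put "in time" in concrete terms: a synchronous chip only computes correctly if every signal settles within one clock period. Here's a minimal sketch of the standard setup-time constraint (textbook digital-design notation, not anything GPU-specific):

```latex
% Setup constraint for one pipeline stage: the clock period must cover
% the flip-flop clock-to-Q delay, the worst-case combinational logic
% delay, and the setup time of the capturing flip-flop.
T_{\text{clk}} \;\ge\; t_{cq} + t_{\text{logic,max}} + t_{\text{setup}}
\qquad\Longrightarrow\qquad
f_{\max} \;=\; \frac{1}{t_{cq} + t_{\text{logic,max}} + t_{\text{setup}}}
```

Overclocking shrinks T_clk below that bound on the slowest paths first, which is exactly when wrong values get latched and artifacts show up.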
Now, regarding leakage and overclocking. Leakage is current that flows through a transistor that is not currently switching; transistors "should" only draw current while they switch. Leakage can also be current that escapes from the transistor through the substrate to the ground or power rails. The primary effect of leakage is to increase power consumption and heat, so as far as I know that is its only effect on overclocking. Leakage does get worse as voltage increases, however. As for the limits of overclocking, I'm not sure that leakage current has much to do with that.
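As a rough illustration of the "leakage mostly means heat" point, total chip power is commonly modeled as dynamic switching power plus static leakage power. A minimal sketch in Python; every constant here is a made-up illustrative number, not measured GPU data:

```python
# Toy CMOS power model: total power = dynamic switching + static leakage.
# All constants are illustrative assumptions, not real GPU figures.

def chip_power(v_core, f_hz, c_eff=60e-9, i_leak_ref=20.0, v_ref=1.0, k=3.0):
    """Return (dynamic_W, leakage_W) at a given core voltage and clock."""
    p_dyn = c_eff * v_core**2 * f_hz            # P_dyn = C_eff * V^2 * f
    i_leak = i_leak_ref * (v_core / v_ref)**k   # leakage rises steeply with V
    p_leak = v_core * i_leak                    # burned even when nothing switches
    return p_dyn, p_leak

for v in (1.00, 1.10, 1.20):
    dyn, leak = chip_power(v, 1.5e9)
    print(f"{v:.2f} V: dynamic {dyn:5.1f} W, leakage {leak:5.1f} W")
```

The point of the sketch: raising voltage for an overclock inflates both terms, and the leakage term grows even though it does no useful work, which is why a leaky chip runs hot rather than slow.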
Suppose you have a chip with high leakage. If that chip was binned into a product with a lower stock clock, despite being stable (but hot) at a higher frequency, then you'll have a chip with better maximum overclock potential, even if it gets quite hot. So I can see where people could get the idea that leakage results in better overclocking margins, but that's only relative to the stock clock. Leakage will not affect the maximum overclock of a chip (disregarding heat), as far as I'm aware.
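A quick arithmetic sketch of that binning argument, with made-up numbers (hypothetical frequencies, not real SKUs):

```python
# Two hypothetical bins of the same die: the silicon tops out at the
# same absolute frequency either way; only the stock clock differs.
silicon_fmax = 950  # MHz, assumed max stable frequency of this die

for stock_mhz in (700, 900):
    headroom = silicon_fmax - stock_mhz
    print(f"stock {stock_mhz} MHz -> +{headroom} MHz "
          f"({headroom / stock_mhz:.0%} overclocking headroom)")
```

The lower-binned part looks like a great overclocker (+36%) while the higher bin looks mediocre (+6%), yet both stop at the same 950 MHz wall.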
This is just my understanding of things; I'd be happy to be corrected.