I'll have to disagree with the above posters that running a chip at 5.0GHz will cause it to fail in two years max. Why overclocking damages a CPU is a bit more complicated.
Why does OCing damage a CPU?
Explaining this requires delving into a bit of Quantum Mechanics, but I'll keep it simple. You'll just have to take my word for some of the concepts though =P
Normally, electrons stay bound to their atoms and don't go wandering off. So in a CPU, they'll stay in one transistor and not move to others. However, at the quantum scale (the scale set by Planck's constant, 6.626 × 10^-34 J·s), an electron's position is described by a probability wave, and there's a small but nonzero chance of finding it on the far side of a barrier it classically couldn't cross. As a result, in a process known as quantum tunneling, an electron can pass straight through a thin insulating barrier and come out the other side; it's just very uncommon.
Now, a transistor in a CPU is made from alternating + and - doped and undoped silicon. Once in a while, an electron will escape and bury itself a couple of atoms deep into the material separating two transistors before coming back to its orbital, and with enough attempts, some eventually make it all the way through to the adjoining transistor.
Keep doing this and eventually an electron doesn't come back, but stays attached to an atom in the adjoining undoped section of silicon. Over time (usually years), this tunneling carves out a conductive path between two adjoining transistors and allows free electron flow.
This bypasses the "gates" between the transistors, and as a result the computer misreads the signal, producing an error. Open and closed gates are how a CPU determines whether something should be read as a 1 or a 0.
This process is called silicon degradation and eventually results in a complete CPU failure.
Now, as to where overclocking comes in.
If you know about electron orbital theory, the more energy an electron has, the more likely it is to leave its orbital and tunnel. I.e., if your CPU is running hot, or has a considerably higher voltage going through it, electrons tunnel in much higher numbers. As a result, the more you OC, the faster you drive the tunneling that causes silicon degradation.
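For the curious, the standard textbook (WKB) estimate of tunneling probability shows just how sensitive this is to the electron's energy. The barrier height (roughly the Si/SiO2 band offset) and the 2nm thickness below are illustrative round numbers, not figures from any actual CPU:

```python
import math

# Rough WKB estimate of the probability that an electron tunnels
# through a rectangular barrier: T ~ exp(-2 * d * kappa), where
# kappa = sqrt(2 * m * (V - E)) / hbar.
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # joules per electron-volt

def tunnel_probability(energy_ev, barrier_ev=3.1, width_m=2e-9):
    """WKB transmission estimate for an electron below the barrier."""
    if energy_ev >= barrier_ev:
        return 1.0  # classically over the barrier, no tunneling needed
    kappa = math.sqrt(2 * M_E * (barrier_ev - energy_ev) * EV) / HBAR
    return math.exp(-2 * kappa * width_m)

# More electron energy (hotter chip, higher voltage) -> far more tunneling.
low = tunnel_probability(0.5)
high = tunnel_probability(2.5)
print(f"T at 0.5 eV: {low:.3e}")
print(f"T at 2.5 eV: {high:.3e}")
print(f"ratio: {high / low:.3e}")
```

Because the probability is an exponential of the barrier math, a modest bump in electron energy multiplies the tunneling rate by many orders of magnitude, which is why heat and voltage matter so much.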
In addition, if you increase the voltage enough, you can actually physically destroy the silicon lattice of the gates within a processor. Think of a guy throwing a ball at a wooden door: at 20mph it'll probably just bounce off, but at 80mph it might break right through. Increasing voltage does a similar thing with electrons, shooting them through the CPU with greater force.
Now, on to OC and Heat
In a CPU, boosting frequency alone causes a very minor, almost insignificant heat increase.
It's the voltage increase that dramatically increases heat.
PD = power dissipation, in watts
V = voltage, in volts
F = frequency, in hertz
C = capacitance, in farads

Total PD = C x F x V^2
As C doesn't change (ok, it technically does, but for the sake of keeping the math simple we can assume it doesn't), heat output depends only on F and V.
If you actually plug in numbers and graph the function, the heat increase due to a frequency bump is minute compared to the heat increase from a voltage bump: power scales linearly with frequency but with the square of voltage.
Indeed, the more you increase the voltage, the less the F part of the equation matters to the total heat output.
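To see the trade-off concretely, here's a quick Python sketch of the formula. The capacitance and the clock/voltage numbers are made-up round figures for illustration, not measurements from any real chip:

```python
# Dynamic power P = C * F * V^2: frequency enters linearly,
# voltage enters squared.
def dynamic_power(capacitance_f, freq_hz, voltage):
    return capacitance_f * freq_hz * voltage ** 2

C = 1e-9  # effective switched capacitance in farads (illustrative)

base = dynamic_power(C, 3.0e9, 1.20)    # stock clocks and volts
faster = dynamic_power(C, 4.5e9, 1.20)  # +50% clock, same voltage
hotter = dynamic_power(C, 3.0e9, 1.47)  # same clock, +22.5% voltage

print(f"stock:          {base:.2f} W")
print(f"+50% clock:     {faster:.2f} W  (+{100 * (faster / base - 1):.0f}%)")
print(f"+22.5% voltage: {hotter:.2f} W  (+{100 * (hotter / base - 1):.0f}%)")
```

Note that a 22.5% voltage bump costs about as much power as a 50% clock bump, because 1.225 squared is about 1.5. That's the V^2 term at work.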
For real-world data, look at the power usage increases in Tom's i5 efficiency article.
Each bump was a constant 10MHz clock speed increase, but due to the quadratic contribution of voltage to PD, the graph is not linear, and power usage doesn't climb much until you start seeing large voltage increases.
Power usage directly translates into heat.
As for actual temps, it's more complicated than power dissipation alone:

CPU temperature = (Total PD in watts) x (HSF's thermal resistance in C/W) + (ambient temp in Celsius)
For comparison purposes, the resistance and ambient can be considered constant (technically not true, once again: resistance changes slightly with temp, and ambient rises with more heat output).
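The temperature formula plugs in just as easily. The 0.25 C/W heatsink resistance and 25C ambient here are illustrative values, not specs for any particular cooler:

```python
# CPU temp = power dissipated * thermal resistance of the HSF + ambient.
def cpu_temp(power_w, resistance_c_per_w=0.25, ambient_c=25.0):
    return power_w * resistance_c_per_w + ambient_c

print(cpu_temp(95))   # stock-ish 95 W load -> 48.75 C
print(cpu_temp(150))  # heavily overclocked 150 W load -> 62.5 C
```

Since the resistance and ambient stay (roughly) fixed, every extra watt from an OC translates directly into a fixed number of degrees, which is why the power curve above is what really matters.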
In your specific case, the answer is not so much what clock speed you reach while OCing, but how much extra voltage you'll need to attain it. If there is no voltage increase, the life of the CPU will be minimally impacted.
There is no easy way to tell how each chip is affected: due to imperfections in the manufacturing process, the degradation rate vs. voltage or frequency curve is unique to each chip.
However, the big factors are CPU temperature and voltage used, both of which increase the rate of electron tunneling. A 5.0GHz OC at below-ambient temps and, say, 1.4v will last much longer than a 5.0GHz OC at 80C and 1.5v.
To get some actual approximations, consider that Intel designs its CPUs to last ~10 years at the Tcontrol value (this was mentioned by Intel several years back; not sure if the current gen of CPUs is much different), which is the temperature they strive to keep the CPU at. This temp is MUCH higher than most enthusiasts will tolerate for their CPUs.
I'm not sure of the exact value for Sandy Bridge, but it's ~68C. It's somewhere in the thermal specifications datasheet if you want to dig for it.
Now, assuming you cool your CPU to the Tcontrol point, as long as there is no voltage increase, the lifespan will be minimally impacted. The more voltage you add, the shorter the lifespan. However, I doubt even a 1.5v-or-higher OC, if kept at the Tcontrol point, will last less than 5 years, based on what I've seen from Intel's own data. For most enthusiasts, 5 years is longer than they'll use a CPU.
For some more details, here's the PowerPoint of a Tcontrol presentation Intel gave a few years back.