Originally Posted by justanoldman
Question for anyone who can shed some light:
What causes more degradation to your CPU, voltage or heat? Obviously the two are related, but with the concept of delidding being introduced, it brings up the question.
When overclocking your chip you will eventually find the vCore that pushes your max core temp beyond the comfort zone, but if you delid you can all of a sudden send significantly more voltage to the CPU to reach a higher overclock, because your temps are now measurably lower.
Even though your temps are much lower, your voltage is higher. So you could have one chip, not delidded, at 1.3 vCore with 95°C temps under stress, and another chip at 1.4 vCore with 85°C max temps. Obviously I am just guesstimating those numbers, but hopefully my point is clear.
Which of the two scenarios degrades the chip more, the one with lower temps but higher voltage, or the one with lower voltage but higher temps? I am not saying either won't be fine, just wondering which does more damage in the long run.
I thought for a long time that vCore would degrade a CPU more than heat. Doing some research, I came across this article a few weeks ago:
The Truth About Processor "Degradation"
As soon as you concede that overclocking by definition reduces the useful lifetime of any CPU, it becomes easier to justify its more extreme application. It also goes a long way to understanding why Intel has a strict "no overclocking" policy when it comes to retaining the product warranty. Too many people believe overclocking is "safe" as long as they don't increase their processor core voltage - not true. Frequency increases drive higher load temperatures, which reduces useful life.
Conversely, better cooling may be a sound investment for those that are looking for longer, unfailing operation as this should provide more positive margin for an extended period of time.
The graph above shows three curves. The middle line models the minimum required voltage needed for a processor to continuously run at 100% load for the period shown along the x-axis. During this time, the processor is subjected to its specified maximum core voltage and is never overclocked. Additionally, all of the worst-case considerations come together and our E8500 operates at its absolute maximum sustained Tcase temperature of 72.4°C. Three years later, we would expect the CPU to have "degraded" to the point where slightly more core voltage is needed for stable operation - as shown above, a little less than 1.15V, up from 1.125V.
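The voltage-versus-temperature trade-off in the scenarios above can be sketched with Black's equation for electromigration, one common model of this kind of wear-out. This is a toy illustration, not the article's actual model: the activation energy, the voltage exponent, and the assumption that current density scales with vCore are all placeholder guesses.

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant, eV/K

def relative_mttf(vcore, temp_c, ea_ev=0.7, n=2.0):
    """Relative mean time to failure from Black's equation:
    MTTF ~ J**-n * exp(Ea / (k*T)).
    Current density J is crudely taken as proportional to vCore;
    ea_ev (activation energy) and n are illustrative guesses, not
    values for any real Intel process."""
    t_k = temp_c + 273.15  # convert to kelvin
    return vcore ** -n * math.exp(ea_ev / (K_B_EV * t_k))

# The two guesstimated scenarios from the question above:
stock = relative_mttf(1.30, 95.0)  # lidded: lower voltage, hotter
delid = relative_mttf(1.40, 85.0)  # delidded: higher voltage, cooler
print(f"delidded lasts {delid / stock:.2f}x as long under these assumptions")
```

With these particular placeholder constants the delidded chip (1.4V, 85°C) actually comes out ahead, i.e. the 10°C drop outweighs the extra 0.1V, but equally plausible constants can flip the result, so the exact ratio means nothing - only that both voltage and temperature feed into the lifetime exponentially.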
I'm not very technical, but I think this says that, over time, high temps will degrade a chip more than vCore does.
Glad I delidded and have good temps now.. lol

Edited by VonDutch - 1/11/13 at 12:18am