Originally Posted by billbartuska
Did you look at "DETAIL A"?
"Seems all of the better cooler reviews use thermocouples in the IHS of the CPU to measure temps."
Intel and AMD have specs for a dummy CPU heat source (with an embedded thermocouple) for use in evaluating thermal solutions. Aftermarket heat sink manufacturers use these to evaluate their products, and I assume testing labs do as well. The problem is that you have to sign an NDA just to see the plans, much less buy one.
So, what's wrong with CoreTemp?
Not quite sure what you mean. The scale means that DETAIL A is drawn 20x larger than the rest of the drawing, so you can make out what the dimensions refer to. The only dimension there is 0.040", with no tolerance spec'd, but if you look at the title block of the main drawing, it says .xxx dimensions are +- 0.010", or +- 10 thou.
A ten-thousandth of an inch isn't 10 x 1 thou, it's 1 thou divided by 10, or 0.0001". Ten thousandths (plural) is 0.010"; a ten-thousandth (singular) is 0.0001".
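If it helps, here's the arithmetic spelled out (a throwaway Python sketch, nothing from the drawing itself):

[code]
# Machinist units: 1 thou (also called a mil) = 0.001 inch.
THOU = 0.001  # inches

tolerance = 10 * THOU        # title-block ".xxx" tolerance: +/- 0.010"
ten_thousandth = THOU / 10   # "a ten-thousandth": 0.0001"

print(f"+/- tolerance    = {tolerance:.4f} in")       # 0.0100
print(f"a ten-thousandth = {ten_thousandth:.4f} in")  # 0.0001
print(f"ratio            = {tolerance / ten_thousandth:.0f}x")  # 100x
[/code]

So mixing the two up changes the number by two orders of magnitude, which is why the wording matters.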
CoreTemp doesn't really work for me; I'm pretty sure my sensors are screwed. Comparing my temps at 4.05 GHz, 1.36 V on an E8400 (a pretty common overclock) with a Noctua NH-U12P, I get about 70°C load in an OCCT test and above 80°C with Intel Burn Test. Other people with that exact same setup and OC usually get 50-60°C load. So either the sensors on my chip are wrong, or they're right and the IHS isn't seated properly on my silicon. And this is after about 20 different TIM applications, with different methods and brands.
The more I think about it, the more I want to sand through the IHS to the silicon. RealTemp's sensor test shows that both cores track within a few tenths of a degree of each other through the entire load range. The only way that could happen with bad sensors is if both sensors are bad, and bad by exactly the same amount. Pretty unlikely.
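For anyone who wants to run the same check on their own chip, something like this would show how closely the two sensors track. It assumes you've saved a log with per-core temps; the file name and column names here are just placeholders, not anything RealTemp actually outputs:

[code]
# Rough check of how closely two core sensors track each other,
# given a CSV log with per-core temperature columns.
# (File name and column layout are made up for illustration.)
import csv

diffs = []
with open("realtemp_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        diffs.append(abs(float(row["core0_c"]) - float(row["core1_c"])))

print(f"samples:  {len(diffs)}")
print(f"max diff: {max(diffs):.1f} C")
print(f"avg diff: {sum(diffs) / len(diffs):.1f} C")
[/code]

If the max difference stays well under a degree from idle all the way to full load, both sensors would have to be wrong by the same offset for the readings to be bogus.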