Originally Posted by Swag
I'm guessing you haven't seen the heat probes they use to accurately detect the temperature for heatsink benchmarks. They exist and are very common. The reason we say the software reading is inaccurate: do you see an actual wire on top of your CPU or GPU? No, it's read via software, and software is often inaccurate. Just like CPU-Z voltage readings are normally inaccurate, we use voltage meters to accurately see how much voltage is actually being delivered to the CPU.
Voltage measurement with a multimeter is different from measuring temperature with a heat probe. The points you measure at are accurate because they sit at a central point the current flows through, which is hard to match with software. Software (and the BIOS) also takes vdroop into account, while a multimeter shows the raw voltage being applied. Heat, on the other hand, is measured by the sensor inside the core/die, which is impossible to do externally. You could try to put a probe on the die, but how would you do that with Liquid Pro? You have to be precise when applying it, and squeezing a probe between the core and the block isn't a viable option when the probe itself is thicker than the thermal paste layer.
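For anyone curious what "reading it via software" actually boils down to: on Linux, the kernel exposes the on-die sensor through hwmon sysfs files, and monitoring tools essentially just read those files. Here's a minimal sketch (the exact file names vary by driver, so treat the paths as assumptions):

```python
import glob
import os

# Minimal sketch: on Linux the kernel exposes each die/package sensor as a
# hwmon "tempN_input" file (value in millidegrees C). Monitoring software is
# essentially just reading these files; the sensor itself lives on the die.
for hwmon in sorted(glob.glob("/sys/class/hwmon/hwmon*")):
    try:
        with open(os.path.join(hwmon, "name")) as f:
            chip = f.read().strip()  # e.g. "coretemp" for Intel CPUs
    except OSError:
        continue
    for temp_file in sorted(glob.glob(os.path.join(hwmon, "temp*_input"))):
        try:
            with open(temp_file) as f:
                millideg = int(f.read().strip())
        except (OSError, ValueError):
            continue
        print(f"{chip} {os.path.basename(temp_file)}: {millideg / 1000:.1f} C")
```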
What you probably mean is that the actual software like RealTemp or HWMonitor is not 100% accurate, and I can see that. However, the temperature reported by the sensor inside the CPU or GPU die is about as accurate as you can get. Putting a probe between the die and a copper block probably introduces about the same percentage of error as the inaccuracy of the program you use to read the temperature.
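If you want to put rough numbers on that, independent error sources combine as a root-sum-square. The figures below are made-up placeholders just to show the arithmetic, not specs for any particular sensor or probe:

```python
import math

# Toy numbers, purely illustrative: the on-die sensor and an external probe
# each have their own uncertainty, and independent errors add in quadrature.
sensor_error = 1.0    # assumed +/- 1.0 C for the on-die sensor itself
readout_error = 0.5   # assumed +/- 0.5 C for software readout/rounding
probe_error = 1.0     # assumed +/- 1.0 C for a contact probe under the block

software_total = math.sqrt(sensor_error**2 + readout_error**2)
print(f"on-die sensor via software: +/- {software_total:.2f} C")
print(f"external contact probe:     +/- {probe_error:.2f} C")
```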
To sum it all up, my 0.5 °C delta temperature is valid: at idle there is little heat coming from the GPU to be transferred into the water. Also remember that my 12-inch GTX 690 and its centimeter-thick copper water block have much more room to dissipate heat than a small CPU. At full load it is a different story, and you would need much higher-end gear to hold a 0.5 °C delta at 100% GPU or CPU load, but at idle it is possible, and I have shown that using Liquid Pro.
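As a rough sanity check on why a sub-degree delta is plausible at idle but not at load: the die-to-water delta is roughly Q x R_th (heat load times the block-plus-TIM thermal resistance). The power and resistance values below are illustrative assumptions, not my measured numbers:

```python
# Back-of-the-envelope sketch: die-to-water delta is roughly
#   delta_T = Q * R_th
# where R_th is the combined block + TIM thermal resistance.
# All numbers below are illustrative assumptions, not measurements.
idle_power_w = 25.0   # assumed heat dumped into the block at idle (W)
load_power_w = 250.0  # assumed heat dumped into the block at full load (W)
r_th = 0.02           # assumed block + Liquid Pro resistance (C per W)

print(f"approx die-to-water delta at idle: {idle_power_w * r_th:.2f} C")
print(f"approx die-to-water delta at load: {load_power_w * r_th:.2f} C")
# With these numbers idle lands around 0.5 C while full load is around 5 C,
# which is why the 0.5 C figure only holds at idle.
```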