Originally Posted by FIX_ToRNaDo
This is the way I understand the subject.
Every piece of computer hardware runs at a "designed" wattage. In an ideal world, where electrical components had zero resistance to the flow of electrons (a.k.a. electricity), they would dissipate no heat at all and would therefore require zero watts to run. In reality all metals have resistance, because their atoms are never arranged in a perfectly regular crystal structure, so electrons travelling through the material keep colliding with atoms and other particles and produce heat, which is then transferred from the material to the surrounding fluid, the air.
The temperature measured at the chip is a snapshot of its thermal state at that moment, and it depends on how efficiently the cooling system removes the heat. If a CPU dissipates 65W of heat at its peak, it will always put out 65W at peak load, regardless of the cooling device it relies upon. A good cooler absorbs that heat quickly and dumps it into the surrounding environment; a poor one lets heat build up, so the measured temperature climbs. Either way the heat output is the same 65W. If the cooler can't move the heat out fast enough, the buildup will reach hazardous temperatures and might destroy the chip.
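To put some rough numbers on that heat-buildup point: at steady state the chip temperature is roughly ambient temperature plus dissipated power times the total thermal resistance of the cooling path (die -> heatsink -> air). Here's a quick back-of-the-envelope sketch in Python; the 65 W figure is the one from the quote above, and the thermal resistance values (0.3 C/W for a decent cooler, 1.0 C/W for a weak one) are made-up illustrative numbers, not specs for any real cooler:

# Rough steady-state thermal model: same heat output, different coolers.
# T_chip ~= T_ambient + P * R_thermal, where R_thermal is the total
# thermal resistance (degrees C per watt) from the die to the surrounding air.

def chip_temperature(power_w, ambient_c, r_thermal_c_per_w):
    """Estimate steady-state chip temperature for a given cooling path."""
    return ambient_c + power_w * r_thermal_c_per_w

POWER_W = 65.0      # the chip puts out this much heat at full load (from the quote)
AMBIENT_C = 25.0    # assumed room / case air temperature

# Illustrative thermal resistances (made-up values, not real cooler specs)
coolers = {
    "good cooler (0.3 C/W)": 0.3,
    "weak cooler (1.0 C/W)": 1.0,
}

for name, r in coolers.items():
    t = chip_temperature(POWER_W, AMBIENT_C, r)
    print(f"{name}: chip settles around {t:.0f} C while dumping {POWER_W:.0f} W")

# Prints roughly 45 C for the good cooler and 90 C for the weak one --
# the heat leaving the chip is identical, only the operating temperature differs.

Same 65 W either way; the cooler only decides what temperature the chip settles at while it sheds that heat.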
Which is why I stated that AMD can't create their own physics.
Sure, they created a processor that runs optimally at 95C. Riiiiiiiiiiiiiiiiight.
More likely, they don't want to spend money on a decent cooler. Wait a year, then see how many people are RMAing 290Xs because the solder joints under the GPU failed.