Originally Posted by sugarhell
Heat is the same as a titan/780. 30 watt difference means nothing. Why you believe temps=heat?
Thermodynamics tells us that energy cannot be created or destroyed. Therefore, if GPU A draws 30 W more power than GPU B, GPU A needs 30 W more cooling capacity and dumps an extra 30 joules of heat into its surroundings every second compared to GPU B.
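To put rough numbers on that (just a sketch; the 30 W delta is the example figure above and the session length is arbitrary):

```python
# Illustrative sketch: extra heat from a 30 W higher power draw over time.
# 1 watt = 1 joule per second, so the extra heat is simply power * time.

extra_power_w = 30            # assumed extra draw of GPU A over GPU B, in watts
gaming_session_s = 2 * 3600   # a hypothetical two-hour gaming session, in seconds

extra_heat_j = extra_power_w * gaming_session_s
print(f"Extra heat dumped into the room: {extra_heat_j / 1000:.0f} kJ "
      f"({extra_heat_j / 3_600_000:.2f} kWh)")
# -> Extra heat dumped into the room: 216 kJ (0.06 kWh)
```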
How hot the card runs is irrelevant to its power draw. While a higher-drawing card DOES require a more capable cooler, ASIC temperature by itself has no fixed correlation with power consumption. These factors are only loosely connected, and conclusions can only be drawn from how they inter-relate.
Higher power draw leads to:
More heat dispersed into the atmosphere regardless of cooler type
Higher operating temperatures, assuming the card it's being compared with uses the same cooler
Shorter life span for the device itself (though HOW much shorter, only AMD knows.)
Increased fan noise assuming the same cooler is used
Again, these statements only hold under ceteris paribus: the card will run hotter IF the same cooler is used on it and its competition, and IF it's in the same case/fan setup. A 10 watt Celeron runs hotter than my 300 W GTX 480s if you put Play Doh on the Celeron and waterblocks on my 480s. Just because one card runs hotter does NOT mean it draws more power, but it can be an indication that that's what's happening.
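A quick way to see both points is a simple steady-state thermal model, die_temp ≈ ambient + power × thermal_resistance. All the numbers below are made-up illustrations, not measured figures for any of these parts:

```python
# Rough sketch: die temperature from a simple thermal-resistance model.
#   die_temp (°C) ≈ ambient (°C) + power (W) * R_theta (°C/W)
# R_theta values here are invented for illustration only.

def die_temp(ambient_c, power_w, r_theta_c_per_w):
    """Steady-state die temperature for a given cooler thermal resistance."""
    return ambient_c + power_w * r_theta_c_per_w

AMBIENT = 25.0  # °C room temperature

# Same cooler (same R_theta) on both cards: the 30 W higher draw runs hotter.
print(die_temp(AMBIENT, 250, 0.15))  # card B, 250 W -> 62.5 °C
print(die_temp(AMBIENT, 280, 0.15))  # card A, 280 W -> 67.0 °C

# Different coolers: a low-power chip with an awful cooler ("Play Doh")
# runs far hotter than a high-power chip on a waterblock.
print(die_temp(AMBIENT, 10, 8.0))    # 10 W chip, terrible cooler -> 105.0 °C
print(die_temp(AMBIENT, 300, 0.10))  # 300 W chip, waterblock    -> 55.0 °C
```

So temperature alone tells you about the cooling as much as it does about the power draw, which is exactly why it can only hint at consumption rather than prove it.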