# CPUs, TDPs and digital circuits

294 Views 6 Replies 3 Participants Last post by Hephasteus
I'm curious about TDP (Thermal Design Power) ratings on CPUs. Let's say that a CPU has a TDP of 100W - how much electricity does that represent?

I know that an analogue audio amplifier runs at around 50% efficiency, so 100W input means 50W output. A digital audio amp runs at around 90% efficiency (last time I checked), so 100W input gives a 90W output.

So, what's the efficiency rating with a fully loaded CPU? Does 90W TDP mean 100W of input power, or 180W of input power?
1 - 7 of 7 Posts
Unfortunately.... TDP is not a simple value. For instance, AMD and Intel calculate TDP completely differently. In addition, processor efficiency varies depending on the work being performed. Furthermore, TDP is not the maximum power draw either. Basically, the answer is: it varies....
A watt's a watt. You can convert it into calories, BTUs, or joules. 100 watts for one hour is about 341 BTU. A 746-watt computer is like running a 1-horsepower engine all the time.

The sun, at one solar constant, delivers about 1000 watts per square meter, or roughly 3400 BTU per hour per square meter.
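The unit conversions above can be sketched in a few lines. This is just illustrative arithmetic using the standard conversion factors (1 BTU is about 1055.06 J, 1 hp is about 746 W); the round numbers in the post are approximations of these.

```python
# Illustrative unit conversions behind the figures quoted above.
JOULES_PER_BTU = 1055.06  # standard conversion factor

def watt_hours_to_btu(wh: float) -> float:
    """Convert energy in watt-hours to BTU."""
    return wh * 3600 / JOULES_PER_BTU

# 100 W running for one hour is 100 Wh:
print(round(watt_hours_to_btu(100)))  # ~341 BTU

# 1000 W/m^2 of solar flux, expressed as BTU per hour per square meter:
print(round(watt_hours_to_btu(1000)))  # ~3412 BTU/hr
```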

A 90 W TDP is the absolute maximum heat the CPU can generate. Normally that translates to about 70 watts under heavy load. The CPU is powered from the AC line through a switching power supply, which has a power factor. At a 0.85 power factor, a 70-watt CPU load means the supply draws about 82 watts from the line. The CPU itself only consumes 70 watts; the extra 12 watts have to be thermally dissipated by the power supply, while the CPU side has to disperse about 239 BTU per hour of heat.

How well you disperse that heat depends on the thermal resistance of the heat sink, the radiating and absorbing characteristics of the local environment, and the convective flows of air, water, or whatever cooling medium is used. That determines how the system establishes its steady-state heat flows to reach balance with the environment.
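The arithmetic in the reply above can be sketched as follows. The 70 W load and 0.85 power factor are the post's assumed figures, not measurements; note that what the power factor inflates is, strictly speaking, apparent power in VA rather than real watts.

```python
# Sketch of the numbers above: a CPU drawing 70 W real power through a
# supply with an assumed 0.85 power factor.
JOULES_PER_BTU = 1055.06

real_power_w = 70.0   # assumed CPU load under heavy use
power_factor = 0.85   # assumed supply power factor

# Apparent power drawn at the supply (VA), per the post's reasoning:
apparent_power_va = real_power_w / power_factor  # ~82.4

# Heat the CPU itself must shed, in BTU per hour:
cpu_heat_btu_per_hr = real_power_w * 3600 / JOULES_PER_BTU  # ~239

print(round(apparent_power_va, 1), round(cpu_heat_btu_per_hr))
```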

But CPUs are little power supplies themselves, in effect supplying current to many other parts of the computer, so the total heat flow is spread out a bit. But once things start being integrated tightly into the CPU, the total thermal output a CPU can handle will drop significantly, since it won't be able to rely on external flows much beyond I/O chips and memory subsystems.
Quote:
 Originally Posted by Hephasteus A watt's a watt. You can convert it into calories, BTUs, or joules. 100 watts for one hour is about 341 BTU. A 746-watt computer is like running a 1-horsepower engine all the time. The sun, at one solar constant, delivers about 1000 watts per square meter, or roughly 3400 BTU per hour per square meter.
That's not his question.... no machine performing work is 100% efficient.
Quote:
 Originally Posted by DuckieHo That's not his question.... no machine performing work is 100% efficient.
All heat machines are 100 percent efficient. Heat is the end result of power, work, whatever. Heat is the bottom line, so you have to understand heat before understanding work or power.
Quote:
 Originally Posted by DuckieHo That's not his question.... no machine performing work is 100% efficient.
Indeed that was not my question. As DuckieHo states, a CPU is a machine that consumes power - in this case electrical power - in order to perform work, and a by-product of that consumption is heat. A CPU is not a heater, since that is not its primary purpose.

I used the example of the digital audio amplifier, since this is the closest example I could think of in terms of a digital logic circuit doing some kind of work, consuming energy and wasting some of that energy as a by-product of its primary operation.

I suppose the only way to measure what a CPU actually does in terms of power consumption is to use a reference system as a baseline, with minimal RAM, a cheap (or on-board) video adapter, and a lightweight OS, and measure it loaded and unloaded using a Kill-A-Watt or similar device.
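The baseline-measurement idea above can be sketched as a simple difference calculation: measure wall power idle and under full CPU load, and take the delta (scaled by an assumed supply efficiency) as a rough CPU power estimate. All the numbers here are hypothetical placeholders, not real readings.

```python
# Minimal sketch of the baseline measurement approach described above.
# These readings are made-up examples, not actual measurements.
idle_wall_w = 85.0     # hypothetical Kill-A-Watt reading, system idle
loaded_wall_w = 172.0  # hypothetical reading, CPU fully loaded
psu_efficiency = 0.85  # assumed supply efficiency at this load

# Extra AC power drawn when the CPU goes from idle to loaded:
delta_wall_w = loaded_wall_w - idle_wall_w

# Rough DC-side estimate of the CPU's added draw:
approx_cpu_w = delta_wall_w * psu_efficiency

print(delta_wall_w, approx_cpu_w)
```

This only bounds the CPU's contribution, since fans, VRMs, and memory also draw more under load.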
As I said, it's 100 percent efficient given 100 watts of DC power at the voltage it needs. It all gets turned to heat besides the computing work, since heat is how science does its accounting. But there are efficiency drops at your motherboard's voltage regulation system, be it 3-phase or 6-phase, and efficiency drops at your power supply, which cause power factor distortion on the AC line.

So a 100-watt CPU behind a 98-percent-efficient voltage regulation system needs 102 watts. 102 watts from a power supply at a 0.85 power factor needs 120 watts from the AC line, and the switching circuitry adds further inefficiency beyond the power factor, so it could take as much as 128 watts or so to run that 100-watt CPU. But that is sort of blended into the rating: my 45-watt CPU, with a nearly 96-percent-efficient total power supply system, only uses about 36 watts max. The 45 W rating already takes into account a 0.80 power-factor supply, and the big ratings on the big CPUs take into account that you need 4- to 6-phase voltage regulation to run them.
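The chain of losses described above can be sketched numerically. The 98% VRM efficiency and 0.85 power factor are the post's assumed figures; each stage's input is simply its output divided by that stage's efficiency.

```python
# Sketch of the loss chain above: CPU -> motherboard VRM -> power supply.
cpu_w = 100.0          # DC power the CPU itself consumes
vrm_efficiency = 0.98  # assumed voltage-regulator efficiency
power_factor = 0.85    # assumed supply power factor

# Power the voltage regulators must draw to deliver 100 W to the CPU:
vrm_input_w = cpu_w / vrm_efficiency  # ~102 W

# Apparent draw at the AC line, per the post's power-factor reasoning:
line_va = vrm_input_w / power_factor  # ~120

print(round(vrm_input_w), round(line_va))
```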

So a 100-watt CPU will use at most 100 watts in less-than-ideal systems, and usually about 90 watts in most people's systems.

The same thing is true of a digital audio amp. A class T amp delivering 50 watts average (RMS) will take 50 watts from a battery, but if you run it off AC through a power supply it could easily take 65 to 70 watts, because it's harder to make an AC power supply efficient for an amplifier: the current loads change so much that your power factor on the AC line bounces all over, from 0.3 to 0.7 or so.