Overclock.net › Articles › Understanding How Computers Use Electrical Power: A Brief Explanation

Understanding How Computers Use Electrical Power: A Brief Explanation

 

 

 

 

Please Note: This is a copy of the thread I posted in October 2012. It has been copied to the Articles section, as this is a more appropriate section for it.

 

Understanding How Power is Used in a Computer
Hello fellow OCN’ers! Due to the large number of threads I have read recently regarding power supplies, component power draw, and questions on power distribution, I have decided to do a writeup on the ways that power is used in a computer. For the sake of this article, I will be using the 120-volt United States electrical standard, though the same principles certainly apply for those on 240-volt power. Unless otherwise stated, "AC power" refers to 120-volt alternating current and "DC power" refers to the lower-voltage power that the computer's PSU outputs. So if you’re confused about 12-volt rails and how power gets from 120 volts down to those voltages, read on!

So let’s dive in.

One way of looking at a computer (certainly not the only way) is as a device that converts electrical power and user inputs into outputs of some kind. Almost all desktop computers and servers that we encounter here on OCN or in the workplace run on electrical power from the wall, which is “stepped down” (or we could simply say “converted”) from 120 volts (even though it’s normally closer to 110, but that’s a topic for another time) to the voltages that a computer needs to run on: 12 volts, 5 volts, and 3.3 volts. This is a power supply’s primary function: converting line-level power (that is, 120 volts) to the lower voltages that a computer can use.

“But why don’t we just make everything inside the computer run on 120 volts so there’s no need for a power supply?” you may ask. The answer is the delicate nature of the components we’re dealing with here. Most processors run on less than 1.5 volts, and applying higher voltages will often destroy the component, especially if done for any length of time.

So it’s important to use a Power Supply Unit (PSU) to get the voltage down to 12 volts for the system to use. But why? What is voltage anyway?

Voltage is otherwise known as “electrical potential difference.” One incomplete but useful analogy for electricity is water flowing through a pipe. When power comes out of your wall, it arrives with the metaphorical force of a fire hose and would destroy anything it came into contact with in your computer. The power supply acts like a wall that the incoming blast of water hits, absorbing much of that force so the flow can be used by the components. In electrical terms, the power supply takes a small number of electrons with a HUGE amount of energy per electron and converts them into a HUGE number of electrons with a lower amount of energy per electron. (Electric current is just a bunch of moving electrons.)

Notice that the total energy is essentially the same. For a simple example, let’s say 10 electrons come into the PSU per second at 120 volts each. The PSU takes the voltage down to 12, 5, or 3.3 volts per electron. Consider the 12-volt output: each electron has fallen from 120 volts to 12, so it has dropped in energy by a factor of 10. The PSU will then send out 100 electrons per second at 12 volts to the various components of the PC. (Note: electrons per second is a stand-in for amps here. Current is actually defined in different units, such as coulombs per second, but for this example the comparison is a fair one.)

“Hey! My PSU says it’s 80% efficient and you’re telling me that 90% of the power gets wasted??”

Good question! Efficiency is indeed a big selling point of PSUs, so let’s take a look at this. In the example, notice that 100 electrons × 12 volts per electron = 1200, and 10 electrons × 120 volts per electron = 1200. So the amount of energy is the same; you haven’t fundamentally lost anything. In this example, we have a 100% efficient power supply. Note that this is physically impossible in a real PSU, and some power will always be lost as heat.
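The arithmetic above can be sketched as a quick check. This is just a toy model of the article's own illustrative numbers (the 10-electrons-per-second figure is a stand-in for amps, not a real current), with efficiency added as an optional factor:

```python
def output_current(input_volts, input_amps, output_volts, efficiency=1.0):
    """Conservation of power: V_in * I_in * efficiency = V_out * I_out."""
    input_power = input_volts * input_amps     # watts coming in from the wall
    output_power = input_power * efficiency    # watts surviving the conversion
    return output_power / output_volts         # amps delivered on the output rail

# The article's idealized, 100%-efficient example:
print(output_current(120, 10, 12))        # 100.0 amps out on the 12 V rail

# A more realistic 80%-efficient supply loses the rest as heat:
print(output_current(120, 10, 12, 0.8))   # 80.0 amps out
```

Dropping the voltage by a factor of 10 multiplies the current by 10, and the input and output power match (minus whatever the efficiency factor eats).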

“Wait, did you say that 10 electrons per second go in, but 100 per second come out? How is that possible?”

Yes, that’s a little bit confusing, isn’t it? Without diving too deeply into this, the PSU uses something called a transformer. A transformer uses magnetic fields to change power from one voltage to another: through a principle called electromagnetic induction, current flowing in one coil of wire induces a current in a neighboring coil. Yeah, you don’t want to go much further than that… it’s a little too in-depth for this guide. However, do not fear! A fellow OCN member has come to the rescue. I am, with his permission, going to quote him in this article:

 

Originally Posted by PR-Imagery: 
"The simplest way to word this would be, that when you step down the voltage, the current(amount of electricity) goes up or in the case of your analogy the number of electrons. You half the voltage, you double the current.

So the power supply is drawing 120v at say 10amps (or electrons), the transformer steps that down to 12v at 100amps (electrons).
Here we've dropped the voltage by a factor of 10, which would multiply the current/ number of electrons by 10."

 

Today’s power supplies are FAR more efficient than past designs, and are called “switching power supplies.” For this guide, I won’t go into everything that means, because THAT’S a whole ’nother can of worms. If you ARE interested, BSLSK05 has posted a link that explains them in more depth.

 

SO WHO CARES??



Another good question! You seem to be full of ‘em today.

After the power has been stepped down to 12 volts by the PSU, it is fed directly into the processor for use, right?

NOPE!! That would kill your processor in about a microsecond. (A really short period of time!)

So we have ye olde VRMs. A VRM is a voltage regulator module. Essentially, it’s another stop along the electricity’s journey to your processor. These little guys work in teams (usually four or eight at a time) to step the voltage down again from 12 volts to somewhere in the neighborhood of 1.2 to 1.5 volts, depending on how you have your BIOS set. These guys, however, do not have the benefit of a large cooling fan and a spacious enclosure with big components like your PSU. And as such, they get HOT. Like friggin’ ridiculously hot. Like burn-off-your-finger and-

Well, maybe not that hot. But they still get pretty hot.

Graphics cards have VRMs on them too. They are fed 12-volt power through the PCIe 6-pin and 8-pin connectors, which they then step down to about 1 volt. Again, overclocking can change this to a different value, but do so at your own peril! (Unless you have an Nvidia 600-series card, and then you’re SOL when it comes to changing voltages.)

For individual processor power consumption, I’m going to quote from an OCN thread several months back:

Article about new Intel Processor: Link here

Quote:
When in 'turbo' mode, at these speeds something of a relative concept considering the top speed of the NVT process is under a gigahertz, the chip runs on just 1.2V.

OCN forum member:

Quote:
Does that mean it takes 1.2V to power up to 1GHz? That doesn't make much sense since it only takes 1.325V for me to power my 2500k at 4.5GHz. Maybe it has something to do with the Pentium architecture?

My answer:

Quote:
That article is quite… vague when it comes to actual power use. To determine power use, we need work per second, or in the case of electronics, volts multiplied by amps.

So, a chip with a VCORE of 1v that draws 100 watts of power is pulling 100 amps of current. (Remember, volts and amps BY THEMSELVES don’t tell you the power consumption.) Simple, right?

Or a chip running at 1.5 volts and drawing 50 amps will use 75 watts. Got it?

In order to determine power usage, we need volts AND amps. The article neglected to include that. At a guess, I would say that the 1.325 volts of your 2500k is likely at a load of 70 amps or more. These chips? Maybe they only draw an amp or two at 1.2 volts, meaning just a couple of watts.
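All of the worked examples in that answer reduce to the same formula, P = V × I. A minimal sketch, using the article's own numbers (the 70-amp figure for the 2500k is the author's guess, not a measured value):

```python
def power_watts(volts, amps):
    """Electrical power is voltage times current: P = V * I."""
    return volts * amps

print(power_watts(1.0, 100))   # 100.0 watts: a 1 V core pulling 100 A
print(power_watts(1.5, 50))    # 75.0 watts: 1.5 V at 50 A
# The article's guess for a 2500k at 1.325 V and ~70 A:
print(power_watts(1.325, 70))  # roughly 93 watts
```

Neither number alone tells you anything: a high-voltage, low-current chip and a low-voltage, high-current chip can burn the same wattage.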

Part 2: Other components, TDP, and AC to DC conversion

Due to some very helpful and valid feedback from fellow forum members, I’m making an addition to this guide. Thanks for your kind words, guys.

So in part one, we covered the fact that a PSU converts power from the 120-volt line level down to 12 volts so that a computer can use it. Now let’s look at how the computer’s other parts use that power.

The two major power consumers in a standard desktop computer are the CPU and the graphics card. Both of these are, as we previously saw, powered by the 12-volt rail, and that voltage is then down-converted by the VRMs to somewhere between 1 and 1.5 volts for these components to use. However, not everything in a computer runs on the 12-volt rail. Hard drives draw most of their power from the 12-volt rail to spin the disk, but also use some of the 5-volt rail to move the read/write head. (Thanks to Phaedrus for this tidbit.) The 3.3-volt rail isn’t used for much anymore; some SSDs run on it, but these are quite rare.

At this point, we must stop and understand an extremely crucial law that dictates how power works: The First Law of Thermodynamics. It states, for our purposes, that energy (in this case, electrical power) cannot be created or destroyed. All power that is fed into a computer is converted into work and heat.

I’m going to stop and say that again because it is incredibly crucial to understand: ALL power that is fed into a computer is converted into work and heat. (The VAST majority goes to heat.)

Let’s take the processor (CPU) for example. If I have an Intel Core i5-2500k that is under load and consuming 80 watts of power, then 99.9999999% of that will eventually be converted to heat. And here’s why: the processor is not moving anything anywhere, so almost no mechanical work is being done. All that is happening is billions of tiny switches firing back and forth, consuming power and creating heat at the same time. When you run your computer for a few hours and then shut it down, nothing has fundamentally changed. Say you played a video game for a while and (obviously) used your processor in that game. It has completed BILLIONS and billions of calculations, but PHYSICALLY nothing has changed. The GPU is doing the same thing, as are the RAM and all the chips on your motherboard. They are calculating, storing, changing, charging, and manipulating data, but all of that energy is eventually lost as heat.

“AHA!! WHAT ABOUT THAT 0.0000001%!!”

I knew you were going to bring that up… Some of the power DOES get lost in other ways, such as chemical energy (various cables may oxidize, and things like that), but the OVERWHELMING majority of the power is turned into heat. Heat, heat, heat: that’s where almost all your power goes.

Some exceptions:

* Hard drives spin a disk, and while that energy DOES end up as heat, they must also overcome air resistance and mechanical friction.
* Fans in the system convert electrical power to mechanical energy to move air molecules, and pumps move water around a water-cooled system.
* Still, these draw maybe 5-10% of the power in a system, and only that much in an extreme case.
* Lights in the system convert electrical power into EMR (electromagnetic radiation), but even they lose most of it as heat.


TDP

A quick note on TDP. TDP stands for “Thermal Design Power”. Let me say this once and only once…
TDP IS NOT THE SAME AS ACTUAL POWER CONSUMPTION

Okay, glad I got that out of my system. TDP is NOT the component’s power consumption. TDP is the maximum amount of heat that the component is expected to generate. In short, it tells manufacturers of cooling solutions how capable their coolers need to be: a ballpark estimate of the most heat that could, under reasonable circumstances, be released by a component. For example, Cooler Master certifies the Hyper 212 CPU cooler up to 150 watts TDP (Link). That means Cooler Master has said their cooler can remove 150 joules of heat per second from the CPU it is attached to; in other words, it can in theory handle a CPU that is dissipating 150 watts as heat.

So please, let’s not assume that a 95-watt-TDP CPU will use 95 watts. Now, when we overclock components (cuz that’s how we ROLL here on OCN!!), it can really crank up the heat, and all bets are off compared to what the manufacturer says. When Intel says a chip has a 95-watt TDP, they certify that at the CPU’s maximum STOCK turbo frequency. The 2500k we looked at earlier turbos up to 3.7GHz, and Intel is saying that CPU needs a cooler capable of moving 95 joules of heat away every second. (For reference, one watt = one joule per second.) So enough on that point.
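Since a watt is a joule per second, a cooler's TDP rating can be compared directly against a chip's expected heat output. A minimal sketch; the function name and the 180-watt overclocked figure are my own illustrations, not vendor specifications:

```python
def cooler_is_adequate(cooler_tdp_watts, cpu_heat_watts):
    """A cooler rated for N watts TDP can remove N joules of heat per second,
    so it keeps up as long as the chip dissipates no more than that."""
    return cooler_tdp_watts >= cpu_heat_watts

# Hyper 212 (rated 150 W) vs. a stock 95 W TDP chip: plenty of headroom.
print(cooler_is_adequate(150, 95))   # True
# A heavy overclock can push heat output past what the rating assumes:
print(cooler_is_adequate(150, 180))  # False
```

The comparison only holds at stock settings, which is exactly the article's point: once you overclock, the chip's real heat output can exceed the TDP number the cooler was certified against.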

AC to DC Power conversion

Earlier in the guide, we touched on the fact that PSUs convert 120 volts down to 12, which is then stepped down further to about a volt or a volt and a half for the CPU and GPU by their respective VRMs. But what else is going on? All electricity that we come into contact with on a daily basis is either AC or DC power: Alternating Current or Direct Current. Their differences are many, but we’re going to focus on just a few of them today.

Alternating current is what the power company produces. It is what comes out of your wall at “line level,” or 120 volts. Alternating current does not flow in a single direction; in the United States, it completes 60 full cycles per second (60 Hz), reversing direction with each half-cycle. The big practical advantage of AC is that transformers can step its voltage up and down cheaply, which makes it far more economical to send over long distances: transmitting at very high voltage means very little power is lost as heat in the lines on the way to your house. AC at line voltage is also dangerous, and caution should be heavily exercised when working around AC wires. On second thought, unless you KNOW what you’re doing, just call an electrician. Please?
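Why high-voltage transmission matters can be shown with resistive line loss, P_loss = I² × R. This is a toy comparison with made-up numbers (10 kW delivered over a line with 1 ohm of resistance), and it ignores the inductive and capacitive effects real AC lines also have:

```python
def line_loss_watts(power_delivered_watts, line_volts, line_resistance_ohms):
    """Resistive transmission loss: current I = P / V, loss = I^2 * R."""
    current = power_delivered_watts / line_volts
    return current ** 2 * line_resistance_ohms

# Pushing 10 kW down a 1-ohm line at wall voltage vs. transmission voltage:
print(line_loss_watts(10_000, 120, 1))      # ~6944 W lost -- hopeless
print(line_loss_watts(10_000, 12_000, 1))   # ~0.69 W lost -- negligible
```

Raising the voltage by a factor of 100 cuts the current by 100 and the resistive loss by 10,000, which is why the grid transmits at tens or hundreds of kilovolts and steps down near your house.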

Direct current is simpler than AC. It does not cycle at all; it is a steady, constant flow of electrons in a single direction. Computer components cannot run on AC power and thus require DC current to work. Transmitting DC power over long distances has historically been impractical, but the power lost from your PSU to your components (maybe three feet of cable at most) is essentially negligible. For reasons that go beyond the scope of this guide, the sensitive electronics inside a computer need their power to be constant, not cycling back and forth, so PSUs have to provide DC current for the components to operate. Using transistors, capacitors, and more transformers, they are able to accomplish this task. I am going to post a picture here that I shamelessly stole from a blog I found.



[Image: AC vs. DC voltage waveforms over time. Source Link]

As you can see, DC is flat, while AC is sinusoidal.
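The shapes in that picture can be reproduced numerically. One detail worth knowing: the 120 V figure is an RMS (average-effect) value, so the actual AC peak is about 170 V. A small sketch of both waveforms:

```python
import math

def ac_volts(t_seconds, v_rms=120.0, freq_hz=60.0):
    """US wall power: a 60 Hz sine wave whose peak is v_rms * sqrt(2), ~170 V."""
    return v_rms * math.sqrt(2) * math.sin(2 * math.pi * freq_hz * t_seconds)

def dc_volts(t_seconds, level=12.0):
    """A DC rail is just a flat line over time."""
    return level

print(round(ac_volts(0), 2))        # 0.0  -- crossing zero
print(round(ac_volts(1 / 240), 2))  # 169.71 -- the peak, a quarter-cycle in
print(dc_volts(0.5))                # 12.0 no matter when you sample it
```

Sampling `ac_volts` across a full 1/60th of a second traces out the sinusoid from the picture; `dc_volts` never changes, which is exactly what the computer's components need.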

Okay, folks, that does it for part two! Part three will include topics such as ripple and PSU efficiency. That’s all for now!

So I hope this guide has been of some use in explaining what’s actually going on inside our little boxes of magic. If you have any questions, objections, or things that should be added, please post them in the thread. Thank you for your time in reading this.

 

Comments (7)

re-submitted after correcting two spelling errors and adding a bit of hearsay at the end.

Nice article but being a pedant I have to point out a couple of things.

"The advantages of AC power is that it is VASTLY more efficient to send over long distances" that was never true (AC transmission suffers from inductive and capacitive losses plus resistive losses while DC transmission only has resistive loss). Up until recently the transforming of very high voltages and currents used in long distance DC power transmission was prohibitively expensive, but now it makes economic sense for a few very long high power lines. It is still a very expensive option but due to the lower loss it is worth it in these few cases.

"AC is also far more dangerous" despite what T. Edison tried to make us believe this is not true.
In near lethal situations you have a slightly better chance of releasing your grip on a live wire if the current it carries is AC rather than DC (of the same voltage). I suspect the author is using AC as a euphemism for 120V AC. I have been told (just hearsay of couse) that 120V AC is more likely to produce heart fibrillation than 240V AC (the current being double is more likely to clamp the heart than send it into fibrillation, which in the short term is preferable but still lethal if sustained).

Best regards
Informative and interesting.
Great guide, very informative.
Very nice work!
I think it would be better to specify that VRMs are generally called power phases.
I'm very interested in this article, waiting for part three!
Coming from a novice, I enjoyed the article. I will be reading it again and again.
Thanks for your work..
Awesome read, thanks.