(I started this thread a few days ago in order to learn more about appropriate CPU/GPU power for a very basic PC. After not having received much feedback and with the purchase date rapidly approaching, I began to think of this question in different terms: power efficiency over lifetime. This seemed interesting enough to start a new thread about, and it also makes sure I'm not doing it wrong!!)

I'm putting together a budget build for a family member and I was having a hard time deciding on an appropriate CPU. Right now I'm torn between the Intel Core i3-2100 and the AMD Phenom II x4 840, and I got to thinking about long-term power efficiency and its effect on the Total Cost of Ownership of a given CPU. The best I could do to find comparable numbers was this article, which has both the i3 and the 840 on its power consumption page and uses the same methods and hardware for both. The numbers:

**i3-2100**, ASRock Pro3-M Z68 - $220.48, 56w idle, 86w load
**x4 840**, GIGABYTE GA-880GM-D2H - $142.40, 91w idle, 166w load

As we can see, the i3 is more expensive but more efficient. I want to know how this plays out over an expected usage lifetime of four years. My knowledge of 'electricity math' is limited, so check my numbers! I saw somewhere that the 2010 national average is 9.88¢/kWh, so that's what I'll use. Here's what four years of idling gives us:

**i3 -** $194.00 = 1,963.58 kWh @ 9.88¢ each
**840 -** $315.25 = 3,190.82 kWh @ 9.88¢ each

Wow, we can really see the effects of power efficiency magnified over time! How about a more extreme example: four years at full load.
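For anyone who wants to check the arithmetic, it's just watts × hours × rate. A quick Python sketch (the function and variable names are mine, and I'm assuming 365.25 days per year, since that's what reproduces the figures above):

```python
RATE = 0.0988                  # $/kWh, 2010 US national average
HOURS_4YR = 24 * 365.25 * 4    # four years of continuous operation

def four_year_cost(watts):
    """Electricity use and cost of a constant draw over four years."""
    kwh = watts / 1000 * HOURS_4YR
    return kwh, kwh * RATE

for name, idle_w in [("i3-2100", 56), ("x4 840", 91)]:
    kwh, dollars = four_year_cost(idle_w)
    # matches the $194.00 / $315.25 idle figures above
    print(f"{name}: {kwh:,.2f} kWh -> ${dollars:,.2f}")
```

The same function with the load wattages (86w and 166w) reproduces the full-load numbers too.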

**i3 -** $297.93 = 3,015.50 kWh @ 9.88¢ each
**840 -** $575.08 = 5,820.62 kWh @ 9.88¢ each

Granted, full load for four years is... unlikely... but it makes for a neat example. Now, what would the TCO of these chips be for a typical family machine? It's only on for, let's say, an average of 7 hours a day, and average CPU load is going to be 15% at most on either of these chips. By my (hopefully correct!) approximation, that takes the effective power consumption of the **i3** to 60.5w and the **840** to 102.25w. What would our electricity cost look like over four years?

**i3 -** $61.13 = 618.73 kWh @ 9.88¢ each
**840 -** $103.32 = 1,045.71 kWh @ 9.88¢ each

So now we come down to it - Total Cost of Ownership, or the cost of purchase added to the cost of use. Will the power efficiency of the i3 win out, or will the significantly cheaper 840 retain the lead for this light workload?
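The 15% blend works out as idle draw plus 15% of the span between idle and load. Here's that step plus the 7 h/day cost and TCO, sketched in Python (names are mine):

```python
RATE = 0.0988              # $/kWh, 2010 US national average
HOURS = 7 * 365.25 * 4     # 7 hours/day over four years

def blended_watts(idle, load, duty=0.15):
    # assume draw sits at idle plus `duty` of the idle-to-load span
    return idle + duty * (load - idle)

for name, idle, load, price in [("i3-2100", 56, 86, 220.48),
                                ("x4 840", 91, 166, 142.40)]:
    w = blended_watts(idle, load)
    power_cost = w / 1000 * HOURS * RATE
    # matches the $281.61 / $245.72 totals below
    print(f"{name}: {w} W blended, ${power_cost:.2f} power, "
          f"${price + power_cost:.2f} TCO")
```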

**i3 -** $281.61
**840 -** $245.72

I'll admit that I'm quite a bit surprised; I thought that the i3 would catch up to the 840 much more quickly. Obviously heavier lifting and longer uptime favor the i3, but how long would it take the i3 to surpass the 840 in terms of cost over time?

**Idle -** about 2 years, 210 days
**Load -** about 1 year, 46 days

(Both assume the machine sits in that state 24/7.) So, did I do my math right?
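One way to sanity-check those crossover points: divide the i3's $78.08 platform price premium by its per-hour electricity savings in each state, again assuming the machine runs 24/7 in that state (names are mine):

```python
RATE = 0.0988                         # $/kWh, 2010 US national average

price_diff = 220.48 - 142.40          # i3 platform premium in dollars
for label, i3_w, amd_w in [("idle", 56, 91), ("load", 86, 166)]:
    saving_per_hour = (amd_w - i3_w) / 1000 * RATE   # $/h the i3 saves
    hours = price_diff / saving_per_hour
    # prints ~941 days at idle, ~412 days at full load
    print(f"{label}: {hours:,.0f} h (~{hours / 24:,.0f} days)")
```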

Note: This is obviously quite simplified and was mostly done for my own amusement. I'm aware that the i3 will complete more work 'per load' and thus increase its efficiency advantage further, but I judged it an unnecessary complication in this instance. Additionally, AMD's integrated GPU for the 840 is weaker than the Intel HD 2000, which may necessitate a discrete card for certain tasks and thus tilt the balance in the i3's favor even more.

Which of these would *you* get for a basic home office/web machine (if either)? Is the HD 4250 graphics sufficient for these tasks?