Originally Posted by keiths
> $50 difference is quite significant to most people.
> 50 USD means a lot to me.
At $0.10/kWh, $50 buys 500 kWh of electricity.
At load, an 8150 @ 4.8 GHz uses 273 more watts than a 2600k @ 5 GHz. http://www.bit-tech.net/hardware/cpus/2011/10/12/amd-fx-8150-review/10
Same goes for the 8120, since it's the same chip with a lower default clock.
Which works out to 1831.5 hours of load before the 8120 burns through its $50 lower purchase price.
That's a year and a quarter averaging 4 hours of use a day, a year and two thirds averaging 3 hours, etc.
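The arithmetic above can be sketched in a few lines (the $0.10/kWh rate and 273 W delta are the figures from this thread, not universal values):

```python
def break_even_hours(price_diff_usd, extra_watts, usd_per_kwh=0.10):
    """Hours of load until extra power cost equals the purchase-price gap."""
    extra_kw = extra_watts / 1000.0          # convert watts to kilowatts
    return price_diff_usd / (usd_per_kwh * extra_kw)

hours = break_even_hours(50, 273)            # ~1831.5 hours
years_at_4h_per_day = hours / 4 / 365.25     # ~1.25 years
```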
Reading comprehension isn't a strong point for you, is it?
The $50 price difference is between an FX 8120 and a 2500k.
It's a $100 price difference between an FX 8150 and a 2600k, and if you're comparing the FX 8120 to the 2600k, the difference is $150. By your own calculations I would need to run an FX 8120 4 hours a day for 3 and 3/4 years just to make up the price difference from a 2600k. Your point is pretty much useless once you use the correct models; by that time most people have bought a new PC anyway.
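Re-running the same break-even math with the corrected $150 gap (8120 vs. 2600k) backs up the "3 and 3/4 years" figure, assuming the thread's $0.10/kWh rate and 273 W load delta:

```python
def break_even_years(price_diff_usd, extra_watts, hours_per_day,
                     usd_per_kwh=0.10):
    """Years of daily use until extra power cost equals the price gap."""
    hours = price_diff_usd / (usd_per_kwh * extra_watts / 1000.0)
    return hours / hours_per_day / 365.25

years = break_even_years(150, 273, 4)   # ~3.76 years at 4 hours/day
```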
Not many people get an FX 8120 running at 4.8 GHz; it usually tops out at around 4.4 GHz to 4.5 GHz, and that's with water cooling. People who run water setups are not going to be worrying that much about $50.
Your points are way off base. Get your facts straight and learn to read so you understand what people are talking about.
Edited by tout - 6/11/12 at 6:00am