Bill Gates has often said that over time, the cost of computer hardware approaches zero. Here's one such example:
Ten years out, in terms of actual hardware costs you can almost think of hardware as being free.
History has proven him right. Computer hardware isn't literally free, of course. But it's effectively free relative to the level of computing power you're getting for your dollar. What does it mean when computer hardware is effectively free, and getting even more free every day?
For one thing, computer software starts to look incredibly expensive. But let's put aside the ratio of software cost to hardware cost for now.
If you're Google, or any other company building out massive datacenter farms, cheap hardware is a strategic advantage. It means you can build larger and larger datacenters for less money. But computers, however small and cheap they get, still require electricity to operate, and that creates a new problem. The electrical power used to drive all that nearly-free hardware you've amassed becomes your greatest expense:
Over the last three generations of Google's computing infrastructure, performance has nearly doubled, Barroso said. But because performance per watt remained nearly unchanged, that means electricity consumption has also almost doubled.
If server power consumption grows 20 percent per year, the four-year cost of a server's electricity bill will be larger than the $3,000 initial price of a typical low-end server with x86 processors. Google's data center is populated chiefly with such machines. But if power consumption grows at 50 percent per year, "power costs by the end of the decade would dwarf server prices," even without power increasing beyond its current 9 cents per kilowatt-hour cost, Barroso said.
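Barroso's arithmetic is easy to check with a short sketch. The 9 cents per kilowatt-hour rate and the $3,000 server price come from the quote above; the 750-watt baseline draw (server plus its share of cooling) is an assumed figure for illustration, not from the article.

```python
# Cumulative electricity bill for one server over several years,
# with per-year compounding growth in power draw.
# Assumptions: 750 W initial draw (illustrative), 9 c/kWh (from the article).

HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def cumulative_power_cost(initial_watts, yearly_growth, years, dollars_per_kwh=0.09):
    """Total electricity cost over `years`, with draw growing `yearly_growth` per year."""
    total = 0.0
    watts = initial_watts
    for _ in range(years):
        kwh = watts * HOURS_PER_YEAR / 1000.0
        total += kwh * dollars_per_kwh
        watts *= 1.0 + yearly_growth
    return total

four_year = cumulative_power_cost(750, 0.20, 4)
print(f"Four-year bill at 20%/yr growth: ${four_year:,.0f}")
```

Under these assumptions the four-year bill comes out just over $3,000 -- roughly the purchase price of the server itself, which is Barroso's point.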
Computer hardware costs may be approaching zero, but power costs are fixed -- or rising. The thirst for power in the face of increasingly large datacenters has driven Google to build datacenters in out-of-the-way places where power costs are low:
Google, for example, has watched its energy consumption almost double during the past three generations of upgrades to its sprawling computing infrastructure. It recently unveiled a major new datacenter site in a remote part of Oregon, where power costs are a fraction of those at Google's home base in Silicon Valley. But cheap power may not be enough. Last year, Google engineer Luiz André Barroso predicted that energy costs would dwarf equipment costs -- "possibly by a large margin" -- if power-hungry datacenters didn't mend their ways. Barroso went on to warn that datacenters' growing appetite for power "could have serious consequences for the overall affordability of computing, not to mention the overall health of the planet."
Google doesn't just build their own servers. They build their own power supplies, too:
The power supply to servers is one place that energy is unnecessarily lost. One-third of the electricity running through a typical power supply leaks out as heat, [Urs Hölzle] said. That's a waste of energy and also creates additional costs in the cooling necessary because of the heat added to a building.
Rather than waste the electricity and incur the additional costs for cooling, Google has power supplies specially made that are 90% efficient. "It's not hard to do. That's why to me it's personally offensive" that standard power supplies aren't as efficient, he said.
While he admits that ordering specially made power supplies is more expensive than buying standard products, Google still saves money ultimately by conserving energy and cooling, he said.
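The savings compound, because waste heat has to be cooled away too. Here's a back-of-the-envelope sketch comparing a typical supply (one-third of input lost as heat, per Hölzle) against a 90%-efficient unit. The 300-watt component load and the assumption of one watt of cooling per watt of waste heat are illustrative figures, not from the article.

```python
# Wall-socket draw needed to power a server's components, plus the
# cooling power spent removing the power supply's waste heat.
# Assumptions: 300 W load, 1 W cooling per 1 W waste heat (both illustrative).

def input_power(load_watts, efficiency):
    """Wall draw required to deliver `load_watts` to the components."""
    return load_watts / efficiency

def total_draw(load_watts, efficiency, cooling_per_waste_watt=1.0):
    """Wall draw plus cooling overhead for the supply's waste heat."""
    inp = input_power(load_watts, efficiency)
    waste = inp - load_watts
    return inp + waste * cooling_per_waste_watt

typical = total_draw(300, 2 / 3)  # one-third of input lost as heat
custom = total_draw(300, 0.90)    # Google's 90%-efficient supply
print(f"Typical supply: {typical:.0f} W, 90% supply: {custom:.0f} W")
```

Under these assumptions the typical supply burns roughly 600 watts end to end versus under 370 for the efficient one -- which is why the custom supplies pay for themselves despite the higher purchase price.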