If someone were to ask you the question, "What is the single greatest advance in personal computing (up until now)?," what would your answer be?
There are so many possible ways to answer this question.
Some might nominate multi-core CPUs for the consumer market as the single greatest advance. Starting with the Pentium D Smithfield (May 2005), the multi-core CPU is a perfect expression of the "more is always better" philosophy. If one head is good, then two heads are twice as good. Especially as implemented by AMD in its Athlon64 X2 (and dual-core Opteron) line, adding a second core gives a system a mighty boost in performance at a relatively small cost in thermal output and power consumption. This is especially true given the still-growing acceptance of multi-core CPUs by software developers; more and more programs are being optimized for multiple cores, and the trend will likely continue. Greater efficiency, meaning more work done in less time, is the moving target all power users try to hit.
While multi-core CPUs are a relatively easy choice, a more contentious and questionable selection would be the constant advances in GPU technology. The technological landscape in graphics processing is vastly different in 2008 from what it was just two years ago. The advent of the DirectX10 API and Windows Vista has something to do with this, as does the growing demand for HD video output, but I don't think these are big enough advances over their predecessors. Graphics hardware is miles and miles ahead of the demands placed on it by software. Only a very few games can take advantage of all that power. It's a bit like owning a super-powerful street car, like the Bugatti Veyron. Sure, it has massive amounts of power and torque and potential performance, but without a proper road and legal permission to use everything that car has, it's a big waste, in my opinion.
(Thankfully, the GPU manufacturers now have enabled their products' power surpluses to do other things, such as folding.)
So, then, if it's not super-powerful CPUs or GPUs, what can be considered the single greatest advance in personal computing up until this time? My own nomination, actually, would be the proliferation of broadband internet access.
Just consider things for a moment before pronouncing me mad, fit to be institutionalized. For one thing, broadband, in all its flavors, is a massive leap forward from its predecessor. Dial-up internet is something I personally will never return to. Broadband is also almost universally accessible these days; there is a plethora of options available, and it's simply a matter of balancing performance requirements against the costs you're willing to pay. Not only that, but virtually all modern computers (and a great many machines two or three tech generations back -- say, back to the days of the original Pentium 4) have motherboards that support some kind of broadband connection onboard. I don't think you can ever say that about either CPUs or GPUs.
Furthermore, you can argue that everyone can actually use broadband internet access. Not everyone needs a multi-core CPU, and certainly not everyone needs a GPU on steroids. But anyone can avail themselves of a better, more enjoyable, more productive internet connection.
Of course, all this is just one person's opinion. I'd be glad to have you think about the question that spawned this blog entry, and contribute your own thoughts and opinions.
Thank you for reading!