I don't know about you, but anytime I look at a PC game's packaging, I make sure to check out the "Minimum System Requirements" that the game's publisher prints. While I'm confident my dedicated gaming rig (no longer top-of-the-line, by the way) and LCD monitor combination can handle most games reasonably well, it's always interesting to see how taxing a game can be on one's PC.
Logically enough, the newest games are all optimized to take advantage of the latest graphics hardware. I've often thought that, in the world of personal computing, hardware is far ahead of software. In the gaming segment of the market, however, that gulf is shrinking at the very top end. In other words, to see the very best a game can offer, the consumer needs a truly top-flight system with surplus memory and processing power, as well as a maxed-out graphics subsystem. The higher performance ceilings that the newest games demand also push up the bottom end of the scale: minimum system requirements rise along with them.
The bottom line is this: To play the newest games, you need a considerably more powerful rig than you ever did in the past. A distinction must also be made between "playing" a game and "enjoying" it. Almost any system can play a game, even if that means crawling along at 1-5 FPS (frames per second); to actually enjoy one, most experts specify a minimum frame rate of around 35 FPS. And it's absolutely true that the more "eye-candy" (advanced visual features such as anti-aliasing and anisotropic filtering, which improve the quality of the rendered image) you can run without taking an FPS hit, the more enjoyable a game's visual experience is.
(I am deliberately ignoring a game's design outside of its visual aspect, as design has less to do with hardware than the graphics do.)
So why am I writing about this? Well, perhaps nobody has stopped to think about it, but the inevitable evolution of hardware has a most unfortunate side-effect: It has made the latest and greatest games somewhat inaccessible to the majority of interested users. Sure, there will always be a tiny part of the market with the means to keep chasing the newest upgrades, but they represent the smallest slice of it. Far more numerous are the people who must temper their enthusiasm with some fiscal responsibility. Most enthusiasts, I would say, simply cannot afford to keep on keeping up.
And so to the point of this whole discourse: Wouldn't it make far more sense to keep system requirements universally accessible, instead of catering to the highest technological tier of the market? Make a game that demands less from people's computers, and you obviously reach a much wider audience. A wider audience means more sales, and therefore more money.
Instead, system requirements have been elevated to the point where the consumer really has to pay to play. While some people can justify frequently budgeting for a total system upgrade just to play the latest killer game, I think this is a foolish philosophy.
I guess I like having my money in my pocket too much...
As always, thanks for reading. Comments and discussion are always welcomed.