This is one part that doesn't make much sense:
Originally Posted by article
The most expensive high-end cards should not automatically be the first choice for avid gamers! Instead, buying just as much performance as you really need for smooth frame rates (maybe with overclocking headroom) means that the money saved on a lower purchase price and energy consumption will make it much easier to afford yet another new graphics card next year. The high operational costs of an expensive flagship graphics card make this harder.
They rationalize that you ought to buy a lower-consumption card and then overclock it if you need the performance. Doesn't that undercut their own power-consumption argument? Once you start overclocking, the relationship between the extra power drawn and the extra performance gained is rarely linear; power tends to climb much faster than the clock speed does, because higher clocks usually demand higher voltage as well. Not to mention that you may be pushing the limits of the stock cooler in the process.
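As a rough illustration of why the scaling is so unfavorable, here's a back-of-envelope sketch using the common dynamic-power approximation P ∝ f · V². The clock and voltage figures below are made up purely for the example, not measurements of any real card:

[code]
def relative_power(f_new, f_stock, v_new, v_stock):
    # Dynamic power roughly scales with frequency * voltage^2
    return (f_new / f_stock) * (v_new / v_stock) ** 2

# Hypothetical example: a 10% overclock that needs a 7% voltage bump
stock_clock, stock_volt = 1800, 1.00   # MHz, volts (made-up numbers)
oc_clock,    oc_volt    = 1980, 1.07

print(relative_power(oc_clock, stock_clock, oc_volt, stock_volt))
# ~1.26 -> roughly 26% more power for a 10% clock bump (and usually
# less than 10% more real-world performance)
[/code]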
I think the article overlooks the fact that few enthusiasts, or even "avid gamers", care much about the actual electricity cost of powering a given graphics card; they care far more about overall heat output, temperatures, and the hassles those create. At least, that's the angle I'd take if my message were aimed at enthusiasts and the people actually interested in such cards.
It's cool to see actual numbers--regardless of how they were arrived at.
That being said, I've never really cared much about temperatures as long as I don't run into thermal throttling... And I don't care much about how much juice my computers use; it's nearly insignificant compared to the consumption of other household appliances.
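For context, a quick back-of-envelope estimate (all figures hypothetical: a 250 W card gaming two hours a day at $0.12/kWh) shows why the GPU barely registers on the bill next to the big household loads:

[code]
# Hypothetical back-of-envelope estimate, not measured data
gpu_watts      = 250        # assumed gaming draw of the card
hours_per_day  = 2
price_per_kwh  = 0.12       # assumed electricity price in USD

kwh_per_month  = gpu_watts / 1000 * hours_per_day * 30
cost_per_month = kwh_per_month * price_per_kwh

print(f"{kwh_per_month:.0f} kWh/month, about ${cost_per_month:.2f}")
# -> 15 kWh/month, about $1.80 (an electric dryer or water heater
#    can easily use several times that)
[/code]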