There's no need for two graphics cards these days unless you go insane with resolution, like with Eyefinity perhaps, and even then I wonder...
One Radeon 5870/6970 or GeForce GTX 570/580 is, in my experience, a top performer for everything you throw at it at 1920x1080 with maximum anti-aliasing, maximum settings, etc. (for any game I play). That's why I've always thought SLI and Crossfire are just there to make NVIDIA and AMD more revenue at the end of the year (there, I said it!).
A decent enough quad core CPU such as an Intel Sandy Bridge Core i5-2500K is more than enough to eat any game you throw at it. Add to that 4GB to 8GB of DDR3 RAM and you're sorted.
A setup like that doesn't run you down much at all. The problem comes in when you need constant performance enhancements for things such as rendering (3ds Max, in my case). Either that or you get an upgrade OCD which REQUIRES you to get the latest and greatest every single time, without any sense of what you're actually getting extra. People who bought two GeForce GTX 580s together at launch pretty much sum up this OCD bit. If anything, you'd buy a second card if and when you notice just one card lags you (as if it would, especially when overclocked!).
So yeah, in conclusion, I very much agree - for gaming there isn't much point. My brother is very, very good at this because he upgrades every four years or so. He recently got an Intel Core i5-2500K system with 8GB of DDR3 RAM and a Radeon 5850 (he bought the card a few months back, before the 6000 series came out). He has now bought an SSD as well, so I'd guarantee his games won't lag for another four years or so.
I'm weaker than him. I don't upgrade graphics cards often, because new DirectX features are rarely used these days and decent cards are top performers for years (without SLI or Crossfire, too!). CPU-wise, however, I like to keep up quite a bit for 3ds Max. Watercooling is also an expensive hobby to get into, but it's cool!

Edited by Gib007 - 3/23/11 at 4:20am