Something I think people here are forgetting is that while console graphics have improved over this generation's life-cycle, the resolution games are actually rendered at is much lower than what most PC gaming enthusiasts use, and it hasn't increased since I last looked. For example, MW2 was rendered at (I think) 1024x600, then upscaled to 720p/1080p: stretched to fit higher-resolution displays, but with lower image quality as a result. Picture a 600p image stretched to 1080p on a 32"-50" HDTV.
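For anyone who wants the actual numbers, here's a quick back-of-the-envelope pixel-count comparison (just Python spelling out the arithmetic; the MW2 figure is my recollection as above, not a confirmed spec):

```python
# Rough pixel-count comparison: why a 1024x600 framebuffer upscaled
# to 1080p looks soft. The MW2 resolution is as reported in the post.
resolutions = {
    "MW2 (reported)": (1024, 600),
    "720p":           (1280, 720),
    "1080p":          (1920, 1080),
}
base = 1024 * 600
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:>15}: {px:>9,} pixels ({px / base:.2f}x the render target)")
# 1080p has ~3.4x the pixels of the 1024x600 render target, so the
# scaler is inventing roughly 70% of what ends up on your screen.
```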
Then there is the lack of anti-aliasing on current-gen hardware (although I think there have been some workarounds with shader-based smoothing?).
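By "shader-based smoothing" I mean post-process techniques in the spirit of MLAA/FXAA: instead of rendering more samples like MSAA, a shader pass looks for high-contrast edges in the finished frame and blends them. Here's a toy sketch of the idea (the function name and threshold are made up for illustration; real implementations are far smarter about edge shapes and blend directions):

```python
import numpy as np

def toy_postprocess_aa(img, threshold=0.1):
    """Toy post-process AA: detect high-contrast edges via luma
    gradients, then blur only those pixels. img is an HxWx3 float
    array in [0, 1]."""
    # Approximate perceived brightness (luma) per pixel.
    luma = img @ np.array([0.299, 0.587, 0.114])
    # Horizontal/vertical luma differences mark aliased edges.
    gx = np.abs(np.diff(luma, axis=1, prepend=luma[:, :1]))
    gy = np.abs(np.diff(luma, axis=0, prepend=luma[:1, :]))
    edges = (gx + gy) > threshold
    # Cheap 3x3 box blur, applied only on the detected edge pixels.
    blurred = sum(np.roll(np.roll(img, dy, 0), dx, 1)
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    out = img.copy()
    out[edges] = blurred[edges]
    return out
```

The appeal for consoles is obvious: one cheap full-screen pass instead of the memory and bandwidth cost of rendering extra samples, at the price of some overall softening.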
New graphics engines have allowed gradually better-looking textures and other graphical features on console hardware (consider Forza 4's lighting system), but there have been a hell of a lot of optimizations in this area, where some texture resolutions have been lowered just to keep things running smoothly, and that can result in some horribly low-quality images.
I think what I've written above covers some of the reasons why console games still look "almost" as good as PC games on such old hardware, but there are now plenty of PC games that are ridiculously better in the graphics department.
As a test, play some games on PC at 1280x720 with medium graphical quality and no AA/AF, using a 360/PC controller. You will most likely think you're playing on a console.
The other thing is that, as someone else has said, most console gamers don't actually care how much faster your PC is than their console. What they see on the console is what they are given, and they appear happy with that; they don't pine for faster performance or higher frame rates, or even consider what resolution a specific game is actually rendered at. They've got an HDTV and a shiny new console, woop! And it ends there, it seems. Anything extra feels like a luxury: Facebook and YouTube being added to the 360 was like **** amazing to most console gamers.
There may be more to this but I'll sit back for a bit now.
If I had kept my 7-year-old PC from back when I had a 1950XTX GPU, just added a quad-core CPU, and played at 720p with no AA to 2xAA, no AF, and medium settings, would I still be playing all the latest games? (Someone in this thread linked a 1950XTX playing Crysis 2.) YES.
The other point about the whole console graphics thing is that during optimization, image quality, texture resolution, and other graphical effects have to be limited because of the hardware. As a result, developers have had to come up with innovative methods to keep improving the look of console games, which is actually a positive.
What I cannot understand is why you would use a GPU that will be two generations old by the time it ships in the new console. I suspect this news article is speculation, and that they will really use a higher-end part like the 6950/70s or 7950/70s, but with lower clocks and custom-made for the new console to suit costs.