Originally Posted by Code-Red
^ I'll completely agree with that. They (along with PC developers) are going to hit a point in the "uncanny valley" (arguably we've already hit it with computer-animated movies) where achieving better graphics will take exponentially more power than it ever has in the past. Not only that, but trying to fit faster and hotter components into a small case, with adequate cooling, while keeping costs down, AND making it reliable is going to get harder and harder.
The whole point of my argument, though, was that graphics-wise, consoles dominate the PC at their release and usually for a short while after. That was all I wanted to get across, because someone had rather pompously stated that computers have always been better, which is very much not the case.
I also agree with that, and it is also the reason it will be difficult for consoles to keep pace for long this time. Looking at the last generation, it was already rather difficult: the Xbox 360 had overheating problems, and the PS3, though it didn't suffer from the same issue, was one of the biggest consoles ever made, drawing a little over 200 W in its first revision.
Yes, definitely: consoles are able to match, quality-wise, what PCs actually take advantage of in the year they are released. Next year we will have the HD 8000 and GTX 700 series, which means the HD 7970/7950 and GTX 680/670 will effectively become mid-range cards, and that is roughly the level the next-generation consoles seem to be aiming for. Even with hardware that is probably only half as powerful as the current high-end cards, they will probably be able to match it at launch, but after that it will be a struggle.
I forgot about something that also benefited consoles this time around: they were able to match PCs for a very long time because both took the same approach to multi-core usage. Back in 2005, dual-core CPUs had just been released, and the consoles' multi-core CPUs only saw full use years later; the same happened on PCs. The first games for the Xbox 360 used one core, just as the PC versions did. So, for a few years, the CPU side of the equation kept consoles on par, as developers progressively took more and more advantage of multi-core CPUs on both platforms.

But now that we are here, the question is: how many threads can game developers effectively use? Having learned how to use 3 or 4 threads, does that scale to, say, 8, 12 or 16? Or is quad-core the most efficient approach? That is also an interesting question. AMD is pushing for more than 4 cores, even though their initial attempt was arguably a flop due to unbalanced performance in lightly threaded vs. highly threaded applications, but desktop CPUs are heading in that direction: six-core Intel CPUs will probably become affordable starting next year, with the entry-level 22nm IB-E probably being a six-core part, versus this year's Core i7-3820, and the year after that will continue the trend, with AMD releasing Steamroller and its much-needed performance improvements over the first Bulldozer.
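Just as a rough back-of-the-envelope illustration of why adding threads hits diminishing returns (my own hypothetical numbers, not taken from any actual game), here's a small C++ sketch applying Amdahl's law to a frame where 70% of the CPU work can be spread across threads:

[CODE]
#include <cstdio>

// Amdahl's law: speedup = 1 / ((1 - p) + p / n),
// where p is the parallelisable fraction of the work and n is the thread count.
double amdahl_speedup(double p, int threads) {
    return 1.0 / ((1.0 - p) + p / threads);
}

int main() {
    const double p = 0.70;               // hypothetical: 70% of a frame scales with threads
    const int counts[] = {2, 4, 8, 12, 16};
    for (int n : counts) {
        std::printf("%2d threads -> %.2fx speedup\n", n, amdahl_speedup(p, n));
    }
    // Prints roughly: 1.54x, 2.11x, 2.58x, 2.79x, 2.91x --
    // each doubling of threads buys less than the one before it.
    return 0;
}
[/CODE]

Of course, the more of the engine developers manage to parallelise (physics, particles, AI), the higher that fraction gets and the more worthwhile 8+ cores become.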
With more cores you can also invest a lot more in A.I., making games not only look but also feel a lot more realistic. I remember someone from Valve saying that, starting with quad cores, things got very interesting from that point of view, so I wonder what developers will be able to do with even more CPU cores.
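To illustrate the kind of thing that makes A.I. a good fit for extra cores (this is purely my own hypothetical sketch, not anything Valve actually does), agents can often be updated independently within a frame, so the work splits naturally across however many hardware threads are available:

[CODE]
#include <algorithm>
#include <cstddef>
#include <future>
#include <thread>
#include <vector>

// Hypothetical AI agent: each one decides its next move independently.
struct Agent {
    float x = 0.0f, y = 0.0f;
    void think(float dt) { x += dt; y -= dt; /* pathfinding, target selection, ... */ }
};

// Split the agent list into one chunk per hardware thread and update the
// chunks concurrently; more cores means more agents within the same frame budget.
void update_ai(std::vector<Agent>& agents, float dt) {
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = (agents.size() + workers - 1) / workers;

    std::vector<std::future<void>> jobs;
    for (std::size_t begin = 0; begin < agents.size(); begin += chunk) {
        const std::size_t end = std::min(begin + chunk, agents.size());
        jobs.push_back(std::async(std::launch::async, [&agents, begin, end, dt] {
            for (std::size_t i = begin; i < end; ++i) agents[i].think(dt);
        }));
    }
    for (auto& job : jobs) job.wait();   // rejoin before the rest of the frame
}

int main() {
    std::vector<Agent> agents(10000);    // hypothetical crowd size
    update_ai(agents, 1.0f / 60.0f);
    return 0;
}
[/CODE]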
So the question is: how long will the next console cycle last?