Originally Posted by steelbom
There's no way consoles see that much benefit from having no overhead. The 6770 would at best rival a 6850.
You'd be really surprised. Not only is there no DX API overhead, there's also much less overhead from the OS, plus the possibility of eDRAM, and so on. Even if it only has the shader power of an HD 6670 or HD 6770, and even if developers had to code with as much overhead as on PCs (much more likely, IMO), it still wouldn't perform the same as the desktop part, because AMD would likely change the architecture a bit here and there and it would use different RAM (e.g. the PS3 has XDR RAM).
Originally Posted by RobotDevil666
Oh god, not another one with this MAGIC, tremendously super-optimized 6670 nonsense. Dude, it's a 6670, for Christ's sake; you can polish a turd but it's still a turd.
Back in 2005 when the X360 released, the X1900 wasn't a low-end GPU, and that's one of the main reasons why it's still ticking. If MS or Sony had used the 2005 equivalent of a 6670, the consoles would never have lasted this long.
And by the way, you're only the 234647548585689th person posting this kind of "sensational news" in this thread.
Barely a year after the Xbox 360 came out, the 8800 GTX destroyed the GPU the 360 has now; it was seriously over double the performance. And they did use the equivalent of the HD 6770 back then too... The 360's Xenos GPU is closer to an X1800 with some features from the HD 2900 XT than to an X1900, just as the PS3's GPU is closer to a 7900 GTX than a 7950 GT, for example. In terms of performance they were higher-end than a 6670 (which is why I think the 6670 claim is crap), but not the absolute highest end as everyone says. The HD 6670 isn't low-end either... it's mid-range.
Originally Posted by steelbom
You'll definitely see a console performing better with a 6670 than a PC would with a 6670, because there's no API overhead on the console, but the console's advantage is likely only 20-30%, and at most maybe 50%.
The games are only rendered at 1024x600, which is 614,400 pixels. If they maintain that resolution, the graphics improvement will be quite substantial; however, if they start supporting 1920x1080, which is nearly 2.1 million pixels, that claimed 6x performance improvement will drop down to about 1.8x, which isn't much at all.
Most of the reason consoles run at 640p (not 1024x600, but roughly half of 1080p) is down to the texture hardware: there isn't enough VRAM, and the TMUs themselves aren't that fast, so there's no real improvement in going to 1080p or the like. Going to a 6670 would only give about a 1.8x shader performance increase when you compare 1024x600 to 1080p, but the texture performance increase would be much more substantial, and that's the current bottleneck for consoles. All in all, not much more shader performance is even needed when you look at high-end PCs; we're nearing photorealism, and tessellation helps a lot here. Most of what needs to be done now is supporting more and larger textures.
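To make the 6x-to-1.8x figure explicit, here's a minimal sketch of the arithmetic, assuming (as a simplification) that per-frame GPU cost scales linearly with pixel count:

```python
# Rough sketch of the resolution-scaling arithmetic discussed above.
# Assumption: per-frame rendering cost scales linearly with pixel count.

def pixels(width: int, height: int) -> int:
    """Total pixels rendered per frame at a given resolution."""
    return width * height

current = pixels(1024, 600)    # 614,400 pixels (common sub-HD render target)
full_hd = pixels(1920, 1080)   # 2,073,600 pixels

scale = full_hd / current      # ~3.375x more pixels per frame at 1080p

claimed_speedup = 6.0          # the "6x" figure quoted above
effective = claimed_speedup / scale

print(f"Pixel-count ratio, 1080p vs 1024x600: {scale:.2f}x")
print(f"Effective speedup if rendering at 1080p: {effective:.2f}x")
# -> roughly 1.8x, matching the figure in the post
```

Under that per-pixel assumption, most of the claimed headroom is simply eaten by the jump in resolution.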
Originally Posted by TheBlindDeafMute
Yes, but the hardcore version will still be powered by the same GPU, as it wouldn't make financial or developer sense to have two versions capable of different performance.
I think the current hardware looks and plays fairly well. I'm somewhat new to the PC gaming arena (last two years), and only the true top-end titles really take advantage of a PC's power. While the textures are muddy, most current-gen titles look decent. It's only when I play The Witcher 2, Crysis 2, or BF3 that I realize what I'm not getting on console. I think any major step up in graphics on consoles can only benefit PC gamers. I agree with the earlier statement that console games tend to be coded more tightly and are therefore more reliable.
Why would it be powered by the same GPU? They could easily have one that's internally like an HD 6670 for the casual market (à la the Wii) and one with the same basic architecture, just with more of everything, for the actual 720. Or heck, they could even put two of the same chip on the motherboard for the actual 720.