Originally Posted by brucethemoose
True, but it seems unlikely. Turbo core is nice in a laptop/desktop/server where workloads can be "bursty", and consistent performance isn't a must.
Not in a console. Sony could hand control over to developers with set specifications: for example, 8 cores at 1.6GHz, 6 cores at 2GHz, 4 cores at 2.4GHz, or 2 cores at 2.8GHz.
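Roughly something like this, as a sketch; the clock figures are just the ones from my example above, and the profile-picking logic is hypothetical, not anything Sony has actually exposed:

Code:
    # Hypothetical fixed core/clock profiles a console vendor could expose.
    # The figures are the example ones above; none of this is a real PS4 API.
    PROFILES = {
        8: 1.6,  # 8 cores at 1.6GHz
        6: 2.0,  # 6 cores at 2.0GHz
        4: 2.4,  # 4 cores at 2.4GHz
        2: 2.8,  # 2 cores at 2.8GHz
    }

    def pick_profile(threads_needed):
        """Return (cores, clock_ghz): the narrowest profile that still fits."""
        for cores in sorted(PROFILES):
            if cores >= threads_needed:
                return cores, PROFILES[cores]
        return 8, PROFILES[8]  # more threads than cores: take the widest

    print(pick_profile(3))  # -> (4, 2.4): four faster cores for a 3-thread game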
Originally Posted by Porthios
Here's a suggestion: take away 4 cores from your idiotic CPU, and upgrade to a GTX 680.
That's a stupid idea. As it gets harder and harder to noticeably improve graphics due to the law of diminishing returns, developers are moving to AI and physics, among other things, which will place a heavy load on the CPU.
Originally Posted by xoleras
RAM is shared for everything in consoles. OS, VRAM, etc.
That depends on the console; it is on the 360 but not the PS3, as far as I know. That's why in Mass Effect 3 your gun isn't holstered inside non-combat interiors on the PS3 but it is in the 360 version: the PS3 doesn't have enough RAM free for the animations. To illustrate (numbers pulled out of my rear end here): say the game's textures use ~200MB of space, the animations etc. take up ~240MB, and the OS takes the rest. On the 360's unified pool, the ~56MB that would otherwise sit unused in the GPU's memory pool can absorb the extra stuff; on the PS3, with its split pools? Not so lucky.
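Putting those same hand-waved numbers into a sum (the OS figure here is picked purely so the arithmetic works, like everything else):

Code:
    # Back-of-the-envelope with the made-up numbers above (all MB).
    TEXTURES   = 200
    ANIMATIONS = 240
    OS         = 72   # assumed OS reservation, invented to make the sums work

    # 360: one unified 512MB pool, everything draws from it.
    free_360 = 512 - OS - TEXTURES - ANIMATIONS   # = 0, it just fits

    # PS3: split pools, 256MB system RAM + 256MB VRAM.
    vram_spare = 256 - TEXTURES                   # 56MB stranded in the VRAM pool
    free_ps3   = 256 - OS - ANIMATIONS            # = -56, the animations don't fit

    print(free_360, vram_spare, free_ps3)         # 0 56 -56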
Originally Posted by Coach Mcguirk
Didn't someone say the Xbox 720, or whatever, had an AMD 6670? How would it compete with a "GTX 570 equivalent" PS4?
The same way that the Wii outsold the PS3 and 360?
Originally Posted by Blameless
Originally Posted by hzac
I really hope 4GB is enough RAM. I don't want Sony to repeat their mistake... Surely they've learnt from last gen.
4GiB in late 2013 won't be much different relative to PC than the 512MiB total the PS3 had relative to PCs in 2005-2006, and a unified memory setup will allow more flexibility.
It should be enough for five years since it's unified; RAM requirements aren't ballooning as fast as they were in 2005-2006 during the whole 64-bit transition.
Originally Posted by Moustache
Originally Posted by mohit9206
The PS4 would most likely stick to either the A10 or A8 APU with some sort of discrete graphics running in CrossFire mode whenever required.
No way they can make a PS4 with an 8-core CPU and a 7970M and sell it for $400. Don't get your expectations too high.
This. Game consoles are made to be affordable; for anything beyond that, you're better off building a gaming PC instead.
Just like the 360 launching with an X1950-class GPU when that card was still new, right? The 360 sold for $400 and had a faster GPU than the PS3... Most of the PS3's cost was down to the Blu-ray drive (which was still very expensive at the time), the fact it also contained an entire PS2 onboard, and a few other things.
Originally Posted by Alatar
Originally Posted by Bit_reaper
I don't know why people keep saying this. Almost all consoles have been state of the art when they were released. Even the current gen's Xbox 360, despite power, size, and cost concerns, still sported a very advanced 3x PPU CPU and the world's very first GPU with a unified shader architecture, and the PS3's Cell processor was no less impressive when it was made.
This at a time when most PCs were still running the Pentium D and the Core 2 Duo was the fastest stuff money could buy.
People have got to remember how freaking old the current gen consoles are.
Today, top-of-the-line parts aren't feasible though, since their power consumption and cooling requirements are much higher than they used to be.
They weren't that much more efficient back then, CPUs especially... Efficiency has only increased dramatically since: Intel's i5s use less than 100W on their own, which is lower than what a P4 used. (Conroe/Core 2 Duo wasn't out until 2006.)
Originally Posted by almighty15
Originally Posted by Blameless
However, I do think you are exaggerating the rarity of Pitcairn dies that could qualify for the specs of the rumored PS4 GPU.
I'm not. Don't you think that if AMD could make enough of these GPUs to be used in millions of consoles, they would release them for PC too?
The 7970M has a 15% drop in clock speed for a 50% reduction in power consumption; you do the math.
They don't, because not as many people would buy them. Sony goes to AMD and says they want several million GPUs, AMD contracts TSMC to make them, and then sends them to Sony... They could likely be spitting out many more GPUs now that 28nm yields are up and the process is maturing, but what would be the point if Dell, HP, etc. don't want them? There aren't enough PC gamers who would buy one either.
Laptop parts also outsell desktop parts by a large margin, last I checked, so I'd imagine that HD7870s and HD7850s are simply Wimbledon chips that didn't make the cut for a laptop part, not ultra-high-binned Pitcairns as you say.
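For what the "you do the math" bit is worth, 15% less clock for 50% less power actually adds up without exotic binning if the mobile chip also drops voltage, since dynamic power scales roughly as P ∝ f·V². A quick sketch (the voltage ratio is derived from the claim, not a published spec):

Code:
    from math import sqrt

    f_ratio = 0.85                     # 15% lower clock
    p_ratio = 0.50                     # 50% lower power (the claim)
    v_ratio = sqrt(p_ratio / f_ratio)  # implied voltage ratio, since P ~ f * V^2

    print(round(v_ratio, 3))           # ~0.767, i.e. roughly a 23% voltage drop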
Originally Posted by ZealotKi11er
At the same time, the PC had the 8800GTX, which was a lot faster than the 7800GTX 256MB. Add SLI and more RAM and the PC was 3-4x faster when the console launched. This was an expensive unit too, not $299.99.
Compare it to the 360 then, which was out a year earlier than the 8800GTX and cost $399 at launch.
Also, go look at prices back then... You won't be building a PC with even one 8800GTX for under $600, considering the 8800GTX cost that much on its own.