I wrote a small article to take a hard look at what we got from this current gen and ask: is that really all that was possible? This originally was an odyssey covering both consoles, but there was just too much going on, so I decided to break it into separate articles. If you feel something is missing, broken, or disjointed, let me know.


The research into the BOM / manufacturing costs of the PS4 was done by the research firm IHS.
Quote:
The bill of materials (BOM) for the PlayStation 4 amounts to $372. When the manufacturing expense is added in, the cost increases to $381. This comes in $18 lower than the $399 retail price of the console.

IHS Source


If the PS3's cost history is any indication of the PS4's future, manufacturing costs will fall steadily over the console's lifetime, just as they did last generation.



Financially speaking, they could have done more with the console's hardware; there was plenty of wiggle room this generation. Let's say there wasn't, though. Either Sony could have eaten the costs (a loss-leader strategy), as they did with the previous three generations, a strategy that ultimately proved very profitable for Sony, or passed the costs on directly to us. Still a viable alternative! The point of this article is to illustrate that, for a few dollars more, we could have had a considerable upgrade in GPU performance. To do that, I will be focusing my attention on the performance engine of the PS4, the APU, and its specific associated costs. The information above is just to illustrate that Sony chose to make money (or lose as little as possible) from the get-go with this console. They chose to change strategy; they chose not to go high-end this round.


What would it really have cost us, though? Is it even remotely measurable? Let's try to figure it out.

[Image: ps4-reverse-engineered-apu.jpg — die shot of the reverse-engineered PS4 APU]


Source: ExtremeTech's reverse engineering of the PS4 APU


Now, it does appear to be a full HD 7870-class GPU (the full complement of 20 CUs) integrated alongside the two Jaguar quad-core modules.


This device costs Sony $100 per APU in the report filed by IHS. What's a little sad is that 2 of those CUs are present but disabled! More than likely this is due to yield issues, according to ExtremeTech, meaning the process could have been refined; they could have waited and given us roughly 10-11% extra computational performance (two more CUs on top of the 18 enabled) at no additional cost to Sony or the consumer. AMD has done this in the past, selling chips from runs where yields weren't great as lower-tier models. It isn't an unusual practice, but it's disappointing nonetheless to knowledgeable consumers.


The next set of figures is based on a what-if: that 100% of the APU's cost is the GPU portion alone. We know that isn't the case, but let's assume the other silicon is free and only the GPU costs money.


HD 7860 ~equivalent, 20 CUs (18 active) @ $100.







*The Radeon HD 7860 is not a real GPU, just a calculated performance point between an actual 7870 and 7850.


So if the GPU accounted for 100% of the APU's cost, that works out to roughly $5 per CU, and stepping up to approximately 7970-level performance (32 CUs) would have cost Sony another $60. Pass that straight through to the retail price and Sony still nets +$18 per box.
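
Just to show the math, here's a quick back-of-the-napkin sketch of that cost scaling. The CU counts are the real ones for the 7850/7870/7950/7970; the flat $5-per-CU figure (the $100 APU cost spread over the 20 CUs on the die) and the idea that cost scales linearly with CU count are my simplifying assumptions, not anything out of the IHS report.

Code:

# Back-of-the-napkin cost scaling: assume the whole $100 APU cost is
# GPU silicon and that cost scales linearly with CU count.
APU_COST = 100                       # IHS estimate for the whole APU ($)
DIE_CUS = 20                         # physical CUs on the PS4 die (18 enabled)
COST_PER_CU = APU_COST / DIE_CUS     # = $5 per CU (simplifying assumption)

gpus = {
    "HD 7850": 16,
    "HD 7870 (PS4 die)": 20,
    "HD 7950": 28,
    "HD 7970": 32,
}

for name, cus in gpus.items():
    cost = cus * COST_PER_CU
    print(f"{name:<20} {cus:>2} CUs  ~${cost:.0f}  ({cost - APU_COST:+.0f} vs the PS4 APU)")

# A 7970-class part (32 CUs) works out to ~$160, i.e. about $60 more
# than the $100 Sony actually paid.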


Technically speaking, the chip designs are modular and the GCN architecture is scalable, so this isn't really that far-fetched a way to analyze it. The price per CU includes all R&D, labor, fabrication, logistics, etc. Now let's take a look at power requirements: could the PS4 have supported a 7970 equivalent? More than likely no, but let's see what is going on.

At full load the PS4 draws 150W at the wall* (according to PCPer). The two-prong (no ground pin) PSU is rated for 250W and looks highly efficient.


Due to the scalable nature of the APU / GPU and GCN in general, let's analyze the PS4's wattage against CU count, just like we did with dollars. Power requirements are not linear, that is very true, but let's just see what we get. What I would like to leave you with: even in its current configuration, the PS4 is/was capable of more performance.




According to these figures, the power requirements for a 7970 equivalent are way too high for my liking. Keep in mind this assumes the PS4's energy usage is ONLY the GPU and that the PSU is feeding the full 150W to the components. Again, the watts-per-CU figure is partially defensible because it is based on real-world measurements, so Sony's and AMD's optimizations are already baked in.
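
Just to show the straight-line math behind that conclusion, here's the same kind of napkin sketch for power. This is deliberately the worst case: the full 150W wall figure is treated as GPU-only and watts are assumed to scale linearly with active CUs, before any of the efficiency adjustments discussed below.

Code:

# Naive watts-per-CU scaling: assume the full 150W wall draw is GPU-only
# and that power scales linearly with the number of active CUs.
WALL_WATTS = 150                         # PS4 full-load draw at the wall (PCPer)
ACTIVE_CUS = 18                          # CUs enabled on the PS4
PSU_RATING = 250                         # rating of the PS4's internal PSU
WATTS_PER_CU = WALL_WATTS / ACTIVE_CUS   # ~8.3 W per CU (simplifying assumption)

for name, cus in [("HD 7950-class", 28), ("HD 7970-class", 32)]:
    watts = cus * WATTS_PER_CU
    verdict = "within" if watts <= PSU_RATING else "OVER"
    print(f"{name}: ~{watts:.0f}W at the wall ({verdict} the {PSU_RATING}W PSU rating)")

# A 7970-class part lands around 267W on this straight-line estimate,
# past the 250W PSU rating -- which is why these figures look too high.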





Here, the 7970 equivalent is too high for me, but I would definitely run a 7950 equivalent with a peak usage of 189W at the wall, given, again, that the larger APU would receive the same efficiency work and optimizations that went into the 20 CU design.




At just 80% of that power draw, I could definitely see a Radeon HD 7970 equivalent inside the APU. Both TSMC and AMD were already familiar with the larger die sizes this would require and with the 28nm SoC design! The experience was already there! Only the design and its associated costs (heat, power, dollars) had to be paid for.
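
To put a rough number on that, take the worst-case straight-line estimate for a 32 CU part from earlier and knock 20% off it:

Code:

# Apply the ~80% power-draw figure to the straight-line estimate
# for a 32 CU (7970-class) GPU.
WATTS_PER_CU = 150 / 18            # naive figure from the 150W wall draw
naive_7970 = 32 * WATTS_PER_CU     # ~267W, over the 250W PSU rating
adjusted = naive_7970 * 0.80       # ~213W at 80% of that draw

print(f"7970-class at 80% draw: ~{adjusted:.0f}W against a 250W PSU rating")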


*Wall energy usage is always higher than what the PSU is actually delivering to the system (no PSU is 100% efficient). That means the system is using less than 150W internally, but we will use 150W anyway.
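
For example, at a hypothetical 85% PSU efficiency (my guess, not a measured figure), 150W at the wall is only about 127W actually delivered to the components:

Code:

# Hypothetical example: what 150W at the wall means inside the box at a
# few assumed PSU efficiencies (none of these are measured values).
WALL_WATTS = 150
for efficiency in (0.80, 0.85, 0.90):
    internal = WALL_WATTS * efficiency
    print(f"{efficiency:.0%} efficient PSU -> ~{internal:.0f}W delivered internally")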


**Again, the Radeon HD 7860 is not a real GPU, just a calculated performance point between an actual 7870 and 7850. Had they actually designed a custom GPU with only 18 CUs physically, they could have shaved a bit off the power requirements. More than likely, though, Sony simply lowered the voltage fed to the APU.


One final caveat: this is all hypothetical, of course. The extra heat may or may not have required a change to the APU's cooling solution, but AMD had an all-around winner with the HD 7970 in terms of power and thermal design. When clocked and volted appropriately, the 7970 was about as cool as they came. I have newfound respect for what they were able to do, energy- and temperature-wise, with the PS4 APU, but they still could have done more.

The question is: as a consumer, would you have paid $450-499 for the fastest console on the face of the Earth?


I would have.