Originally Posted by kennyparker1337
The idea of developers being able to code directly to hardware instead of having to code through high level language that has to go through multiple levels of drivers is not relevant to just Microsoft but the entire idea of a console.
Sure, the gains over time aren't going to compare to upgrading a PC, but they're absolutely FREE.
If you had the decency to look at first-year games vs last-year games on last-gen consoles, you would realize that games clearly can be vastly improved over time by getting your code to near-100% efficiency at the hardware level.
Call of Duty 2 (launch title) vs Battlefield 3 (finishing title)
You missed the point. The last-generation consoles had some very different hardware (the 360's GPU had the very first unified shaders, the PS3 used that weird Cell architecture), and the industry as a whole was making the transition to programmable unified shaders and multi-core CPUs at the same time. It wasn't just consoles that improved on the same hardware, as people rocking 8800s could attest. There was nothing console-specific about the massive improvement in software development tools we saw, and the hardware optimizations stemmed from those esoteric architectures and the work of adapting to them. Neither of these things applies this time around.
Originally Posted by ZealotKi11er
The CPU alone costs $300. Not going to happen. 970M? That's 20-30% faster than the HD 7870.
First off, the 7850 is a better representation of the PS4's GPU due to the reduced shader count and low clock speed. Second, the 970M is much faster than that. Third, a downclocked/low-binned i5 bought in bulk should certainly not cost $300. And that's not even talking about i7s or the markup mobile OEMs tend to charge.
Originally Posted by maarten12100
Yeah, that seems almost correct. At 2-2.4GHz my 6410 punches out about a 2 in Cinebench; with double the cores, a 3.5 is easily doable. And 2 threads going to the OS doesn't mean they won't be used; they're obviously handling tasks too, and background tasks on a PC slow it down as well.
Sandy Bridge i5 performance at 2.5GHz seems plenty for the GPU it's paired with, considering I use an R9 290 with my i5 and I am not bottlenecked.
Ubisoft devs just can't program, or they're cutting costs.
Yes, I agree; as you can see above, linear scaling won't happen, but one can come really close on as few as 8 cores.
Jaguar cores punch out pretty good scores even in the 1.6GHz range, so we have a CPU stronger than pretty much all standard-clocked dual-core chips on the market in the 3GHz range, which do fine in games. Thus, Ubisoft stinks.
I think you completely skipped the part about how CPUs function. They're sequential processors by nature, not parallel ones. There are many, many tasks that cannot be split across cores at all, let alone in a perfectly equal manner that maximizes every core without the whole CPU being held back by one particular core struggling with a thread or workload that itself cannot be split. Weaker single-threaded performance actually makes the situation worse, because the CPU is that much more likely to be held back by a single core choking on a task too great for it. This is not a trivial detail; I don't think a lot of people have the faintest idea how CPUs function. In games, an 8-core (effectively 6, in this case) Jaguar will never even match a dual-core Haswell, even if it "theoretically" should be able to with all cores added up, because that is not how CPUs behave in the vast majority of software. This is an inherent limitation; ask any programmer, or google "Amdahl's law". Video editing and similar benchmarks like Cinebench are among a select few exceptions.
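To put a number on it, here's a minimal Python sketch of Amdahl's law. The 80% parallel fraction is just an illustrative assumption (real game logic is often far less parallelizable), not a measured figure:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: the maximum speedup from `cores` cores when only
    `parallel_fraction` of the workload can actually run in parallel.
    The serial remainder (1 - parallel_fraction) never speeds up."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Even if 80% of a game's per-frame work could be parallelized
# (a generous assumption), six cores top out at:
print(amdahl_speedup(0.8, 6))  # 3.0 -- nowhere near the "ideal" 6x

# And no core count ever beats the serial-portion ceiling:
print(amdahl_speedup(0.8, 1_000_000))  # ~5.0 at most, ever
```

So summing up core counts says nothing about game performance: the serial portion of the work runs on one weak Jaguar core, and that is the bottleneck.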