Originally Posted by Cavi Mike
I said this before and I'll say it again:
Consoles don't use new tech, they use proven tech.
You will NEVER see a console with new tech. NEVER. As a company, if that new tech fails, do you really want to deal with the repercussions? Of course not.
Besides, we all know that you don't need new tech to perform well. You just need specially engineered tech along with specially engineered games. Just like how the new slim PS3 can't play PS2 games: it doesn't have the processor power to emulate the Emotion Engine, but does that mean it has less power than a PS2? Obviously not.
This. There was one console that used new tech... the PS3.
Remember how much it cost at launch? Sony and Microsoft do not want to see that happen to themselves (again, in Sony's case).
Originally Posted by A Bad Day
Why compete against your competitor's strength (Intel's x86 CPU performance) when you can flank it by pushing for widespread adoption of GPU computing?
Anyone remember the era when the CPU did the graphics workload and the GPU simply converted one type of signal to another?
Actually, clock for clock AMD's CPUs are very close to Intel's when all 8 cores are used. If anything, it'd be in both companies' interest to get GPGPU going easily on their iGPUs. Imagine buying an AMD or Intel APU and using the iGPU (and quite a bit of the CPU) for physics calculations, so your main GPU keeps all of its horsepower for rendering; that'd be amazing.
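To give a rough idea of what that split could look like from the software side, here's a minimal sketch in C using the standard OpenCL host API. The integrated-vs-discrete heuristic and the physics/rendering assignment are my own assumptions, not anything AMD or Intel actually ships:

#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;
    clGetPlatformIDs(8, platforms, &num_platforms);

    for (cl_uint p = 0; p < num_platforms; p++) {
        cl_device_id devices[8];
        cl_uint num_devices = 0;
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                           8, devices, &num_devices) != CL_SUCCESS)
            continue;

        for (cl_uint d = 0; d < num_devices; d++) {
            char name[256];
            cl_bool unified = CL_FALSE;
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                            sizeof(name), name, NULL);
            /* iGPUs report memory unified with the host; dGPUs don't */
            clGetDeviceInfo(devices[d], CL_DEVICE_HOST_UNIFIED_MEMORY,
                            sizeof(unified), &unified, NULL);
            printf("%s -> %s\n", name,
                   unified ? "integrated: run physics kernels here"
                           : "discrete: leave it free for rendering");
        }
    }
    return 0;
}

Build with something like gcc pick_gpu.c -lOpenCL; the physics kernels themselves would then be enqueued on the integrated device's command queue while the discrete card renders.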
Originally Posted by Hokies83
LoL yeah, but I mean a non-Xeon choice, something sub-$800.
If Intel launches an 8-core IB-E or Haswell-E, it'll most certainly be an EE... so Skylake-E at the earliest is when we'll get an 8-core in the 3930K's price bracket, I reckon.
That is, of course, unless games really start using CPU power; Intel has 8-core dies and would likely provide a cheaper model if the extra demand were there... Then again, you never know.
Originally Posted by Hukkel
What are you babbling about? This is not proven tech. It is as new as it gets. It is designed specifically for this application. The right term is high-end. A console cannot use the PC's high-end tech because it is way too expensive, produces way too much heat and uses too much energy.
Nvidia's comparison was utterly stupid: comparing a 500-euro gfx card with a 500-euro console. It was the dumbest thing they could do. They should have just shut up and moved on.
If what they claim is true, I personally think an HD7850 in a console is pretty awesome. It is an enormous step up from the current gen, and seeing how some current titles look, I think most gamers will be very happy.
Actually, it is proven tech apart from the process node... GCN has been around for a year, and while Jaguar is new, it's an upgrade of a previous generation, not unproven tech. The same goes for APUs in general, considering we're on our 3rd generation of Intel and AMD ones and about to see the 4th.
Originally Posted by S.M.
The CPU is Jaguar and the GPU is GCN2.0, both proven tech.
Jaguar is currently powering the supercomputer behind the light deck at Stony Brook University. It's not the Bulldozer architecture.
It's an upgraded Bobcat, I believe.
Originally Posted by Artikbot
There we go again.
PD is only not a true eight-core if you make use of 256-bit AVX; then the two FMA units in each module fuse into a single 256-bit-precision one.
To all other intents and purposes, it is an eight-core with four shared decoders.
Back on topic. It was obvious that NVIDIA was mad because they didn't secure even one deal.
Even then, by the logic that you have to have an FPU to be a real core, we didn't have an x86 CPU core until the 486 came out... before then, you had to get an external FPU co-processor if you wanted to do floating-point calculations quickly.
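For what it's worth, whether a game ever triggers that 256-bit FMA fusing depends on it detecting and taking the AVX path at runtime. A minimal sketch of the standard CPUID feature check using GCC/Clang's cpuid.h (simplified: a complete check would also confirm OS support for saving YMM state via OSXSAVE/XGETBV):

#include <stdio.h>
#include <cpuid.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
        return 1;
    /* CPUID leaf 1, ECX: bit 12 = FMA3, bit 28 = AVX */
    int has_fma = (ecx >> 12) & 1;
    int has_avx = (ecx >> 28) & 1;
    printf("AVX: %s, FMA3: %s\n",
           has_avx ? "yes" : "no",
           has_fma ? "yes" : "no");
    return 0;
}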
Originally Posted by Darkpriest667
Um ok... so you know something about the new OS we don't? The buzz is that the 720 is running a Windows 8 variant. Plenty of overhead. No idea what Sony is running, but they've never been real BRILLIANT at optimizing software.
Firstly, while it would be a Windows 8 variant, that most likely just means it'll have the kernel with nearly all of the crap stripped out... You know how a lot of gamers want a version of Windows that comes with only the bare essentials? That's the Xbox OS. As for Sony, they used an OpenBSD-derived OS for the PS3, and I'm guessing they'll continue along that route.
Most of the optimization is also done by the developers, not Sony and MS... They reduce the overhead of their OS (mainly RAM usage, as the OS uses bugger all CPU power while in the background), but the rest is mostly up to the developers. What does matter from Sony, Microsoft, etc. is their documentation of the hardware: if they offer good documentation, developers will be able to make good use of the hardware quickly; otherwise it'll take time.
Originally Posted by black7hought
This has the potential to be great news for all of us. If AMD, Sony and Microsoft can get console software developers to utilize all threads, and that transfers over to PC gaming, then it's good news whether you're an AMD or Intel CPU owner. It may also give AMD the financial boost they need to implement their HSA plan, which may also bring good changes to our computer-building future.
becuz da shiny graf sais amd gets less fps dan intel.
Originally Posted by thegreatsquare
The PS4 "OS" is based on Windows 7.
Proof? I really doubt MS would license it to Sony, let alone give them source code access. (Or that Sony would accept Win7 without source code access, knowing MS could screw them over somehow with that.)
Originally Posted by RagingCain
AMD gave Sony a price lower than nVidia, making money on the backend (i.e. percentage of sales) != Hardware nVidia Couldn't Have Made.
I hate these idiots at AMD. Shut up and work on your drivers, Skyrim is broken still.
Skyrim works fine on my HD7950 and HD7850 in separate rigs. Just because you have issues doesn't mean everyone does.
Originally Posted by AznDud333
remember that clock for clock AMD chips are still slower than Intel chips, and it doesn't help that the PS4's chip is clocked at 1.6GHz... less than half of the average mid-end Intel chips today... and compared to a 3930K it's hella slow
Keep in mind that the typical CPU usage of a game, even on Windows, is pretty damn low; people seriously over-estimate the role of the CPU in gaming and how much of a difference it actually makes.
Plus, if the PS4 stays based off of BSD like the PS3 was, it'll get more out of any given CPU than that same CPU would under Windows... Why do you think folders use Linux?
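You can check that claim for yourself; here's a minimal sketch (POSIX C, so Linux/BSD) that compares a process's CPU time against wall-clock time. The busy-loop is just a stand-in for whatever workload you want to profile:

#include <stdio.h>
#include <sys/resource.h>
#include <sys/time.h>

static double tv_to_sec(struct timeval tv) {
    return tv.tv_sec + tv.tv_usec / 1e6;
}

int main(void) {
    struct timeval wall_start, wall_end;
    struct rusage usage;

    gettimeofday(&wall_start, NULL);

    /* stand-in workload; replace with the code you want to profile */
    volatile double x = 0;
    for (long i = 0; i < 100000000L; i++)
        x += i * 0.5;

    gettimeofday(&wall_end, NULL);
    getrusage(RUSAGE_SELF, &usage);

    double wall = tv_to_sec(wall_end) - tv_to_sec(wall_start);
    double cpu  = tv_to_sec(usage.ru_utime) + tv_to_sec(usage.ru_stime);

    /* note: 100% of one core is only 12.5% of an 8-thread chip */
    printf("wall: %.2fs, cpu: %.2fs (%.0f%% of one core)\n",
           wall, cpu, 100.0 * cpu / wall);
    return 0;
}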
Originally Posted by A Bad Day
Intel promised a 10% performance increase in their GPU with a driver update.
Tell me that 10% does not matter.
EDIT: A while ago, Tom's Hardware did a GPU driver experiment on a 4850. They tested a variety of drivers, ranging from the first stable release for the 4850 to the latest one available when they started the benchmarks.
They concluded that by staying up to date with the graphics drivers, the updated 4850 matched an un-updated 4870 in performance.
Whether 10% matters depends on the situation: when you're sitting at a 30fps minimum, then 10% (or even 1%) matters a lot, as any gain is useful; but if you're sitting at, say, a 60fps or 120fps minimum, then 10% doesn't matter any longer... As for CPUs, I still doubt that most people would notice if their Core i*s suddenly changed to Phenom IIs or FXs mid-game without staring at Fraps or something.
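The frame-time arithmetic makes that obvious; a quick sketch (the fps figures are just illustrative):

#include <stdio.h>

int main(void) {
    const double fps[] = {30.0, 60.0, 120.0};
    const double gain = 0.10; /* the hypothetical 10% driver uplift */

    for (int i = 0; i < 3; i++) {
        double before_ms = 1000.0 / fps[i];
        double after_ms  = 1000.0 / (fps[i] * (1.0 + gain));
        printf("%6.0f fps: %5.2f ms -> %5.2f ms per frame (saves %.2f ms)\n",
               fps[i], before_ms, after_ms, before_ms - after_ms);
    }
    return 0;
}

At 30fps that 10% buys you about 3ms per frame; at 120fps the same 10% is under 0.8ms, which is why the gain stops being noticeable.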
Originally Posted by mcg75
Nvidia is doing just fine with Tegra. Project Shield is probably the dumbest idea from Nvidia ever, though.
The PS4 will be a low-end PC, as will Durango. Consoles don't need to be top of the line because it's a hell of a lot easier to optimize games when you only have one hardware set to code for. Essentially they will be low-end PCs with graphics that will challenge high-end PCs for the first couple of years.
Mid-range, not low-end.
Although, to be fair, it'll go from the middle of the mid-range to the lower end of the mid-range by the time the PS4 launches.
Originally Posted by AznDud333
no one said 10% didn't matter, but 10% isn't gonna make up the difference between a 570 and a 680... the difference is almost 200%...
What's the difference between an 8800GTS and an ATi X1950XTX? Here's a hint: Kepler was nothing compared to the massive gain we got overnight from the 8800 series, which was literally double the previous fastest GPUs and beat CFX/SLI setups left and right with one card... And the Xbox 360's GPU (240 GFLOPs) still sits slightly under the X1950XTX (375 GFLOPs) in terms of pure hardware, yet thanks to optimization it pulls numbers similar to a modern 8800GTS. I find it funny that people are still pulling the "optimization can only do a little!" line, considering it's been disproven time and time again, even by a mere glance at the console hardware specs versus the graphics they're putting out.
Originally Posted by vampirr
Yeah, and the 7800GT did the job so poorly that game developers owned by Sony were forced to use the Cell processor to run graphics better than the 7800GT could. The Cell is a distant relative of the APU in some form, since it ran both CPU and GPU loads in Uncharted 2 and 3...
That was because Sony was trying to push the Cell as much as possible and wrote the OS in a way that discouraged use of the 7800GTX.
What about Xenos, then? It's a 240 GFLOPs GPU putting out the kind of graphics I'd expect from a much faster GPU in the 400 GFLOPs (HD2900XT/8800GTS) range.
Originally Posted by The Robot
Source, please? I highly doubt that; MS definitely won't allow it, and Sony has more expertise with Unix, since the PS Vita runs some kind of it. But yeah, it will be more like a full-fledged OS than a simplified game-runner, hence all the social stuff, etc.
The PS3's OS is based off of OpenBSD; I believe the Vita's is too, and I'd wager the PSP's was as well, considering the similarities those three share that the PS1/PS2 don't.