Originally Posted by Abundant Cores
Anyone get the feeling that some people, for whatever stupid reason, are trying to convince a wider audience that there will never be a Steamroller FX chip?
Kind of like how there were tons of people screaming "28nm TSMC WILL NEVER BE READY UNTIL LATE 2012!!!" and then 7970 released in January 2012?
People are always spreading FUD about AMD. I have no idea if people genuinely feel that way or if it's folks working for Intel PIE or something, but before AMD releases something good there's nearly always a strong FUD campaign to make AMD look horrible and to make people question what AMD will be doing in the future.
Now we hear "22nm IS SO FAR AWAY!!" when IBM is making 22nm SOI chips right now.
If AMD can fit 12 cores on a 330mm² die, there's no reason why they shouldn't release one. The margins would be a lot better, and they could sell semi-disabled chips as 8-core and 10-core parts. It would also mean AMD could compete on the high end. The only thing really separating the 4670K and the 3970X is multi-threaded performance, and Intel is leaving a huge price gap between the $340 4770K and the $570 3930K where AMD could completely run amok and own the market.
AMD needs to shake the budget-and-value-only image they've built. People who got into CPUs after Bulldozer only know AMD for making good chips in the lower price segments. You can pick out some of these Intel fanboys because they make ridiculous claims like "AMD has never ever competed with Intel in performance," when clearly they did, years ago.
Not everything is going to be able to use HSA right away, and if AMD has no high-end traditional parts, then people stuck with a mix of HSA-enabled apps and conventional ones will have to choose between doing some things very fast and some things very slow, or settling for a middle ground.
Not to mention, I think a lot of people are neglecting how much faster an x86 core gets when instructions are targeted at a specific platform where every CPU is identical, as opposed to targeting several different CPUs with several different instruction sets on the PC.
Let me put some of this into perspective for you from my Gentoo benchmarking on FX 8350.
I saw a little over a 60% speedup encoding the same wav file to mp3 with the same settings using a natively optimized LAME build in Gentoo versus the stock Windows binary from the website.
That puts a stock FX 8350 ahead of a 3770K in LAME, a benchmark that's been a thorn in the side of FX 8350 owners forever.
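For context, the Gentoo speedup comes from building LAME from source with CPU-specific compiler flags instead of running a generic prebuilt binary. A minimal sketch of how that looks on an FX 8350 (Piledriver core); the flags and the `-V2` encode settings here are illustrative assumptions, not my exact test setup:

```shell
# /etc/portage/make.conf -- tune all builds for the Piledriver core (FX 8350)
CFLAGS="-O2 -march=bdver2 -pipe"
CXXFLAGS="${CFLAGS}"

# Rebuild LAME against those flags
emerge --ask media-sound/lame

# Compare wall-clock time encoding the same wav with the same settings
time lame -V2 input.wav output.mp3
```

With `-march=bdver2`, GCC is free to use AVX, FMA, and the rest of Piledriver's instruction set, which a generic distributable binary has to avoid.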
Per my testing, 8 Jaguar cores at 2.0GHz using AVX would land roughly at the same level of performance as an FX 8350 at 4GHz, assuming Jaguar is getting optimized code and the FX is getting generic Windows code.
It all really depends on how much the console OS needs and whether it can suspend itself. But regardless, I could easily see Jaguar running optimized code with all the instructions giving an FX 8350 a difficult time, provided all cores are loaded and the FX is running generic code relying on x87 or SSE1.
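The back-of-envelope behind that comparison can be made explicit. Treat throughput as cores × clock × a per-clock code-quality factor; this is a crude model of my own, and every number in it is either a claim from this post or a labeled assumption, not a measurement:

```python
# Crude throughput model: cores * clock(GHz) * per-clock factor.
# Baseline assumption: FX 8350 on generic Windows code has factor 1.0.
fx_generic = 8 * 4.0 * 1.0            # 32.0 arbitrary units

# Claim: 8 Jaguar cores at 2.0GHz with optimized AVX code match that.
# Solve for the per-clock factor Jaguar would need:
jaguar_factor = fx_generic / (8 * 2.0)
print(jaguar_factor)                   # 2.0x Piledriver's generic per-clock rate

# Optimized code alone bought ~1.6x on the FX in my LAME test, so Jaguar
# would still need ~1.25x the FX's *optimized* per-clock rate to pull even.
print(round(jaguar_factor / 1.6, 2))   # 1.25
```

The point of the arithmetic is that half the gap comes from clock and the other half from code quality, which is why "optimized console code vs. generic PC code" matters so much in this comparison.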
I do think you guys are missing that 10 cores might actually be the norm if you want to play next gen console ports and stream or multi-task or whatever it is FX 8000 owners like to do on their gaming rigs.
People have gotten awfully complacent with console ports designed to run on, at best, three general-purpose cores and an X1900. As I was typing this I had to look back and realize that I bought this 7970 in 2012, and I still play every game I want on full settings at 1440p. I think a lot of people are used to this by now. I remember in the early 00s going through CPU upgrades every year, at least, new motherboards every other year or so, and constant graphics card upgrades on top of that.
The new consoles are going to change things, and it's going to put a lot of hurt on existing rigs. I have a really good feeling that people, given their complacency with not having to upgrade, are going to blame the games for being "bloated" and "slow" instead of realizing that the demands of games have simply gone up.
I have no idea when this attitude showed up among enthusiasts, but it's alive and well, and I don't like it. I first noticed it with Vista: we got a nice 3D-composited desktop and all sorts of caching, and everyone whined because it used more resources than Windows XP, which had none of that.
But it's the central problem with the desktop community and why sales are down. People would much rather blame the software engineer for making more demanding software than upgrade. So now software engineers target lower-end hardware and aim for efficiency instead of effects and visual polish (why do you think Metro looks so bland? Because it's not demanding, and no one will complain that 3D shadows on tiles are lagging their ancient GPU).
A lot of desktop users have dug their own grave and won't admit it; they just want to keep pointing fingers at casual users, or Intel, or AMD for not making anything better. People still don't need to upgrade from Nehalem or Phenom, and that's because software stopped growing more demanding.
I do think that's going to change with the next-gen consoles, and 10 cores will actually be reasonable, but everyone will go kicking and screaming, blaming software devs for bloat and whatever else.
And just for the record, ever since the 00s I've always advocated for more hardware than people thought they needed, and the need always came.
I wanted a dual CPU rig in the early 00s, and people laughed at me. Now you get laughed at if you have a single core.
I had 1.5GB of RAM when everyone had 512MB and people laughed; now you get laughed at for 1.5GB.
I had a 4GHz Pentium 4 and everyone said "lol why do you need that much power?" and now we have ULV chips that would humiliate 4GHz NetBurst.
I had an Opteron 165 at nearly 3GHz and everyone laughed and said it was too much.
Then I had a 4GHz i7 920 and everyone said it was too much.
I simply don't understand why people have always been so opposed to pushing hardware that far. Every time, people say "lol ur dum u dont need dat!" and then three years later everyone is using exactly that.
We're already at a point where 8 cores can be practical for streaming + gaming, and that's running console ports designed to run on ancient hardware.