Originally Posted by Cherryblue
This is probably quite an interesting subject.
While you're right about the time needed to come to market, I'm not sure we get to see every chip ever designed by Intel.
Remember Larrabee, Intel's x86 GPU that got canceled? Well, it turns out that it was in a nearly-finished state:
So let's talk about the elephant in the room - graphics. Yes, at that we did fail. And we failed mainly for reasons of time and politics. And even then we didn't fail by nearly as much as people think. Because we were never allowed to ship it, people just saw a giant crater, but in fact Larrabee did run graphics, and it ran it surprisingly well. Larrabee emulated a fully DirectX11 and OpenGL4.x compliant graphics card - by which I mean it was a PCIe card, you plugged it into your machine, you plugged the monitor into the back, you installed the standard Windows driver, and... it was a graphics card. There were no other graphics cards in the system. It had the full DX11 feature set, and there were over 300 titles running perfectly - you download the game from Steam and they Just Work - they totally think it's a graphics card! But it's still actually running FreeBSD on that card, and under FreeBSD it's just running an x86 program called DirectXGfx (248 threads of it). And it shares a file system with the host and you can telnet into it and give it other work to do and steal cores from your own graphics system - it was mind-bending! And because it was software, it could evolve - Larrabee was the first fully DirectX11-compatible card Intel had, because unlike Gen we didn't have to make a new chip when Microsoft released a new spec. It was also the fastest graphics card Intel had - possibly still is. Of course that's a totally unfair comparison because Gen (the integrated Intel gfx processor) has far less power and area budget. But that should still tell you that Larrabee ran graphics at perfectly respectable speeds. I got very good at Dirt3 on Larrabee.
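The quoted point - that a "graphics card" can be nothing but an ordinary program running on x86 cores - is easy to illustrate with a toy sketch. This is my own illustration, nothing to do with the actual DirectXGfx code: a minimal edge-function rasterizer, the same basic coverage math that a SIMD software renderer like Larrabee's would evaluate many pixels at a time across many cores.

```python
# Toy illustration: rasterization as plain software. An edge function gives
# the signed area (times 2) of triangle (a, b, p); a pixel center is inside
# a counter-clockwise triangle when all three edge functions are >= 0.

def edge(ax, ay, bx, by, px, py):
    # > 0 if p is to the left of the directed edge a -> b, 0 if on it.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize(tri, width, height):
    # Return the set of (x, y) pixels covered by counter-clockwise `tri`.
    (ax, ay), (bx, by), (cx, cy) = tri
    covered = set()
    for y in range(height):
        for x in range(width):
            px, py = x + 0.5, y + 0.5   # sample at pixel centers
            if (edge(ax, ay, bx, by, px, py) >= 0 and
                edge(bx, by, cx, cy, px, py) >= 0 and
                edge(cx, cy, ax, ay, px, py) >= 0):
                covered.add((x, y))
    return covered

pixels = rasterize(((0, 0), (8, 0), (0, 8)), 8, 8)
```

A real software pipeline would vectorize the three edge tests over a block of pixels and fan blocks out across threads, which is exactly the kind of work a 248-thread x86 program can soak up.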
Anyway, back to normal CPUs:
With the money they have, you could imagine them completing all the design steps for both 4-core and higher-core-count designs, but holding the bigger parts back from the production and testing phases depending on the state of the market each generation (I don't know exactly where they'd have to stop for it to be worthwhile; not much knowledge on this front).
- They do ship Xeons with more than 4 cores every now and then (meaning they don't have "nothing", even if the designs differ)
- Thuban, the Phenom II X6, was the first real consumer 6-core processor, and it obviously must have caught their attention at the time
- FX, even if hugely disappointing, must have raised Intel's interest, at least before it was released.
- If we're raising eyebrows at 4 cores sticking around for what feels like a century in IT terms, Intel must have too.
Well anyway, at some point they probably decided to anticipate it in an Intel-cost-effective way.
...which DOES come back to Intel having the ability to release more than 4 cores on the consumer market, but choosing not to; the contrary would be like a market-leading company doing no R&D at all, which is clearly wrong. They do the R&D, but as their current offering still brings in tons of money, they simply don't bring what they find to the masses; they keep it ready, on a short enough timeline to counter any challenger... or so they hope.
....which ends my terrible monologue
- Xeons with more than four cores prior to Coffee Lake weren't on any flavor of LGA-115x, though. The Xeon E-2100 line only introduced those recently, back in July. Anything with more than four cores would have been on LGA-1366 or LGA-2011.
- AMD's Thuban was launched a month after Intel's Gulftown, so while Intel took note of it they didn't need to prepare a competitor. Gulftown was the clear winner between the two if you were willing to pay for it, but Thuban was very competitive in most metrics. Gulftown's C0 stepping widened Intel's lead if I remember correctly.
- Alternatively, Intel didn't fall for the CMT hype and realized an FX 8-core would behave like their Hyper-Threaded quad-cores. One of their top engineers would troll about it, and AMD employees would say things they probably shouldn't, which is rather funny.
- Eh, it's four cores for mainstream. After LGA-1156 launched, Intel split their lineup into a mobile-first mainstream platform and a server-first HEDT platform. Workstations have had access to 6+ cores throughout this decade.
Actually now that I think about it, I wonder if the first 6-core mainstream CPU was supposed to launch with Cannonlake. Remember, 10nm Cannonlake was scheduled for 2016 immediately following Skylake. I hope we can all agree that Intel's problems with 10nm have thrown a wrench in things. It's pretty much confirmed that the reason we're on Skylake v4 with the 9000 series is because Intel was saving the fun stuff for 10nm, but since that was delayed FOR THREE ENTIRE YEARS we won't be seeing it until this year with Sunny Cove.

That would still be ten years on quad-cores (Core2Quad 2006 to Skylake 2015) if we ignore the divergent HEDT platform, but AMD similarly had quad-cores for eleven (Phenom X4 2007 to Bristol Ridge 2016) with the exception of the Phenom II X6 for about a year. All the big core increases have been in HEDT platforms, and the only reason FX has been so common is because it's so cheap. If X58 processors and motherboards were still available in 2019 for as little as FX is, hell yeah everybody would have a 980X!
I need to collect my thoughts on this. It's an interesting topic.