Overclock.net - An Overclocking Community

Overclock.net - An Overclocking Community (https://www.overclock.net/forum/)
-   Hardware News (https://www.overclock.net/forum/225-hardware-news/)
-   -   [WCCFTECH] Intel CEO Wants To Destroy The Thinking About Having 90% Share In CPU Market, Talks 10nm Problems, 7nm Roadmap And More (https://www.overclock.net/forum/225-hardware-news/1738282-wccftech-intel-ceo-wants-destroy-thinking-about-having-90-share-cpu-market-talks-10nm-problems-7nm-roadmap-more.html)

Kurumi Tokisaki 12-11-2019 12:55 PM

[WCCFTECH] Intel CEO Wants To Destroy The Thinking About Having 90% Share In CPU Market, Talks 10nm Problems, 7nm Roadmap And More
 
https://wccftech.com/intel-ceo-beyond-cpu-7nm-more/

Interesting read. Sounds like Intel is throwing in the towel on the CPU market. Unless I am missing something.

m4fox90 12-11-2019 12:57 PM

Recognizing they are no longer capable of living in a unipolar world is big. Just gotta wonder how long it will take for that to trickle down to their fans.

skupples 12-11-2019 01:14 PM

soooo lets start shorting intel?

mcbaes72 12-11-2019 01:36 PM

Hmm... maybe they saw Steve's video on Gamers Nexus, which reported AMD is ahead in sales 9-to-1, and now Intel is searching for things to blame: too much time in the CPU market, missed opportunities, etc. How about adding: "we charged consumers twice as much for years until AMD became relevant again, and we pissed off our loyal customers"?

I wish they talked more about where all that profit went to. Maybe the failed 10nm project, maybe the questionable GPU project, and so on.

I want to see if 7nm in 2021 really does happen to keep competition up or if AMD will continue to dominate with their Ryzen 4000 series and talks of 5nm process around the same time in 2021.

skupples 12-11-2019 01:45 PM

They pissed me off when they nixed the x820k SKU, the slowest of the more-than-16-lane chips.

WannaBeOCer 12-11-2019 01:50 PM

This is nothing new, they've been chanting about being a data company since 2017 and their innovations in storage show exactly that.

opt33 12-11-2019 02:38 PM

Yep, given that Intel will inevitably lose a significant portion of CPU market share over the next couple of years because of prior mistakes, the new Intel CEO is smartly resetting stockholder expectations with a positive spin by concentrating on segments where the company has growth potential. Understandably, he doesn't want to be blamed for what is going to happen to Intel's CPU market share in the near future, nor should he be.

speed_demon 12-11-2019 02:45 PM

That's not how shareholders are going to see things. Investors want to see their money growing in one form or another; short-term setbacks of under 5 years are okay as long as, over the long horizon, they keep gaining value relative to their purchase price. So the question on the shareholder's mind becomes: "Did I invest in a chip producer or something else entirely?"

rluker5 12-11-2019 07:13 PM

Quote:

Originally Posted by Kurumi Tokisaki (Post 28233608)
https://wccftech.com/intel-ceo-beyond-cpu-7nm-more/

Interesting read. Sounds like Intel is throwing in the towel on the CPU market. Unless I am missing something.

That would be like saying AMD is throwing in the towel on Zen if they came out with Big Navi.
Swan is saying that the CPU market alone is no longer big enough for Intel and that they want more business than that market has to offer; you have to read his quote closely to get this. Intel is also making Optane now, getting into the GPU market, and doing more custom silicon work.

AMD is definitely more competitive in the cpu market now, so maybe Swan is trying to save face, but he is not throwing in the towel for the cpu market.

RichKnecht 12-11-2019 07:19 PM

With the new AMD chips coming out soon, Intel is in for a world of hurt. I'm still an Intel fan, but you can't look away from AMD's progress and performance. I'm not moving to AMD as I have too much invested in X299, but I am sure there will be many who make the switch and there will be a lot of used Intel HEDT chips for sale.

Lexi is Dumb 12-11-2019 11:25 PM

Maybe I'm missing something, but it sounds to me more like: "the amount we'd need to invest to maintain an extra 40% market share over what's actually necessary in the face of competition could be better invested, with a better ROI, elsewhere." They just halved their HEDT lineup prices; if they have to keep doing that sort of thing to maintain a high market share, they're still making much less than they ever did before, and after a certain point it's probably not worthwhile. They want a 90% market share at their usual profits, not a 90% market share at AMD's measly profits, and I guarantee they'd take a 30-40% market share at their high profits over it: no more production constraints, plus all the other pros that come with it, like the ability to focus instead of trying to supply everyone everything.

I doubt it's ever been their primary goal to compete with TSMC as a fab, and having to remain the 2nd-largest chipmaker, not just because of all their other businesses but because they're supplying 90% of the world's desktop/laptop and server CPUs, has probably been insanely difficult. I'd say they probably want to focus more of their manufacturing strength on things like 5G right now. Hot take, but the lack of response to AMD's competition on the desktop might be because it's actually given them some relief.

Don't mistake their investment into GPUs as one to compete with AMD and Nvidia for the general consumer markets. They want to be a unified solution for enterprise. CPU, GPU, Storage.. the whole lot and they are 1000% better off investing in that than going out of their way to take the entirety of a relatively smaller battleground.

UltraMega 12-12-2019 12:06 AM

It makes sense. For most consumers, desktop CPU advancements haven't really mattered in about a decade. Once CPUs are good enough, making them better doesn't really matter for consumer-level tasks outside of gaming, generally speaking.

drewafx 12-12-2019 12:14 AM

There's nothing wrong with focusing all your energy on CPUs and being the best.
The problem is Intel got too complacent and is no longer in that position, so they have to diversify...
Jack of all trades and master of none? I surely hope it doesn't go that way.
WiDi and optical Thunderbolt seemed like good features but never reached mass adoption.

I'd say just keep working on making computers fast and responsive with no compromises.
At the end of the day, I chose an Intel CPU and bought an Intel-chipset motherboard because it was the fastest available at the time.
Latency is crucial for anyone who needs input/output synchronization without stutters: music/gaming professionals and enthusiasts.
Raw performance for heavy workloads, power efficiency for battery devices/data centers.
If they can get the hardware cutting-edge at all price points, whether it's Windows, Mac, or Linux, they will have customers.
Let startups take care of feature products naturally born out of necessity, or let Apple make them cool first.

I think dedicated expansion devices > bundled into CPU & chipset, at least for normal desktop computers
Mac Pro? ridiculously expensive but hell yeah
NUC? ok but meh, tons of problems. USB stick PC? bleh nice nerdy experiment.

Awsan 12-12-2019 04:00 AM

It's an interesting read, but Nokia was once an unstoppable juggernaut that was brought down by a trojan, if you believe the conspiracy theories.

Still, it's hard for Intel to fall anytime soon. They will struggle for a couple of years {5 years max???}, and I surely hope AMD will take advantage of this time and supercharge the CPU game.

Hwgeek 12-12-2019 04:04 AM

https://twitter.com/CDemerjian/statu...77861070860288
Quote:

What I found floored me. I am now seriously worried about Intel's survival, something I don't say lightly. It is that bad. Story tomorrow morning, take this one seriously.

KyadCK 12-12-2019 06:05 AM

Good mentality to take if I am being honest.

They did lose out on a lot of things, such as mobile CPUs and GPUs, by simply targeting them too late. As they mentioned, due to mistakes, overconfidence, and frankly a Series of Unfortunate Events from their perspective, they dropped the ball, hard, for the next several years.

It is not possible for them to catch EPYC anymore as they are now. By the time 7nm is ready for servers, AMD will be on 5nm, and they will be on equal footing like when EPYC 1 launched. AMD will probably be making 96-core CPUs by then and will already be on PCIe 5.0 and DDR5, which will keep them from having to over-complicate motherboards further. If Intel tries to make monolithic CPUs to compete with that, we will be in the exact same situation we are in now.

Intel needs chiplet tech of their own. If they are going to design chiplets too, be it 2D, 2.5D, or 3D stacking, and they make Optane, GPUs, FPGAs, modems, etc., then it makes perfect sense for Intel to begin doing what AMD is doing: semi-custom design and more general flexibility.

More importantly, Intel needs to actually fight ARM, and nVidia/AMD's GPU divisions, and they know it.

If the CEO is being honest, then this new mentality is required to make everyone focus on all of their rivals, and is required for them to wake up.

Quote:

Originally Posted by UltraMega (Post 28234226)
It makes sense. For most consumers, desktop CPU advancements haven't really mattered in about a decade. Once CPUs are good enough, making them better doesn't really matter outside of gaming for consumer level tasks generally speaking.

The thing about building a more efficient design is that if you do not need more performance, you can instead clock it down, make it smaller, use less power, make it easier to cool, and have it live longer.

Even burst speed, when tuned correctly, can make lower-powered systems more efficient overall.

Kurumi Tokisaki 12-12-2019 06:24 AM

I want to thank everyone who posted something thoughtful. It's discussions like these that make me want to participate here.

warr10r 12-12-2019 06:54 AM

At least they aren't outright denying that there's a problem a la Blackberry and Nokia.

AMD basically hit them incredibly hard where it hurts, which for Intel is their CPU marketshare, lol. The CEO here is showing his "No, no, we're fine. See?!" face and trying to walk it off but showing visible discomfort...

I don't doubt that Intel will bounce back like a freaking tiger in a few years but damn they are hurting in the short term.

Awsan 12-12-2019 07:02 AM

Quote:

Originally Posted by warr10r (Post 28234504)
At least they aren't outright denying that there's a problem a la Blackberry and Nokia.

AMD basically hit them incredibly hard where it hurts, which for Intel is their CPU marketshare, lol. The CEO here is showing his "No, no, we're fine. See?!" face and trying to walk it off but showing visible discomfort...

I don't doubt that Intel will bounce back like a freaking tiger in a few years but damn they are hurting in the short term.

No lesson is free. The funny part is, isn't Nvidia doing the same thing? (Although I thought they learned their lesson when AMD released the 7xxx series and the R9 2xx series.)

Kurumi Tokisaki 12-12-2019 09:46 AM

Quote:

Originally Posted by warr10r (Post 28234504)
At least they aren't outright denying that there's a problem a la Blackberry and Nokia.

AMD basically hit them incredibly hard where it hurts, which for Intel is their CPU marketshare, lol. The CEO here is showing his "No, no, we're fine. See?!" face and trying to walk it off but showing visible discomfort...

I don't doubt that Intel will bounce back like a freaking tiger in a few years but damn they are hurting in the short term.

I agree with you. They are still incredibly diverse and profitable overall regardless of their struggles in the CPU market.

They just need to get their poop in a group in the CPU arena. They just hired Gary Patton away from Global Foundries. Maybe he can help them get their fab woes in order.

https://www.anandtech.com/show/15226...es-gary-patton

Now they need to hire someone who can help prevent a new exploit every time they add a feature.

:)

UltraMega 12-12-2019 12:04 PM

Quote:


The thing about building a more efficient design, is that if you do not need more performance, you can instead clock it down, make it smaller, use less power, easier to cool, and live longer.

Even burst speed, when tuned correctly, can be more efficient for lower powered systems over all.
Obviously.

My point is that unless you need longer laptop battery life, nothing about CPU advancements in the last decade really affects most people, since most people only do simple tasks.

Imouto 12-12-2019 01:20 PM

It isn't AMD they can't compete with, it is TSMC and Samsung. TSMC has 5 nm on track for 1H next year and Samsung can dump memory into the market like there is no tomorrow.

When you are as big as Intel and with such a big portfolio you leave a lot of fronts open and they don't have the ability to turn around because that would leave something else exposed. There are a lot of companies looking to snag some market from Intel like Qualcomm or Apple. TSMC, Samsung and AMD were just the first ones to realize the emperor is naked. The only way they can survive this is specializing in one area and giving up on everything else.

You can't be the best at everything forever. You can't even be the best at one thing forever.

CDub07 12-12-2019 01:37 PM

Quote:

Originally Posted by Imouto (Post 28235040)
It isn't AMD they can't compete with, it is TSMC and Samsung. TSMC has 5 nm on track for 1H next year and Samsung can dump memory into the market like there is no tomorrow.

When you are as big as Intel and with such a big portfolio you leave a lot of fronts open and they don't have the ability to turn around because that would leave something else exposed. There are a lot of companies looking to snag some market from Intel like Qualcomm or Apple. TSMC, Samsung and AMD were just the first ones to realize the emperor is naked. The only way they can survive this is specializing in one area and giving up on everything else.

You can't be the best at everything forever. You can't even be the best at one thing forever.

Well said!!!! To me that is one of the biggest downfalls I don't understand about the stock market. Unless you have your head on a swivel, someone hungrier is always around the corner waiting for their chance to become king of the mountain. The bubble of extreme profits will pop one day. I guess you either make as much money as you can and get out ahead, fall like a giant, or keep innovating to stay relevant.

KyadCK 12-12-2019 03:58 PM

Quote:

Originally Posted by UltraMega (Post 28234916)
Obviously.

My point is that unless you need longer laptop battery life, nothing about CPU advancements in the last decade really affect most people since most people only do simple tasks.

That's when you make it smaller (i.e. less material, cheaper), use less power so it puts off less heat so you can advertise being eco-friendly and/or silent, give it a longer warranty, and then charge more for being "better".

You aren't being creative enough.

It isn't just that laptops now last 3x longer than they did back then. Look at the form factors and weight, or TVs that can have them built in and still be sleek; look at AIO PCs from then and now. There are thin clients where the whole AIO, even the 1080p monitor, can be powered entirely by PoE!

skupples 12-12-2019 05:39 PM

Can confirm: we provide the general population with i3/16GB/512GB touch-screen HP 360s. Only the execs and top managers get better equipment (i7s), and that's only because they tend to want 10-keys and bigger screens.

Hell, most of the devs are on ancient i5 Lenovo laptops.

rluker5 12-13-2019 06:42 AM

Those low-powered dual cores are surprisingly adequate for most common uses, even the Core m series all the way back to Broadwell (at least the full 6W versions).
If you aren't gaming or doing some niche, high-powered compute, why would you replace something that works well? The iGPUs have been more than adequate for non-gaming use for quite some time as well.

From a measure of click-to-completion of most common tasks, performance has been stagnant for quite a while, and as the market saturates, sales should stagnate as well.
For example, I picked up this thing: https://www.userbenchmark.com/UserRun/20063399
for $145 off of eBay. It's a fanless tablet from 2015.

Intel has to come up with more than just a slightly faster CPU for the plebs asking "so how will this new model be better for me?" For them, upgrading a CPU is a lot like upgrading their SSD.

umeng2002 12-13-2019 07:38 AM

Which reminds me, where the hell are all the Ryzen laptops?

I was fixing my brother's laptop that has an i5-7200U and thought it was laughably slow... but then I took out the 5400 RPM HDD and put in an SSD, and it's like night and day. Intel and AMD should band together and force laptop makers to stop using HDDs as boot devices.

But the arrogance of Intel is legendary... as soon as they don't have the best CPUs anymore, they act like the CPU market is too small for them... meanwhile they get "security analysts" and Ryan Shrout to do hit pieces on AMD.

Imouto 12-13-2019 09:40 AM

Quote:

Originally Posted by umeng2002 (Post 28236016)
I was fixing my brothers laptop that has an i5-7200u and thought it was laughably slow... but then I took out the 5400 rpm HDD and put in an SSD, and it's like night and day. Intel and AMD should band together and force laptop makers to stop using HDD as boot devices.

It is because of Windows. I did the same this week on a relative's laptop and it was night and day, but I still remember it performing alright when I bought it for them. First I tried a fresh install on the HDD, but it was barely any faster; only a fresh install on the SSD fixed it. Out of curiosity I installed Manjaro on the 5400 RPM drive and it was pretty decent.

Hell, when I was moving their stuff off the old HDD from a Manjaro live USB, it was faster than Windows was on the 5400 RPM HDD.

skupples 12-13-2019 09:59 AM

Only low-end mobility comes with a spinny disk these days, plus the cheap high-capacity options. 99% of our HP stuff comes with M.2 NVMe or SATA SSDs.

bluechris 12-13-2019 10:39 AM

I don't think AMD will grow that fast in the server area, specifically in virtualized environments, since that is the biggest market share at the moment and it's constantly growing.
The reason is that, no matter the size of the company, you have invested a ton of money and you have Intel stuff. Because of this, if you suddenly throw AMD servers into the server room, you lose automations like vMotion, HA, etc. It's too much money for companies to replace everything with AMD hardware. As I see it, AMD needs at least 2-3 years to build a solid base.

Of course there are monster companies like Google or AWS that have custom virtualization software; they need time too, but not as much.

I see huge growth in desktops, laptops, and workstations, and a slow rise in the server area. If Intel doesn't do something big in the next 2 years, then I'm afraid the ship will have sailed.

Redwoodz 12-14-2019 10:52 PM

Quote:

Originally Posted by Imouto (Post 28235040)
It isn't AMD they can't compete with, it is TSMC and Samsung. TSMC has 5 nm on track for 1H next year and Samsung can dump memory into the market like there is no tomorrow.

When you are as big as Intel and with such a big portfolio you leave a lot of fronts open and they don't have the ability to turn around because that would leave something else exposed. There are a lot of companies looking to snag some market from Intel like Qualcomm or Apple. TSMC, Samsung and AMD were just the first ones to realize the emperor is naked. The only way they can survive this is specializing in one area and giving up on everything else.

You can't be the best at everything forever. You can't even be the best at one thing forever.


I will remind you that it was AMD who gathered the troops and commanded the war: selling GF, then enlisting IBM and Samsung to work with TSMC and GF, all to bring down Intel's process advantage. This was no accident, and AMD deserves the credit, for it was AMD who barely survived the change.

TSMC is not solely responsible for those 14nm/12nm/7nm processes; it was a multi-corporation effort.

moonbogg 12-15-2019 12:15 AM

Let Intel sink. If it wasn't for AMD we'd be stuck with maybe 6 core i7's for $500 and quad core i5's for $300 without HT.

Hwgeek 12-15-2019 12:49 AM

IMO Intel is going to lose market share in CPU-related markets and they know it, so they will enter other markets to maintain their profits. Why sell 1 CPU per rack of 4~8 GPUs when they can sell many super-expensive GPUs? If they can compete in the HPC GPU market, that will make them a ton of $$$.

Also, you are right: the bigger you are, the bigger the fall, since with such high expenses you can't afford even a little drop in sales.

bluechris 12-15-2019 12:52 AM

Quote:

Originally Posted by moonbogg (Post 28238352)
Let Intel sink. If it wasn't for AMD we'd be stuck with maybe 6 core i7's for $500 and quad core i5's for $300 without HT.

Even though I'm with you (I built a Ryzen machine last month for my home lab with ESXi, and two at work in our server room), I say this is not good if it happens.
As consumers, we don't need Intel to be destroyed, even if they deserve it, because the lack of competition would make AMD the new Intel. We need Intel to be hurt, to put their heads down, devote resources to R&D, and bring good products to market in 2-3 years.
That would be a win-win for us, and the competition would stay alive in our favor. Monopolies are bad for consumers in everything.

umeng2002 12-15-2019 02:01 AM

Let them sink but not disappear for 5 years like AMD did.

We need market competition... Intel has been Intel because of no competition from AMD or others.

nVidia is getting almost as bad as Intel, but at least AMD can release the bare minimum of GPUs to make nVidia seem like they actually care about value and progress.

rdr09 01-09-2020 01:39 PM

Intel needs to sort out its 10 nm problems.

https://www.tomshardware.com/news/in...too-much-power

WannaBeOCer 01-09-2020 01:46 PM

Quote:

Originally Posted by rdr09 (Post 28275326)
Intel needs to sort out its 10 nm problems.

https://www.tomshardware.com/news/in...too-much-power

We'll see when it's actually launched.

14nm: 6700K @ 4.2Ghz uses 112w
14nm++: 9900K @ 4.2Ghz uses 95w

rdr09 01-09-2020 01:51 PM

Quote:

Originally Posted by WannaBeOCer (Post 28275342)
We'll see when it's actually launched.

14nm: 6700K @ 4.2Ghz uses 112w
14nm++: 9900K @ 4.2Ghz uses 95w

A Cooler Master 212 should suffice to cool a 95W CPU.

EniGma1987 01-09-2020 02:08 PM

Quote:

Originally Posted by WannaBeOCer (Post 28275342)
We'll see when it's actually launched.

14nm: 6700K @ 4.2Ghz uses 112w
14nm++: 9900K @ 4.2Ghz uses 95w


Where do you get that data from, and why did you choose 4.2GHz? The 9900K doesn't run at 4.2GHz by default in any configuration, and at stock it already draws 160W+ when you load up all the cores.

WannaBeOCer 01-09-2020 02:16 PM

Quote:

Originally Posted by EniGma1987 (Post 28275374)
Where do you get that data from and why did you choose 4.2GHz? The 9900K doesnt run at 4.2GHz by default in any configuration, and at stock it already draws 160+ when you load all the cores up doing stuff.

It does run at 4.2Ghz when MCE is disabled. 4.2Ghz is the default with MCE on the 6700k.

https://www.gamersnexus.net/guides/3...-duration-z390

Quote:

For reference, here’s what the frequency functionality looks like when operating under settings with MCE disabled via XMP II and the “No” prompt. This plot takes place over a 23-minute Blender render, so this is a real-world use case of the i9-9900K. Notice that there is a sharp fall-off after about 30 seconds, where average all-core frequency plummets from 4.7GHz to 4.2GHz. Notice that the power consumption remains at almost exactly 95W for the entire test, pegged to 94.9W and relatively flat. This corresponds to the right axis, with frequency on the left. This feels like an RTX flashback, where we’re completely power-limited. The difference is that this is under the specification, despite the CPU clearly being capable of more. You’ll see that the frequency picks up again when the workload ends, leaving us with unchallenged idle frequencies.
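The behavior GamersNexus describes (boost until the turbo window expires, then clamp to the 95W PL1 with all-core clocks falling from 4.7GHz to 4.2GHz) can be sketched as a toy model. The PL2 and tau values below are assumptions for illustration, not figures from the article; the 95W/4.7GHz/4.2GHz numbers come from the quote:

```python
# Toy model of Intel's two-level power-limit scheme as described in the
# quote: the package may draw up to PL2 for roughly TAU seconds of a
# sustained all-core load, then is clamped to PL1. Clocks are the
# quote's observed values; PL2 and TAU are assumed, not datasheet specs.
PL1_W, PL2_W, TAU_S = 95.0, 119.0, 28.0

def allcore_state(t_seconds):
    """Return (package_watts, all_core_ghz) t seconds into a sustained load."""
    if t_seconds < TAU_S:
        return PL2_W, 4.7   # short burst: turbo inside the PL2 window
    return PL1_W, 4.2       # sustained: clamped to PL1, clocks drop

for t in (5, 60, 600):
    w, ghz = allcore_state(t)
    print(f"t={t:>4}s  {w:>5.1f} W  {ghz} GHz")
```

This is why short benchmarks and sustained renders report such different "stock" power numbers for the same chip.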

EniGma1987 01-09-2020 02:57 PM

Quote:

Originally Posted by WannaBeOCer (Post 28275392)
It does run at 4.2Ghz when MCE is disabled. 4.2Ghz is the default with MCE on the 6700k.

https://www.gamersnexus.net/guides/3...-duration-z390




So it runs at 95W because you power-capped it to a 95W limit. That isn't how stock works, and it should be obvious behavior for any hardware when you set a power cap.

WannaBeOCer 01-09-2020 03:05 PM

Quote:

Originally Posted by EniGma1987 (Post 28275462)
So it runs at 95W, because you power capped it to a 95w limit. That isnt how stock works and should be obvious behavior for any hardware when you set a power cap on.

It runs at 95W because it's running at Intel's specifications. When did the term "stock" mean anything aside from factory settings?

https://www.anandtech.com/show/6214/...about-free-mhz

EniGma1987 01-09-2020 03:27 PM

Quote:

Originally Posted by WannaBeOCer (Post 28275476)
It runs at 95w because it's running at Intel specifications. When did the term stock mean anything aside from factory settings?

https://www.anandtech.com/show/6214/...about-free-mhz

When the CPU is put in a board and everything is at stock, the CPU runs at Intel's advertised turbo speeds. If you choose to go in and modify stock settings to enable power cap it will run at Intel's advertised power draw, but you no longer get Intel's advertised speeds and you are no longer running how everything comes stock.

WannaBeOCer 01-09-2020 03:50 PM

Quote:

Originally Posted by EniGma1987 (Post 28275514)
When the CPU is put in a board and everything is at stock, the CPU runs at Intel's advertised turbo speeds. If you choose to go in and modify stock settings to enable power cap it will run at Intel's advertised power draw, but you no longer get Intel's advertised speeds and you are no longer running how everything comes stock.

Nowhere on their site do they advertise a 4.7GHz all-core boost. None of the slides from the 9900K announcement mention a 4.7GHz all-core boost either.

https://ark.intel.com/content/www/us...-5-00-ghz.html

They advertise a 5GHz turbo boost on a single core, which still happens when MCE is disabled. For example, my 9900K is running in a Z170 board; it never touches a 4.7GHz all-core boost at stock settings with MCE disabled, but I do see the 5GHz boost during single-core workloads. At the end of the day, the 9900K has 4 more cores than a 6700K and uses less power at the same frequency.

m4fox90 01-09-2020 06:05 PM

Quote:

Originally Posted by EniGma1987 (Post 28275462)
So it runs at 95W, because you power capped it to a 95w limit. That isnt how stock works and should be obvious behavior for any hardware when you set a power cap on.

Isn't it crazy that people are still doing Intel power draw and MCE apologism these days?

rluker5 01-09-2020 08:13 PM

1 Attachment(s)
Yeah, you would think that people would know that the power draw under MCE =/= PL1, since it never has been in the existence of MCE and its variants. The efficiency improvements of the Skylake arch on 14nm have been impressive: one would expect that doubling the efficiency of an arch at a common frequency would require a node reduction, but apparently not. The 10900K should have better PL1 clocks even without further efficiency improvements, due to having more power per core, but it will probably also be capable of sucking down much more power if you so choose. It would be nice if people didn't try to conflate the two.
And just to brag about Intel's power efficiency, here's a screenshot of my power usage with 14 tabs open at 1440p for almost the last hour, on an old Intel-based PC that browses at regular desktop speed:

Check out how much power the GT (graphics) cores need to display a video-free desktop.

rdr09 01-09-2020 11:00 PM

Quote:

Originally Posted by WannaBeOCer (Post 28275476)
It runs at 95w because it's running at Intel specifications. When did the term stock mean anything aside from factory settings?

https://www.anandtech.com/show/6214/...about-free-mhz

It was mentioned that the delay has to do with optimizing the CPU's power draw for its "smooth operation". IIRC, with the 9900K, Intel gave motherboard manufacturers free rein over the boost algorithm. Let's see if Asus pulls a fast one on the other manufacturers again.

Defoler 01-10-2020 01:32 AM

Quote:

Originally Posted by Kurumi Tokisaki (Post 28233608)
Sounds like Intel is throwing in the towel on the CPU market.

Not really.

What he is saying is that they got so hung up on trying to control the whole CPU market (trying to hold 90% of it), manufacturing CPUs like crazy, that they missed out on 5G modems (because they didn't have the manufacturing capacity for them) and other things; stuck unable to manufacture enough, they are leaving money on the table.

If, for example, they produced fewer CPUs but were able to pull more resources into modems, they could have entered the modem industry with Apple, which could create a new revenue stream that could grow a lot more than the CPU market they can't fulfill anyway.
The same goes for their upcoming GPUs.

So it makes sense to divert their resources from tunnel vision on the CPU market (especially with 7nm only arriving in 2021) to other markets. 30% of a $300B market ($90B) is better than 90% of a $66B market ($59B).
They are big enough to divert more resources, leave some room in the CPU market for AMD and ARM, and gain more in other markets they are not yet present in, but which are just as big and will allow them to produce technology they don't have today.

So it makes perfect sense overall, especially since they lost the 7nm and multi-core race this round. Without diverting resources, it will take them at least another year in which they are losing their basically only market, without creating new ones.
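The back-of-the-envelope above checks out in a few lines; the market sizes are the post's own assumptions, not audited figures:

```python
# Back-of-the-envelope from the post above: a smaller share of a bigger
# total addressable market can beat a dominant share of a smaller one.
# Market sizes ($B) are the post's assumptions, not audited figures.
def revenue(share, market_size_b):
    """Revenue in $B for a given market share (0-1) of a market sized in $B."""
    return share * market_size_b

diversified = revenue(0.30, 300)  # 30% of a $300B combined market
cpu_only    = revenue(0.90, 66)   # 90% of a $66B CPU-only market

print(f"30% of $300B = ${diversified:.1f}B")
print(f"90% of $66B  = ${cpu_only:.1f}B")
```

Under those assumptions the diversified path comes out roughly $30B ahead per year.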

mothergoose729 01-10-2020 08:46 AM

Intel is besieged on all sides by ARM, AMD, Samsung, and TSMC, and they want to get into the GPGPU market, which is dominated by Nvidia. It is hard to imagine how they grow significantly in size over the next ten years.

Mrzev 01-10-2020 11:17 AM

I really don't like the idea of saying "we are going to lose market share to make an even better product". It translates to: AMD has been investing in the future and we can't keep our lead. It's not them leaning out their processes and production; it's them saying: we are going to move more funds from the CPU division to R&D, this will slow us down, we will lose market share until the R&D returns start to pay off, and that's the market share we will end up at.

WannaBeOCer 01-10-2020 11:24 AM

Quote:

Originally Posted by Mrzev (Post 28276824)
I really dont like the idea of saying "We are going to lose market share to make an even better product". It translates to AMD has been investing in the future and we cant continue to keep our lead. Its not them trying to lean out their processes and production, its them saying we are going to move more funds from the CPU division to R&D. This will slow us down, we will lose market share until the R&D returns start to pay off, and thats what market share % we will end up at.

This year they're releasing their 10nm Ice Lake Xeon chips, due in 1H 2020 (probably March, like Cascade Lake). AMD's Milan Epyc chips won't be out until Q4 2020.

KyadCK 01-10-2020 12:33 PM

Quote:

Originally Posted by WannaBeOCer (Post 28276832)
This year they're going to be releasing their 10nm Ice Lake Xeon chips which are going to be released in 1H of 2020(Probably March like Cascade Lake). AMD's Milan Epyc chips won't be out until Q4 of 2020.

https://wccftech.com/intel-xeon-10nm...cpus-detailed/
Quote:

Intel Xeon 10nm+ Ice Lake-SP/AP Family

Intel Ice Lake-SP processors will be available in the third quarter of 2020 and will be based on the 10nm+ process node.
https://hothardware.com/news/intel-1...res-76-threads
Quote:

Intel is targeting a Q3 2020 launch for Ice Lake-SP.
https://hothardware.com/Image/Resize...ooper_lake.jpg

The 1H 2020 information is out of date; it got delayed. Again. Provided this source is accurate, of course.

Not that it matters; 38 cores at 270W is not a threat to 64-core EPYC 2, let alone EPYC 3. A 35% core bump is not enough to level the playing field. Even with a 20% IPC bump they would still need another 20% clock speed on top of that, which would put Ice Lake base clocks at ~4GHz.

And AMD's TDP is 225W, not 270W.
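For what it's worth, the back-of-the-envelope comparison here can be sketched with a simple throughput model (aggregate throughput proportional to cores × IPC × clock). The core counts come from the posts above; the 2.25 GHz EPYC 7742 base clock and the assumption of per-core parity at equal clocks are mine, so treat the exact numbers as illustrative, not a benchmark:

```python
# Rough server-CPU throughput model: throughput ~ cores * IPC * clock.
# Assumption (mine, not from the thread): both vendors' cores perform
# equally at the same clock before any IPC multiplier is applied.

def relative_throughput(cores, ipc_mult, clock_ghz):
    return cores * ipc_mult * clock_ghz

# 64-core EPYC 7742, 2.25 GHz base clock (published spec).
epyc = relative_throughput(cores=64, ipc_mult=1.0, clock_ghz=2.25)

# 38-core Ice Lake-SP, generously granted a 20% IPC advantage.
ice_lake_cores = 38
needed_clock = epyc / (ice_lake_cores * 1.2)

print(f"Base clock needed to match EPYC throughput: {needed_clock:.2f} GHz")
# Sustaining ~3.2 GHz base on 38 cores inside 270W is the hard part.
```

Whether the "~4GHz" figure follows depends on which baseline clocks and IPC gaps you assume; the model only illustrates the post's point that a 35% core bump alone doesn't close a 64-vs-38 core gap.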

WannaBeOCer 01-10-2020 12:46 PM

Quote:

Originally Posted by KyadCK (Post 28276918)
The 1H 2020 information is out of date; it got delayed. Again. Provided this source is accurate, of course.

Not that it matters; 38 cores at 270W is not a threat to 64-core EPYC 2, let alone EPYC 3. A 35% core bump is not enough to level the playing field. Even with a 20% IPC bump they would still need another 20% clock speed on top of that, which would put Ice Lake base clocks at ~4GHz.

And AMD's TDP is 225W, not 270W.

Deep Learning Boost alone is a threat and the reason why I still purchase Xeons. I switched to Epyc for my GPU servers, but Xeons are the way to go for our CPU research. Saying Xeons aren't a threat to a 64-core Epyc is a joke.

ToTheSun! 01-10-2020 01:23 PM

I'm now realizing I'm going to enjoy this thread.

KyadCK 01-10-2020 05:28 PM

Quote:

Originally Posted by WannaBeOCer (Post 28276944)
Deep Learning Boost alone is a threat and the reason why I still purchase Xeons. I switched to Epyc for my GPU servers, but Xeons are the way to go for our CPU research. Saying Xeons aren't a threat to a 64-core Epyc is a joke.

Would it not make sense to just... Buy more GPUs?

One top-notch Xeon costs as much as several top-notch GPUs, after all.

Though, again, while AI is obviously very relevant to your use case and not my forte, I would like to see numbers showing Xeon usage in AI as more than a blip on the radar, let alone in datacenters as a whole. I literally cannot find numbers on it because they all want to talk about power usage.

Imouto 01-10-2020 06:41 PM

Quote:

Originally Posted by KyadCK (Post 28277300)
Would it not make sense to just... Buy more GPUs?

One top-notch Xeon costs as much as several top-notch GPUs, after all.

Though, again, while AI is obviously very relevant to your use case and not my forte, I would like to see numbers showing Xeon usage in AI as more than a blip on the radar, let alone in datacenters as a whole. I literally cannot find numbers on it because they all want to talk about power usage.

Not to mention that everyone and their moms are using specialized hardware for AI and machine learning. Nvidia is facing problems because using GPUs for that is as inefficient as it can get. Imagine doing that on a CPU.

mothergoose729 01-10-2020 08:50 PM

My mom still uses Xeon.

m4fox90 01-11-2020 07:55 AM

Quote:

Originally Posted by mothergoose729 (Post 28277518)
My mom still uses Xeon.

Ol' Grandma goose?

