Overclock.net - An Overclocking Community - Hardware News

[WCCFTECH] Intel CEO Wants To Destroy The Thinking About Having 90% Share In CPU Market, Talks 10nm Problems, 7nm Roadmap And More
(https://www.overclock.net/forum/225-hardware-news/1738282-wccftech-intel-ceo-wants-destroy-thinking-about-having-90-share-cpu-market-talks-10nm-problems-7nm-roadmap-more.html)

WannaBeOCer 01-10-2020 11:24 AM

Quote:

Originally Posted by Mrzev (Post 28276824)
I really don't like the idea of saying "We are going to lose market share to make an even better product". It translates to: AMD has been investing in the future and we can't continue to keep our lead. It's not them trying to lean out their processes and production, it's them saying we are going to move more funds from the CPU division to R&D. This will slow us down, we will lose market share until the R&D returns start to pay off, and that's the market share % we will end up at.

This year they're releasing their 10nm Ice Lake Xeon chips, which are due in 1H 2020 (probably March, like Cascade Lake). AMD's Milan Epyc chips won't be out until Q4 of 2020.

KyadCK 01-10-2020 12:33 PM

Quote:

Originally Posted by WannaBeOCer (Post 28276832)
This year they're releasing their 10nm Ice Lake Xeon chips, which are due in 1H 2020 (probably March, like Cascade Lake). AMD's Milan Epyc chips won't be out until Q4 of 2020.

https://wccftech.com/intel-xeon-10nm...cpus-detailed/
Quote:

Intel Xeon 10nm+ Ice Lake-SP/AP Family

Intel Ice Lake-SP processors will be available in the third quarter of 2020 and will be based on the 10nm+ process node.
https://hothardware.com/news/intel-1...res-76-threads
Quote:

Intel is targeting a Q3 2020 launch for Ice Lake-SP.
https://hothardware.com/Image/Resize...ooper_lake.jpg

The 1H 2020 information is out of date; it got delayed. Again. Provided this source is accurate, of course.

Not that it matters; 38 cores at 270W is not a threat to a 64-core EPYC 2, let alone EPYC 3. A 35% core bump is not enough to even the playing field. Even with a 20% IPC bump they would still need another 20% clock speed on top of that, which would put Ice Lake base clocks at ~4GHz.

And AMD's TDP is 225W, not 270W.
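
For anyone who wants to sanity-check that, here's a quick back-of-the-envelope using published base clocks (EPYC 7742 at 2.25 GHz / 225W, Xeon 8280 at 2.7 GHz) and an assumed 20% IPC gain; it's purely illustrative and ignores memory bandwidth, AVX-512 and power limits:

Code:

# Back-of-the-envelope check of the core/clock math above. Purely illustrative:
# assumes throughput scales linearly with cores * IPC * clock and ignores
# memory bandwidth, AVX-512, turbo behaviour and power limits.

epyc_cores, epyc_base_ghz = 64, 2.25   # EPYC 7742: 64 cores, 2.25 GHz base, 225W TDP
xeon_cores = 38                        # rumoured top Ice Lake-SP SKU from the linked article
xeon_ipc_gain = 1.20                   # hypothetical 20% IPC uplift, treating Zen 2
                                       # and Skylake as roughly equal per clock

# Base clock a 38-core Ice Lake-SP would need just to tie EPYC's
# aggregate base-clock throughput:
needed_ghz = (epyc_cores * epyc_base_ghz) / (xeon_cores * xeon_ipc_gain)
print(f"Needed base clock: {needed_ghz:.2f} GHz")                           # ~3.16 GHz
print(f"Uplift over a 2.7 GHz Xeon 8280 base: {needed_ghz / 2.7 - 1:.0%}")  # ~17%

Comparing at boost clocks instead of base pushes the required figure higher still.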

WannaBeOCer 01-10-2020 12:46 PM

Quote:

Originally Posted by KyadCK (Post 28276918)
The 1H 2020 information is out of date; it got delayed. Again. Provided this source is accurate, of course.

Not that it matters; 38 cores at 270W is not a threat to a 64-core EPYC 2, let alone EPYC 3. A 35% core bump is not enough to even the playing field. Even with a 20% IPC bump they would still need another 20% clock speed on top of that, which would put Ice Lake base clocks at ~4GHz.

And AMD's TDP is 225W, not 270W.

Deep Learning Boost alone is a threat and the reason why I still purchase Xeons. I switched to Epycs for my GPU servers, but Xeons are the way to go for our CPU research. Saying Xeons aren't a threat to a 64-core Epyc is a joke.
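
For context, DL Boost on current Xeons is essentially AVX-512 VNNI: fused int8 multiply-accumulate into int32, which is most of the math in quantized inference. A rough NumPy emulation of the idea (just the arithmetic it accelerates, not Intel's intrinsics or a performance claim):

Code:

# Rough NumPy emulation of the arithmetic DL Boost (AVX-512 VNNI) accelerates:
# int8 multiplies accumulated into int32, the core of quantized NN inference.
import numpy as np

rng = np.random.default_rng(0)
activations = rng.integers(0, 128, size=(1, 1024), dtype=np.uint8)  # quantized inputs (unsigned bytes)
weights = rng.integers(-128, 128, size=(1024, 256), dtype=np.int8)  # quantized weights (signed bytes)

# VNNI's VPDPBUSD fuses the byte multiplies, the pairwise adds and the int32
# accumulation into one instruction; here we simply widen and matrix-multiply.
acc = activations.astype(np.int32) @ weights.astype(np.int32)
print(acc.shape, acc.dtype)  # (1, 256) int32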

ToTheSun! 01-10-2020 01:23 PM

I'm now realizing I'm going to enjoy this thread.

KyadCK 01-10-2020 05:28 PM

Quote:

Originally Posted by WannaBeOCer (Post 28276944)
Deep Learning Boost alone is a threat and the reason why I still purchase Xeons. I switched to Epycs for my GPU servers, but Xeons are the way to go for our CPU research. Saying Xeons aren't a threat to a 64-core Epyc is a joke.

Would it not make sense to just... buy more GPUs?

One top-notch Xeon costs as much as several top-notch GPUs, after all.

Though, again, while AI is obviously very relevant to your use case and not my forte, I would like to see numbers showing Xeon usage for AI as more than a blip on the radar, let alone in datacenters as a whole. I literally cannot find numbers on it because all the coverage just wants to talk about power usage.

Imouto 01-10-2020 06:41 PM

Quote:

Originally Posted by KyadCK (Post 28277300)
Would it not make sense to just... buy more GPUs?

One top-notch Xeon costs as much as several top-notch GPUs, after all.

Though, again, while AI is obviously very relevant to your use case and not my forte, I would like to see numbers showing Xeon usage for AI as more than a blip on the radar, let alone in datacenters as a whole. I literally cannot find numbers on it because all the coverage just wants to talk about power usage.

Not to mention that everyone and their moms are using specialized hardware for AI and machine learning. Nvidia is facing problems because using GPUs for that is as inefficient as it can get. Imagine doing that on a CPU.

mothergoose729 01-10-2020 08:50 PM

My mom still uses Xeon.

m4fox90 01-11-2020 07:55 AM

Quote:

Originally Posted by mothergoose729 (Post 28277518)
My mom still uses Xeon.

Ol' Grandma goose?

