

Registered · 108 Posts · Discussion Starter · #1
The specs show it runs at a 2.20GHz base with a 3.2GHz turbo frequency, all within a 15W TDP. That 15W is of course stated in the name: the -U suffix denotes that TDP, -Y covers ~9W parts, and other suffixes cover higher wattages and/or feature sets. In short, while this dual-core CPU doesn't clock that high, it also doesn't pull much power, about what you would expect. That isn't bad until you notice a few other bits, first off that the i3-8121U doesn't have a GPU; it is sold as a 2+0 in the parlance (CPU cores + GPU clusters), something Intel has never done before.

If you look at the two 8th-gen parts that bracket the 8121U, the i3-8109U and i3-8130U, you will see that both are 14nm chips, from the Coffee Lake and Kaby Lake families respectively. Both are 2C/4T parts with a GPU, although Intel won't give out any details about what the GPUs are any more; in any case they are at least a 2+1 configuration. What is interesting is that the 8130U runs at 2.2/3.4GHz (base/turbo) at 15W, while the 8109U sits at 3.0/3.6GHz thanks to its nominal 28W TDP.

Let us recap: a 14nm CPU with a GPU is faster within a 15W cap than a 10nm CPU without a GPU running at the same 15W.
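
For anyone who wants the numbers side by side, here's a quick Python tabulation of the figures quoted above; the values are only the ones from the excerpt, and the "≥2+1" config is simply what the article states for the 14nm parts.

```python
# Tabulate the figures quoted above: base/turbo clocks in GHz, TDP in
# watts, and the CPU-cores + GPU configuration for each SKU.
skus = [
    # (model, node, base GHz, turbo GHz, TDP W, config)
    ("i3-8121U", "10nm", 2.2, 3.2, 15, "2+0"),
    ("i3-8130U", "14nm", 2.2, 3.4, 15, "≥2+1"),
    ("i3-8109U", "14nm", 3.0, 3.6, 28, "≥2+1"),
]

print(f"{'Model':<10}{'Node':<6}{'Base':>6}{'Turbo':>7}{'TDP':>5}  Config")
for model, node, base, turbo, tdp, config in skus:
    print(f"{model:<10}{node:<6}{base:>6.1f}{turbo:>7.1f}{tdp:>5}  {config}")

# The point of the recap: at the same 15W, the 14nm 8130U turbos higher
# than the 10nm 8121U.
gap = (skus[1][3] - skus[0][3]) / skus[0][3] * 100
print(f"14nm 8130U turbo advantage over 10nm 8121U at the same 15W: ~{gap:.0f}%")
```

At the same 15W cap, the 14nm part's turbo clock works out to roughly 6% higher than the 10nm part's.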
https://www.semiaccurate.com/2018/05/29/is-intels-upcoming-10nm-launch-real-or-a-pr-stunt/

Interesting article out from SemiAccurate. Looks like Intel's 10nm is really not working well, despite recent releases. Thoughts?
 

Registered · 450 Posts
If it's not a full desktop chip being pumped up on roids while being frozen by a reefer unit sucking up 1,000 watts, then it's not a proper chip to read up on.
 

Super Moderator · 9,192 Posts
Remember Core M? How Broadwell and 14nm had an extremely limited launch at the end of 2014? Every piece of information I've seen about 10nm is pointing to a more extreme case of the same thing. I just hope the fab is good enough to compete with everyone else and their 7nm processes in the works. Worst-case for Intel is that it's as good as 14nm in all metrics but density, and they'll still have their own in-house production and won't have to pay or license the tech from a third-party.

It's also worth pointing out that the metric Charlie is using here, TDP, is pretty worthless for anything beyond estimating power consumption. U processors are always 15W or 28W depending on the configuration, and the boost clocks are not running 24/7. In an ideal situation, a 10nm CPU should have more thermal headroom and be able to boost longer than the 14nm CPU, especially at a somewhat lower clockspeed. That may not actually be the case, but we need to wait for people to have the silicon in hand before making a judgment.

It is provable right now, though, that allowing higher power draw results in measurably increased performance, because the CPU and integrated graphics can run at high frequencies for longer. The 15W and 28W CPUs with Iris graphics are notable examples, with the latter performing around 13% better overall despite barely improved GPU clocks in graphical tests. The GPU simply throttles less aggressively because the system is allowed to draw 13W more.
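
To make the boost-duration point concrete, here's a rough back-of-the-envelope sketch. It is not Intel's actual Turbo Boost algorithm (which works off a rolling power average); it just assumes a fixed energy credit above the sustained limit, and the boost wattage and credit numbers are made up purely for illustration.

```python
# Rough boost-duration model: NOT Intel's real Turbo Boost behaviour,
# just a fixed "energy credit" spent on power drawn above the cap.
# BOOST_W and CREDIT_J are illustrative assumptions, not measured values.

def boost_seconds(sustained_w, boost_w, energy_credit_j):
    """Seconds the package can hold boost_w before the credit above the
    sustained power cap is spent and clocks fall back to the cap."""
    excess = boost_w - sustained_w           # watts drawn above the cap
    if excess <= 0:
        return float("inf")                  # boost fits inside the cap
    return energy_credit_j / excess          # joules / watts = seconds

BOOST_W = 35.0    # hypothetical package draw at full CPU + GPU boost
CREDIT_J = 300.0  # hypothetical thermal/energy headroom

for cap in (15.0, 28.0):
    t = boost_seconds(cap, BOOST_W, CREDIT_J)
    print(f"{cap:>4.0f}W cap -> ~{t:.0f}s at full boost before throttling back")
```

With the same (made-up) headroom, the 28W cap holds full boost roughly three times longer than the 15W cap, which is the mechanism behind the ~13% overall gain mentioned above.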

Personally, I'm looking forward to the Crimson Canyon NUC using the i3. :thumb:
 

Volt addicted OC fanatic · 582 Posts
It looks like GloFo's and Samsung's output for AMD might practically beat Intel to the next process node. (Their 12nm is a lot less exciting: because of the different ways the fabs AMD uses and Intel count nanometers for process nodes, it's roughly on par with Intel's 14nm.) Things are going to get interesting when AMD laptops and servers are overwhelmingly more powerful and lower-wattage than their Intel counterparts, not to mention the possibility of 5GHz with lots of cores on the mainstream desktop, as GloFo has teased, without it being a fire-breathing Piledriver mistake. Of course, GloFo has over-promised before....
 

Registered · 1,713 Posts
It's a low-power mobility product designed for application control/automation, basically a PLC head. If combined with an ARM cluster GPU in a mobility device, it would allow for Vulkan on Intel-based mobility systems.

This proof-of-concept is the same approach Intel used for the last major changes to its low-power architecture, which ended up in the Surface Pro.

It is entirely likely that the reason it's a 2+0 is simply so that companies can use whatever graphics component *they* want to install in their hardware.

This is *not* a user-installable part.
 

Registered · 1,746 Posts
I swear, every time Intel does something there's some article about how they're gonna fail, etc., and... it never turns out that way. The only exception was the Pentium 4.
 

Registered · 6,188 Posts
Man, with AMD stock pumping this past week, Intel must be having big issues. I think they'll reach $20 soon.
 