Originally Posted by sdlvx
Man, you Intel guys.
When AMD is close in performance, you cry about power consumption.
When AMD has a product that is extremely efficient for the power it uses, and Intel has nothing at this power level that performs as well overall as Kabini (Kabini's CPU is sometimes a little slower, but its GPU destroys Intel's Atom GPUs), suddenly power consumption no longer matters and it's time to compare entry-level i5s with 50W+ TDPs to a 15W-TDP Kabini and declare that Kabini sucks.
Kabini is only really comparable to Bay Trail Atoms on the CPU side, and to a big (but not Iris Pro) Intel GPU on the graphics side (except it actually has drivers that work). That's it, end of discussion. It's not that hard. You buy this chip because you want something that uses 2W idle and 15W under load, that you can leave on 24/7, and that is capable of playing games if you need it to.
It's funny because AMD has always been ahead in this area but never gets the kudos for it.
The Atom was an unusable piece of junk IMO. I have an Atom 330, and it's the only recent CPU I've ever owned that gets pegged by basic web browsing.
It's also far slower than anything underclocked to low clock speeds (even an 800MHz Phenom II dual core versus a 1.8GHz Atom has the Phenom II usable with minor lag while the Atom lags on very basic tasks), and slower than its IPC would indicate. Bobcat was way faster in actual use, and now Kabini continues the legacy of generally being a far better choice than the Atom, although at least Intel now offers more ULV IB/Haswell dual cores as an option.
Originally Posted by GorbazTheDragon
Ok, if you are very short on money, yes... Sure this could be an option for 720p gaming on things like LoL or some older titles.
But, really? If I was too broke to afford a 600 dollar gaming rig I'd just not buy one at all and actually make some money.
These things are just not for gaming.
These would be great to start off a gaming rig if you're not earning money yet. Get one, use it as an SoC for now until you can afford a $200-$250 dGPU to throw in, then start saving for an i3-based motherboard/CPU combo. You'll spend a little more than if you saved up for everything at once, but if the alternative is waiting 6 months to a year for, say, an i3 + GTX 750 Ti based rig, I'd personally sure as hell say it's worth it. If I had kids under the age of 12 I'd certainly start them off with one of these; you could easily play a tonne of games on one. Not everyone has a boner for this year's generic FPS like BF4.
You're also vastly underestimating the GPU's power and vastly overestimating how much CPU power gaming needs; these would be fine for most games at 720p, especially if you're coming from the previous generation of consoles and are therefore used to 30fps. Plus, Mantle will benefit these types of machines greatly, given that they have a semi-decent iGPU but a fairly average CPU.
Originally Posted by CynicalUnicorn
It's K10-based too like Athlon, Turion, and Phenom IIs. This is similar to K8.
...So is K10? K10 is literally just an updated K8, as is this. Most of its upgrades were about making it a true quad core (hence why the first Phenom was disappointing), and then K10.5 (Phenom II) was mainly about letting it clock past 3.2GHz or so.
Originally Posted by Themisseble
The biggest problem is that a quad-core Jaguar is better than a Pentium in games that use 4 cores. Go try BF4 in DX11: OC your i5 to 4.0GHz and disable 2 cores, and it sucks (the minimum fps sucks).
4 cores give you much better stability even if the CPU is weaker... that's why they use an 8-core Jaguar: more cores plus a great low-level API means the best P/W and smooth gameplay.
I mean, with Mantle a 6-core Jaguar should be able to run BF4 on ultra.
This. A lot of people still think dual cores are fine for gaming, and they are in a lot of games, but the writing is on the wall for them. I remember playing BF3 on a Core 2 Duo E6700 at stock; even a Phenom II X3 720 at stock (very similar per-core speed to the C2D, just with an additional core) was ridiculously smoother, to the point where even the later upgrade from a GTX 470 to an HD 7950 on a much faster CPU felt like less of an overall improvement.