Originally Posted by mav451
To be fair, how many noobs asking for advice are pairing their build with a 1440P (or higher) res monitor?
I'd wager they are on a low resolution (1080P or 1440x900).
It's the same at 1080p. I didn't really notice a performance improvement going from an FX-4170 to my i5, and I game at 1080p.
As for 1440x900 and below... well, you're not likely to see a difference in performance unless you've somehow got a refresh rate of like 240Hz, as FPS will already be plenty high even on a $50 CPU.
Originally Posted by Michalius
You're right, and that's definitely the upshot of this article.
However, with caveats like this:
Followed by this gross exaggeration:
Anyone who has ever done benching knows you should run each test at least 5 times, and then either average the findings or use *all* of the data.
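For illustration, here's a quick Python sketch (with entirely made-up run numbers) of what averaging repeated runs buys you over a single pass:

```python
# Minimal sketch, hypothetical numbers: averaging repeated benchmark
# runs instead of trusting a single pass.
from statistics import mean, stdev

# Five runs of the same benchmark scene (avg FPS per run, invented)
runs = [62.1, 58.4, 61.7, 59.9, 60.3]

avg = mean(runs)
spread = stdev(runs)
print(f"mean: {avg:.1f} fps, stdev: {spread:.1f}")
# A single run can land anywhere in that spread, which is why
# one-pass numbers are shaky ground for conclusions.
```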
For lack of proper performance measurement (as in, frame times), we are presented with this:
You say FCAT is too hard, and then discount Fraps, which has been proven to be plenty accurate at reporting frame times on single cards. Even when it's off, it's still far more accurate than any FPS measurement, which is simply an average of frame times over a second.
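That distinction matters because a per-second average flattens spikes. A minimal Python sketch (all frame times invented) of how a nasty hitch can hide behind a healthy-looking FPS number:

```python
# Sketch with hypothetical frame times, in ms: average FPS is just
# frames / total seconds, so one huge hitch barely moves the number.
smooth = [16.7] * 60              # a steady ~60fps second
stutter = [10.0] * 59 + [410.0]   # same second with one 410 ms hitch

def avg_fps(frame_times_ms):
    # frames divided by total elapsed time, i.e. what an FPS counter reports
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

print(avg_fps(smooth))   # ~60 fps
print(avg_fps(stutter))  # also ~60 fps, despite the visible hitch
```

Both traces report roughly the same FPS, but only the frame-time data reveals the stutter.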
So we have the writer admitting that this article is missing the most accurate measurements of performance. But then we get conclusions like this:
That can literally be translated as, "do not mind performing horribly in any game that uses two or fewer threads, and even in BF3 multiplayer".
How can this sort of conclusion be drawn when the most important gaming data is completely missing?
Here's how the recommended 5600K performs with a single GPU in a typically CPU-demanding game.
This is my point. You can't base conclusions on poor testing. I get the benefits of this article, especially comparing such a wide range of CPUs. But to acknowledge that the test is severely lacking due to time constraints, exaggerate the difficulty of doing multiple runs to gather accurate data, and then finish the article with a conclusion on which processor to buy for a single-GPU configuration that performs like absolute garbage (trust me, I know; I have a 5600K in my HTPC) is borderline offensive to me.
This should be relegated to a community member's amateur benchmarks offering some interesting data. That's precisely what it is. It's not an article worthy of actually basing consumer advice on. Doing so is disingenuous to the reader, and at odds with the whole point of journalism and consumer advice.
Anand, how did you guys okay this?
FCAT shows the differences, but it also grossly exaggerates them... I play the hell out of Skyrim; I did on my FX-4170 and I do on my i5. There's a slight difference between a 4.2GHz FX and a 4.5GHz i5 in Whiterun and other busy towns, and in extremely large battles (i.e. me playing with the console and spawning dragons everywhere), but for easily 95% of the time I'm playing? No difference.
The fact of the matter is that virtually every game is GPU limited all of the time, some are CPU limited at times (e.g. Skyrim, and Civ V, as I bet those low FPS numbers come from when it's calculating turns; when you're doing more than waiting for it to finish, it's a smooth 60fps all around in DX11 mode on my rig, even when I had the FX), and a handful are CPU limited most of the time (e.g. Sins of a Solar Empire, SC2).
Also, how does the 5600K perform like garbage? It's faster than a lot of CPUs that most people would consider perfectly fine for gaming. Maybe it's a bit slower than Intel (~55% slower if you're comparing at equal clocks using one, and precisely one, thread; less so as you add more threads and compare at stock clocks, etc.), but it's certainly not garbage.
Originally Posted by Artikbot
Well, at 1440p with an HD7970, there simply is no difference between Intel and AMD, so... yeah.
Even with two HD7970s the difference seems pretty small, and I can vouch for 1080p being enough to cause a GPU bottleneck quite often, although not as much as 1440p.
Originally Posted by ZealotKi11er
The CPU is still important, and we need faster CPUs for games. The thing about CPUs is that you can't test them using GPU-limited games. Grab 2 x Titan OC @ 1080p with no AA and test CPUs. I know that in some games the CPU matters a lot; in BF3 MP, for example. Yeah, it might be 80fps vs 100fps, but one CPU is still faster than the other. Also, for those who want 120Hz, the CPU needs to be top of the line to get close to 120fps.
If you're buying a 120Hz monitor I have no idea why you're even considering AMD.
As for the rest... I noticed a significant performance increase in BF3 MP going from a stock Core 2 Duo E6700 to a Phenom II X3 720 @ 3.2GHz, not so much from the Phenom II to an FX-4170 clocked at 4.2GHz, and another tiny boost going to my i5 when I did a quick test not too long ago. The CPU matters to a point, but honestly, as long as your CPU launched in or after 2010 and is a quad core or a dual core with HT, you're fine on CPU upgrades for now.
And it doesn't matter when it's 80fps vs 100fps. Plus, the FX is faster elsewhere and starting to show that speed in games now... Crysis 3 is slightly (still unnoticeably) faster on an FX-83*0 than on the i5, and faster still (by 20fps, definitely noticeable) on an i7, for example.
Originally Posted by Michalius
You basically have to clock your processor at 4.5GHz and have something with really great single-threaded performance to maintain low frame times in just about any UE3 game.
I still wonder just how noticeable the frame times are, though, since I can't tell the difference between my i5 overclocked and at stock in any game, even the CPU-limited ones; they still end up lagging at about the same moments, probably with higher FPS, but it's still a stuttery mess.
That includes UE3 games, which have all been GPU limited in my experience.
Originally Posted by Dimaggio1103
So I wonder how all these people with FX CPUs aren't killing themselves with frustration, since their CPUs can't play any games. I mean, from what I'm gathering from all these debates, the FX CPU is so weak that when you boot up a game it just craps itself, right?
This is why I think frame-time testing should be done alongside FPS testing, because the fact of the matter is that some people don't notice it... Look at the CFX FCAT results: if you believe a lot of them, CFX is worse than a single card despite giving higher FPS, yet plenty of people never noticed.
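For what it's worth, the extra thing FCAT-style analysis gives you over a plain FPS counter can be sketched in a few lines of Python (all traces invented): a setup can post a higher average FPS and still have much worse worst-case frame times.

```python
# Sketch, made-up frame-time traces in ms: compare average FPS against
# a 99th-percentile frame time, which captures the stutter FPS hides.
def percentile(values, pct):
    # simple nearest-rank percentile over a sorted copy
    ordered = sorted(values)
    idx = min(len(ordered) - 1, int(round(pct / 100.0 * (len(ordered) - 1))))
    return ordered[idx]

# "CFX-like" trace: more frames on average, but frequent big spikes
cfx = [8.0, 9.0, 8.0, 45.0] * 25        # 100 frames
single = [18.0, 19.0, 20.0, 19.0] * 25  # 100 frames, steady

for name, trace in (("cfx", cfx), ("single", single)):
    fps = 1000.0 * len(trace) / sum(trace)
    p99 = percentile(trace, 99)
    print(f"{name}: {fps:.0f} avg fps, 99th-pct frame time {p99:.0f} ms")
```

Here the spiky trace "wins" on average FPS while its 99th-percentile frame time is more than twice as bad, which is exactly the pattern where some people notice and others don't.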