Originally Posted by Brutuz
Until you're trying to point out that an FX-8350 is virtually neck and neck with an i5 3570k in most games, then people start pointing it out like mad.
1680x1050, 1080p, 1440p and a few Eyefinity results are all anyone really needs to test. Maybe 1366x768 too, but that's mainly laptops these days, and if you can't max out a game at that resolution you're running it on a non-gaming rig anyway.
Welcome to comparing gaming performance between CPUs: I doubt that virtually anyone here could tell the difference between Intel and AMD in a blind test.
How? You're not going to notice it in these games at 1080p, 1680x1050, 1440p, etc., and game engines sure as hell aren't going to perform the same, or even similarly, by the time current CPUs become enough of a bottleneck to limit games to below 60fps, bar a few CPU-heavy titles.
Comparing at low resolution for "accurate" CPU performance is dumb; games and their engines evolve fast enough that results from even two years ago are irrelevant to performance today, because you'll either already be well and truly above 60fps or GPU-bottlenecked for the most part. (Look at how well BF3 and Crysis 3 use the CPU compared to Bulletstorm, TDU2, Dead Space 2, DA2, Homefront, etc. This is going to continue; by the time an FX-8350 or an i5 3570k actually bottlenecks the GPU in your average game, I'd wager the FX-8350 will be a heck of a lot closer to the i5 than it is today, much like the E8400 vs the Q6600.)
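To put numbers on the bottleneck point: the frame rate you actually see is roughly the slower of what the CPU can feed and what the GPU can render. Here's a toy sketch of that idea, with every figure invented purely for illustration, not taken from any real benchmark:

```python
# Toy bottleneck model: effective fps is capped by whichever component
# runs out first. All numbers below are made up for illustration only.

# Hypothetical fps each CPU could feed the GPU, roughly resolution-independent.
cpu_fps = {"FX-8350": 110, "i5 3570k": 140}

# Hypothetical fps a given GPU can render at each resolution.
gpu_fps = {"1024x768": 400, "1080p": 95, "1440p": 60}

for res, gpu_limit in gpu_fps.items():
    print(res)
    for cpu, cpu_limit in cpu_fps.items():
        # You see the slower of the two limits.
        print(f"  {cpu}: {min(cpu_limit, gpu_limit)} fps")
```

At 1024x768 the made-up gap (110 vs 140) is plain to see; at 1080p and 1440p both chips print the same number, because the GPU is the limit either way. That's the whole argument in one line of arithmetic.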
I'd say you'd be clueless if you think 1024x768 results have anything at all to do with the real world. What are you learning from that? Nothing. Absolutely nothing. CPU performance mostly doesn't matter in gaming when you're talking about recent CPUs, and by the time the FX-8350 and i5 3570k are old enough to bottleneck GPUs noticeably in most games, game engines will have evolved to the point where today's results are completely irrelevant. This already happened to Athlon64 X2/Pentium D owners, and again to Core 2 Quad owners: their chips looked slower in low-resolution tests because they generally clocked lower, but by the time CPUs started showing their age in games, the single cores (and, in the C2Q comparison, the dual cores) that were faster at launch had become the slower chips, because games started using the CPU more heavily and in a more multi-threaded way. It's just like SuperPi: sure, you can use it to compare performance, just don't expect it to be even slightly relevant to the real world. Sadly, too many people think it is.
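That clocks-vs-cores flip is easy to sketch with Amdahl's law. Again, the clock speeds and "parallel fraction" values below are invented for illustration, not measurements of any real chip:

```python
# Rough Amdahl's-law sketch of why more cores age better as game
# engines go multi-threaded. All numbers are invented for illustration.

def speedup(parallel_fraction, cores):
    # Amdahl's law: the serial part runs on one core, the rest scales.
    return 1.0 / ((1 - parallel_fraction) + parallel_fraction / cores)

# Hypothetical chips: a faster-clocked dual core vs a slower quad core.
chips = {"fast dual core": (3.0, 2), "slower quad core": (2.4, 4)}

for pf in (0.2, 0.5, 0.8):  # how multi-threaded the engine is
    print(f"parallel fraction {pf:.0%}:")
    for name, (clock_ghz, cores) in chips.items():
        # Relative performance = per-core speed * multi-core speedup.
        print(f"  {name}: {clock_ghz * speedup(pf, cores):.2f} (arbitrary units)")
```

With an engine that's only 20% parallel, the dual core wins on clocks; push the parallel fraction to 80% and the quad core pulls ahead. That's the E8400-vs-Q6600 pattern in miniature.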
This is for everyone who goes on about GPU bottlenecking: well, derp, that's true of virtually every game on the market. Why would you buy an i5 3570k over an FX-8350 based on gaming results at 1024x768? Those results are irrelevant; the difference they show makes no difference now, likely won't make much of a difference in the future, and you can see that same gap elsewhere in single-threaded results anyway.
Welcome to comparing CPU performance in gaming: the difference between any post-2008 chips is very small indeed, and I really doubt most people here would notice it in a blind test.
It should be: "Just pick any random socketed CPU; try to get more threads if you plan to keep it for more than three or so years, but other than that the differences are really minor."