Originally Posted by LaBestiaHumana
Love how the graph shows that 4960x at 2ghz, but not at 4.9. Lmao
Can't wait for Bencher to correct me on that.
Because all 4960x will do 4.9 on air. lol.
Seriously, WE HAVE TO BE RATIONAL. 4.6 GHz is the more realistic clock on water, and even though some of us have the means and money to water-cool a whole rig, that doesn't mean every bencher can afford to.
EDIT: You're still as ridiculous as ever.
Originally Posted by Forceman
My concern is that you are claiming that Mantle eliminates a CPU bottleneck, and yet the performance is the same at both 2 and 3 GHz. Which would imply that there is no bottleneck. Otherwise increasing frequency should also improve performance. Just because performance is higher, doesn't necessarily mean it was because the CPU was bottlenecking. There are other optimizations that can be made to improve performance, as we see with nearly every driver update.
Edit: Actually, as I look at that graph again, I really don't get what they are trying to show. 4C/4T @ 2 GHz is the same as 2C/4T @ 3 GHz? And then they also show 6C/12T @ 4.2? All of which have roughly the same performance. Makes no sense. Doubling the frequency, adding 50% more cores and 200% more threads, and you get a measly 5% performance increase? Doesn't look like a CPU bottleneck to me.
You didn't get the point. This shows that those gaming on weak i3s or i5s can get the full potential out of their GPUs. Also, it doesn't really work that way: a 2x higher clock ≠ 2x the performance. What that graph really shows is how inefficient DirectX is.
AND YES, a 7870/R9 270X IS bottlenecked by a 2500K. And it's true that BF4 burns so much CPU horsepower on every single bullet that CPU usage can spike high enough for FPS to dip under DX. That's how it is.
Originally Posted by felon
You don't have an R9 GPU... they barely got the R9s under control. You are on the old generation, and who knows what issues you'll have at this point. Unless you are on an R7 or R9, I wouldn't even bother testing the beta drivers.
Oh, so if you bought an R9 280X things will be fine? IT'S THE SAME DAMN CHIP AS A 7970, JUST AS THE R7 260X IS A 7790.
Originally Posted by Forceman
Running a 4960X at 3 GHz with only 2C/4T is basically making it an i3 (with the additional cache, of course). And it performs within 5% of the 6C/12T/4.2GHz version. But I think you are missing my main point, which is that just because performance is better on Mantle doesn't mean it is because Mantle is eliminating CPU bottlenecks. It could very well just be demonstrating better GPU optimizations, just like driver updates do. In some cases Mantle absolutely is eliminating a CPU bottleneck (large pop servers with APUs, for example), but your graph isn't showing that situation.
If doubling the CPU performance (roughly, 50% higher clock speed and 3 times the number of threads) only gives you a 5% performance gain, that's a pretty wide neck.
If you're just standing in a field looking up at the sky, CPU usage won't even move much. But it really changes in a 32- or 64-player game while you're actually playing: an i3 can turn into a slow, stuttering mess, while a stock i7 probably chugs along fine slightly below 60 FPS.
I didn't get much benefit because I already have a 3.6 GHz, 12-thread CPU anyway.
Originally Posted by revro
why dont we see in mainstream the numbers for 1440p/1600p?
I had to hunt like an idiot for Mantle 1440p numbers ... found only this one; here is the link
[system specs redacted]GHz @ 1440p, and the numbers tell a story that explains why we only get 1080p results and no 1440p/1600p/3x1080p.
Not everybody wants to buy a $600 CPU and a $300+ motherboard that are a year behind mainstream parts costing half as much.
Originally Posted by Clocknut
Thief won't show much; what we need is an RTS to show these advantages, a game actually designed with Mantle plus the console APIs as the primary API. BF4 doesn't seem to be the one; in fact, even the older BF3 single-player under DirectX shows very little dependence on the CPU.
I tried Star Swarm with an R9 280X on a 2500K and it showed a rather large difference.
Originally Posted by bencher
I just ran Star Swarm for the first time...
In DX I was getting 17 FPS during a firefight. With Mantle I get 62 FPS.
Well there you go.
Originally Posted by mboner1
If Nvidia released something that increased performance by 7% for $200, you guys would be jumping out of your tree to get it; AMD does it for free and it's not good enough. And that's the lower end of the performance gains. We all know Mantle is better than DX11, so why are you guys making yourselves look silly defending something that's worse??? It's not AMD vs Nvidia, it's DirectX vs Mantle... and you know Mantle is better. And if you don't, you're in denial.
Exactly. Nvidia fanboys. What we want on OCN is to get the most we can out of our hardware. Mantle is here to do that, and everyone is saying "MEH, IT'S NOTHING GOOD." It is something good, it really is. Better than something like G-Sync that you have to pay extra for ...
And yes, I checked a few weeks ago: a 290 is still only $50 more than a 780, and at least it's 10% faster clock for clock.
Originally Posted by Durquavian
Actually, now that I think about it, I wonder if the draw calls increased in BF4. That was the big thing with Mantle. FPS isn't telling the whole story at all, cool as it is. I haven't seen any real mention of which parts of Mantle were implemented. You could be getting 30 FPS more, and along with that 30 FPS you could also be getting some percentage more objects drawn to screen.
It's not so dramatic here because FPS games never really used that much CPU power, but BF4 leverages Mantle because ... well, the consoles have pretty weak CPUs, so they developed for the consoles using Mantle-style low-level APIs. On the desktop it's not as noticeable, but how many big games are primarily developed for the desktop anyway?
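Durquavian's draw-call point is the crux of the Mantle argument: in a thick API like DirectX 11, every draw call carries a fixed CPU-side cost (validation, driver work), so a scene with enough draw calls makes the CPU the bottleneck before the GPU is even breaking a sweat. A back-of-envelope sketch of that effect (the per-call overheads and the draw-call count below are illustrative assumptions, not measured figures for either API):

```python
# Illustrative only: the per-draw-call overheads are made-up ballpark numbers,
# NOT measured values for DirectX 11 or Mantle.

def cpu_frame_ms(draw_calls: int, overhead_us_per_call: float) -> float:
    """CPU time per frame spent just submitting draw calls, in milliseconds."""
    return draw_calls * overhead_us_per_call / 1000.0

draw_calls = 10_000                            # assumed busy BF4-style scene
thick_api_ms = cpu_frame_ms(draw_calls, 4.0)   # assume ~4 us/call of driver overhead
thin_api_ms  = cpu_frame_ms(draw_calls, 0.5)   # assume ~0.5 us/call with a thin API

# A 60 FPS frame gives the CPU a ~16.7 ms budget. Under these assumptions the
# thick-API path blows the budget on submission alone, while the thin-API path
# leaves plenty of headroom for game logic.
print(f"thick API: {thick_api_ms:.1f} ms of CPU submission per frame")
print(f"thin API:  {thin_api_ms:.1f} ms of CPU submission per frame")
```

That's also why a faster CPU "hides" the problem: it shrinks the per-call cost, which is exactly what the 6C/12T results in the earlier graph suggest.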
Originally Posted by felon
A higher-end CPU will always provide more FPS. That's not a bottleneck... last time I checked, games rely heavily on the CPU.
It is a bottleneck if a higher-end CPU provides more FPS. But with a single GPU there comes a point where buying a better CPU gets you nothing extra in return. Also, are you suggesting a 4960X definitely gives better FPS than a 4930K? That's what your logic implies.