Overclock.net

Battlefield 1 CPU performance at 144Hz - CPU bottleneck

27K views 176 replies 50 participants last post by Mini0510
#1 ·
Hey Guys,

Google has failed me, so I'm hoping someone here can help.

I have a fairly beefy system running a GTX 1080, a Core i5-6500 (non-K), and 16GB of RAM.

When playing BF1 at 1440p ultra settings, I get around 70 fps with considerable dips during chaotic firefights. As I have a 144Hz monitor, I really want to get more out of it, but I found that even dropping down to 1080p high barely sees an improvement, with an increase to only 80 or so frames per second.

Smells a lot like a CPU bottleneck.

This build was the first time in my life I didn't go with a K series CPU. Kicking myself a little, and trying to figure out if this really is a CPU bottleneck.

Has anyone come across any specific high frame rate testing for BF1 that compares different CPUs and clock speeds?

I'm considering picking up a 6600K so I can get 4.2GHz+, which I'm hoping should get me a decent improvement in my average and maximum frame rates.

I will at some point get a second 1080, but wasn't planning to do that until early next year at best.

TL;DR: will my 1080 see gains from an increased CPU clock, allowing for frame rates beyond 60fps?
 
#3 ·
Quote:
Originally Posted by welshmouse View Post

Hey Guys,

Google has failed me, so I'm hoping someone here can help.

I have a fairly beefy system running a GTX 1080, a Core i5-6500 (non-K), and 16GB of RAM.

When playing BF1 at 1440p ultra settings, I get around 70 fps with considerable dips during chaotic firefights. As I have a 144Hz monitor, I really want to get more out of it, but I found that even dropping down to 1080p high barely sees an improvement, with an increase to only 80 or so frames per second.

Smells a lot like a CPU bottleneck.

This build was the first time in my life I didn't go with a K series CPU. Kicking myself a little, and trying to figure out if this really is a CPU bottleneck.

Has anyone come across any specific high frame rate testing for BF1 that compares different CPUs and clock speeds?

I'm considering picking up a 6600K so I can get 4.2GHz+, which I'm hoping should get me a decent improvement in my average and maximum frame rates.

I will at some point get a second 1080, but wasn't planning to do that until early next year at best.

TL;DR: will my 1080 see gains from an increased CPU clock, allowing for frame rates beyond 60fps?
I guess ultra 1440p is why you get 70 fps; the game is somewhat optimised to take advantage of both the CPU and the GPU, but they didn't target 1440p ultra at 144Hz. You should try lowering a few settings; medium looks almost as good as ultra.

For comparison, I get 110-140 fps on medium (mixed with high) at 1080p with an RX 480.
 
#4 ·
OP, have you considered looking at the Resource Monitor CPU tab during gaming? Play 10-15 minutes; when you notice a "dip", Alt+Tab and check how much of the CPU is loaded. Take a screenshot right away and post it here.

Most likely the problem is the GPU - you are GPU bottlenecked.
You need SLI for 1440p/144fps.
Hell, even in older games like Crysis 3, the 980 Ti had problems with 1080p/120fps.
 
#5 ·
Eterz and Ku4eto, I mentioned specifically that reducing settings and resolution barely improves the framerate; hence my conclusion that the CPU is the bottleneck.

I plan to get a second GPU, but not yet. A CPU upgrade could be a low-cost way to improve my framerate if the current one just can't handle going above 60fps.
 
#6 ·
Quote:
Originally Posted by welshmouse View Post

I plan to get a second GPU, but not yet. A CPU upgrade could be a low-cost way to improve my framerate if the current one just can't handle going above 60fps.
This sounds like a good plan. You could sell your current CPU at a small loss and grab an i5/i7 K, which should easily OC to 4.6GHz (depending on your cooling and mobo, of course). The extra 1GHz would be worth the cost, IMO. Another option is to wait and see what Kaby Lake delivers.
 
#8 ·
Quote:
Originally Posted by welshmouse View Post

Eterz and Ku4eto, I mentioned specifically that reducing settings and resolution barely improves the framerate; hence my conclusion that the CPU is the bottleneck.

I plan to get a second GPU, but not yet. A CPU upgrade could be a low-cost way to improve my framerate if the current one just can't handle going above 60fps.
Yeah, this is a common issue in BF1, especially on Intel/Nvidia configurations. It mostly seems to happen either when there are a lot of simultaneous online players or when the physics engine is in full swing. Reducing visual quality won't make much difference because the physics will still be there.

As an educated guess, I would say that this version of Frostbite is heavily geared towards AMD tech and doesn't use PhysX or anything, so all of the physics calculations are being thrown at the CPU, and the more demanding sections send it into overdrive. If you look at the BF1 forum and Reddit, most complaints seem to be coming from Nvidia users.
 
#9 ·
Quote:
Originally Posted by welshmouse View Post

Eterz and Ku4eto, I mentioned specifically that reducing settings and resolution barely improves the framerate; hence my conclusion that the CPU is the bottleneck.

I plan to get a second GPU, but not yet. A CPU upgrade could be a low-cost way to improve my framerate if the current one just can't handle going above 60fps.
Just check the CPU usage and provide us with a screenshot.
 
#10 ·
Quote:
Originally Posted by ku4eto View Post

OP, have you considered looking at the Resource Monitor CPU tab during gaming? Play 10-15 minutes; when you notice a "dip", Alt+Tab and check how much of the CPU is loaded. Take a screenshot right away and post it here.

Most likely the problem is the GPU - you are GPU bottlenecked.
You need SLI for 1440p/144fps.
Hell, even in older games like Crysis 3, the 980 Ti had problems with 1080p/120fps.
As he is talking about a "CPU bottleneck", what actually is a CPU bottleneck?

What I think a CPU bottleneck is: the CPU can't process the amount of data it is getting in the time available, so it only processes as much as it can within the time each frame allows.

So the GPU could put out 120 FPS, but the CPU is not fast enough to prepare that many frames; the CPU is only capable of feeding it 80 FPS max. Since the GPU only receives 80 frames' worth of work per second, it only outputs 80 FPS.

So let's say the GPU load is around 50% to render those 80 FPS. It could render more, but the CPU can't keep up, so the GPU ends up partly idle.

Now what is the load on the CPU?

From my experience with a 6-core i7, when the load on a CPU gets above 60%, you start to see performance degradation in games: less stable frames, more jitter from frame to frame, worse response times.

So what I think is happening is that the CPU has more than 60% load on it and the GPU has around 50-60% load. A higher load on a GPU can be fine; something like 70-80% is totally fine. But that amount of load on a CPU can drag performance down a lot.

What I would like to see is a screenshot of the CPU load graph (Task Manager) and a screenshot of the GPU load graph (Afterburner, EVGA Precision X).

PS: Feel free to correct me on this, because I am just thinking out loud.
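The mental model above can be sketched as a toy calculation (the numbers are invented to match the 80 FPS example, not measured from BF1):

```python
# Toy model of a CPU bottleneck: the delivered frame rate is capped by whichever
# side is slower -- the CPU preparing frames or the GPU rendering them.

def effective_fps(cpu_max_fps, gpu_max_fps):
    fps = min(cpu_max_fps, gpu_max_fps)
    gpu_util = fps / gpu_max_fps  # fraction of GPU capacity actually used
    return fps, gpu_util

# CPU can only prepare 80 frames/s while the GPU could render 160:
fps, util = effective_fps(80, 160)
print(fps, util)  # 80 0.5 -> the GPU sits half idle, the "50% load" case above
```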
 
#11 ·
Quote:
Originally Posted by Nisco3000 View Post

As he is talking about a "CPU bottleneck", what actually is a CPU bottleneck?

What I think a CPU bottleneck is: the CPU can't process the amount of data it is getting in the time available, so it only processes as much as it can within the time each frame allows.

So the GPU could put out 120 FPS, but the CPU is not fast enough to prepare that many frames; the CPU is only capable of feeding it 80 FPS max. Since the GPU only receives 80 frames' worth of work per second, it only outputs 80 FPS.

So let's say the GPU load is around 50% to render those 80 FPS. It could render more, but the CPU can't keep up, so the GPU ends up partly idle.

Now what is the load on the CPU?

From my experience with a 6-core i7, when the load on a CPU gets above 60%, you start to see performance degradation in games: less stable frames, more jitter from frame to frame, worse response times.

So what I think is happening is that the CPU has more than 60% load on it and the GPU has around 50-60% load. A higher load on a GPU can be fine; something like 70-80% is totally fine. But that amount of load on a CPU can drag performance down a lot.

What I would like to see is a screenshot of the CPU load graph (Task Manager) and a screenshot of the GPU load graph (Afterburner, EVGA Precision X).

PS: Feel free to correct me on this, because I am just thinking out loud.
I know that the CPU is being hammered because I have looked at Task Manager myself using windowed mode/Alt+Tab. I'm not sure about the GPU utilisation because I've been using the Steam overlay, but the 100% spikes on the CPU seem like a dead giveaway to me.

It's made additionally difficult by the fact that none of the overlay tools (FRAPS, Afterburner, etc.) work with DX12, so you can only get a frame counter on DX11 unless you resort to using the game's built-in and woefully bad console.
 
#12 ·
Quote:
Originally Posted by bloodr0se View Post

I know that the CPU is being hammered because I have looked at Task Manager myself using windowed mode/Alt+Tab. I'm not sure about the GPU utilisation because I've been using the Steam overlay, but the 100% spikes on the CPU seem like a dead giveaway to me.

It's made additionally difficult by the fact that none of the overlay tools (FRAPS, Afterburner, etc.) work with DX12, so you can only get a frame counter on DX11 unless you resort to using the game's built-in and woefully bad console.
Battlefield 1 should have a built-in frame counter as well. Also, it possibly has a built-in performance monitor. If it doesn't, download GPU-Z. Also, use the Resource Monitor, not the Task Manager, to monitor the CPU load. Please post a screenshot of both (GPU-Z GPU load and Resource Monitor CPU tab).
 
#13 ·
Quote:
Originally Posted by ku4eto View Post

Battlefield 1 should have a built-in frame counter as well. Also, it possibly has a built-in performance monitor. If it doesn't, download GPU-Z. Also, use the Resource Monitor, not the Task Manager, to monitor the CPU load. Please post a screenshot of both (GPU-Z GPU load and Resource Monitor CPU tab).
It does have one, but it's crap. It can only be enabled through console commands and stops being drawn at the next save point/respawn.
 
#16 ·
Quote:
Originally Posted by bloodr0se View Post

They shouldn't do. No game should need an i7 to function, that's ridiculous.
There is a trend with DICE and recent games: they are programmed to get more performance from more cores and threads, so using fewer results in worse performance, because some of the game's threads have work to do and end up waiting when there are fewer cores. In BF4, an i3 or dual core wasn't enough on DX11.

But DX12 should help with the bottlenecks once the game gets working properly on DX12.
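A rough way to see why thread count matters is Amdahl's law (a textbook formula, nothing BF1-specific; the 90% parallel fraction below is an assumption for illustration), which gives the speedup when a fraction p of the per-frame CPU work parallelises across n cores:

```python
# Amdahl's law: speedup from n cores when a fraction p of the work is parallel.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# If 90% of a frame's CPU work parallelises, going from 4 to 8 threads still helps:
print(round(speedup(0.9, 4), 2))  # 3.08
print(round(speedup(0.9, 8), 2))  # 4.71
```

The serial 10% is why doubling the cores doesn't double the frame rate.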
 
#17 ·
Quote:
Originally Posted by bloodr0se View Post

They shouldn't do. No game should need an i7 to function, that's ridiculous.
Right... except it doesn't need that to function. It functions fine with 4 cores.

But you're talking about 1440p at 144Hz here, not just 'functioning'.
 
#19 ·
Quote:
Originally Posted by PontiacGTX View Post

There is a trend with DICE and recent games: they are programmed to get more performance from more cores and threads, so using fewer results in worse performance, because some of the game's threads have work to do and end up waiting when there are fewer cores. In BF4, an i3 or dual core wasn't enough on DX11.

But DX12 should help with the bottlenecks once the game gets working properly on DX12.
That would make sense if this game weren't essentially just an extension of SW: Battlefront. However, it is, and that game had no performance problems at all.
 
#20 ·
Quote:
Originally Posted by eatthermalpaste View Post

Right... except it doesn't need that to function. It functions fine with 4 cores.

But you're talking about 1440p at 144hz here, not just 'functioning'
I run it on a 60Hz monitor and it has the same problem.
 
#22 ·
Quote:
Originally Posted by PontiacGTX View Post

Probably it doesn't use the same code, doesn't have the same graphics engine version, or the drivers don't handle the same tasks in the same way.
Of course it does; they're both Frostbite 3. In fact, some of BF1's multiplayer maps look like they were pasted straight from Star Wars.
 
#23 ·
Quote:
Originally Posted by welshmouse View Post

Hey Guys,

Google has failed me, so I'm hoping someone here can help.

I have a fairly beefy system running a GTX 1080, a Core i5-6500 (non-K), and 16GB of RAM.

When playing BF1 at 1440p ultra settings, I get around 70 fps with considerable dips during chaotic firefights. As I have a 144Hz monitor, I really want to get more out of it, but I found that even dropping down to 1080p high barely sees an improvement, with an increase to only 80 or so frames per second.

Smells a lot like a CPU bottleneck.

This build was the first time in my life I didn't go with a K series CPU. Kicking myself a little, and trying to figure out if this really is a CPU bottleneck.

Has anyone come across any specific high frame rate testing for BF1 that compares different CPUs and clock speeds?

I'm considering picking up a 6600K so I can get 4.2GHz+, which I'm hoping should get me a decent improvement in my average and maximum frame rates.

I will at some point get a second 1080, but wasn't planning to do that until early next year at best.

TL;DR: will my 1080 see gains from an increased CPU clock, allowing for frame rates beyond 60fps?
Could very well be a CPU bottleneck. Look at these benchmarks for 1440p:



They are getting a minimum fps higher than your average. Since they can get much better fps, it does not appear to be an engine or server-side bottleneck. Most likely it is a bottleneck from having fewer processing threads, given that even the 5960X there is only a 3.5GHz Haswell.

What are your RAM speed and timings? Simply having 16GB of RAM does not mean it is any good. Some of the bottleneck, especially the minimum frame drops, could be related to low memory speed.

I would also like to point out that Battlefield 1 has a minimum spec of a recent-generation 4-core processor, with a recommended spec of an 8-thread Intel processor. So that also strongly points to the framerate issue being a simple lack of processing threads on your i5-6500.
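As a back-of-the-envelope illustration of why RAM speed matters (a generic calculation, not a BF1 measurement), theoretical peak bandwidth for dual-channel DDR4 scales linearly with the transfer rate:

```python
# Theoretical peak DDR4 bandwidth: transfers/s x 8 bytes per transfer x channels.
def ddr4_bandwidth_gbs(mt_per_s, channels=2):
    return mt_per_s * 1e6 * 8 * channels / 1e9

print(ddr4_bandwidth_gbs(2133))  # ~34.1 GB/s for dual-channel DDR4-2133
print(ddr4_bandwidth_gbs(3200))  # 51.2 GB/s for dual-channel DDR4-3200
```

Real-world throughput is lower, and timings matter too, but the gap between slow and fast kits is substantial on paper.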
 
#24 ·
If anyone is interested, I did a little playing around with this last night and earlier today and, wow, what a piece of work.

Firstly, one of my friends informed me that he is running the game perfectly at 4K, and his CPU is only hitting 44% at peak. He has a lightly OC'd 6700K and 2x GTX 1080s.

Thinking that the GPU must be responsible for achieving high performance at that ridiculous resolution, I experimented with my first BCLK OC on my own machine to see if I could increase performance. I'm not much of a tweaker (I have neither the time nor the patience), so I followed a tutorial and just tried to replicate what others had done without pushing anything too far.

My final OC settings for the night were 4.29GHz (at a BCLK of 130 and 1.325V) and a 2600MHz DDR4 clock.

I then ran Prime95 for an hour at 1443K and everything looked stable. The package temp peaked at 62°C with an average of 55°C.

This morning I tried BF1 and also enabled Afterburner so that I could monitor the CPU and GPU usage. The GPU usage was high, but I'm putting that down to the fact that I didn't use a frame limiter.

In a 64-player Operations match, the CPU was at an almost constant 100% usage on all cores, and there were some frame rate dips, but nothing like what I had before doing the OC (e.g. very occasional lows of 30 rather than frequent dips going as low as 18-20 in some extreme cases). The high was around 90 fps and the average was around 55-60.

More oddly, in a standard Domination match, the CPU usage was still averaging nearly 100%, but the frame rate dips weren't really present at all: a minimum of 64 fps and an average of around 90 fps. There was one point where the frame rate dropped to the mid-30s for a second or two, but that could have just been an anomaly.

After 1 hour of BF1, HWMonitor reported a peak package temp of 67°C on the CPU and an average of 59°C, so either the system temp was being raised by the GPU, or the game is stressing the CPU at least as hard as, if not harder than, P95.

TL;DR:

This game is an i5's worst nightmare. The minimum recommended CPU for BF1 is a 6600K, but I would be surprised if you could even achieve stable figures on that, tbh. After a good few years of the i5 being the sweet spot for gaming, it looks as though we might be starting to see the i7 range become the go-to for PC gamers, and that's a big shame. Either this game is taking serious advantage of HT technology, or it's in need of some serious further optimisation by EA.

Furthermore, most of the benchmark results out there are bollocks, because the in-game benchmark tool produced excellent results in my case, and performance stability seems to be totally dependent upon which game mode you play. If you stick to the smaller maps and player counts (TDM/Domination), then you are unlikely to ever really see performance issues, and EA sneakily didn't allow access to the larger game modes in the pre-release Origin Access demo.
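For anyone wanting to quantify dips like these instead of eyeballing an overlay, a common approach (a generic sketch; the frame times below are made up, and any capture tool that exports frame times would do) is to compute the average FPS and the "1% low" from logged frame times:

```python
# Average FPS and "1% low" FPS from a list of frame times in milliseconds.
def fps_stats(frame_times_ms):
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    # 1% low: average FPS over the slowest 1% of frames (at least one frame).
    slowest = sorted(frame_times_ms, reverse=True)
    worst = slowest[:max(1, len(slowest) // 100)]
    low_1pct = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_1pct

# Example: mostly ~11 ms frames (~90 fps) with two big stutters.
times = [11.0] * 98 + [30.0, 35.0]
avg, low = fps_stats(times)
print(round(avg, 1), round(low, 1))  # 87.5 28.6
```

The 1% low captures exactly the stutter an average hides, which matches the Operations vs. Domination difference described above.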
 
#25 ·
There's nothing beefy about that CPU of yours. I'm sorry, but for BF1 you need an i7 K series or any hexa-core CPU.
 
#26 ·
Quote:
Originally Posted by bloodr0se View Post

Of course it does; they're both Frostbite 3. In fact, some of BF1's multiplayer maps look like they were pasted straight from Star Wars.
Absolutely.
 