How do you figure it does that?

Your screenshot just shows that your GPU doesn't always maintain 60FPS with vsync on because it's not powerful enough to do so with whatever game that is. When your GPU dips below 60, the GPU usage goes up to 99% because it's being fully utilized as it tries and fails to maintain 60fps. If you turn vsync off, the gpu will just be at full utilization all the time.
Vsync isn't on.

And no, vsync off does not show 100% utilization all the time.
 
Vsync isn't on.

And no, vsync off does not show 100% utilization all the time.
So Nvidia has no idea what they are talking about?



Except if it's being utilized at 100%, then the GPU is the clear bottleneck in your system
No, you have no idea what you're talking about.

The only way a game stays at a solid 60FPS is if the frame rate is capped to 60, either via vsync or some other form of frame cap. Clearly you have a frame cap enabled in the screenshot you provided as there is no other way your frame rate would lock to 60 or lower. If you did not enable this or there is no option for it, then the game you are playing probably caps at 60, which would be very odd but it's possible.

If you turn off vsync and/or whatever other frame cap you have enabled, the GPU usage will be at max all the time assuming you don't have some other bottleneck, but even if you did have a bottleneck it would not lock to 60.


All you have proven to us is that you don't even understand what's going on with your own gaming PC as far as how anything related to this topic works.

When gaming, a GPU is supposed to be the bottleneck. The goal is to have a system powerful enough to max out the GPU, which isn't hard to do at all with any modern demanding game and a decent CPU. GPU running at max is normal expected behavior and the fact that you don't understand this very simple and basic aspect of PC gaming is a big red flag as to your level of comprehension on this topic.

But I'd love to see you try to defend something everyone here is 100% sure you are misunderstanding some more if you want to. Why don't you post some screenshots with the frame cap turned off?
 
The only way a game stays at a solid 60FPS is if the frame rate is capped to 60, either via vsync or some other form of frame cap. Clearly you have a frame cap enabled in the screenshot you provided as there is no other way your frame rate would lock to 60 or lower. If you did not enable this or there is no option for it, then the game you are playing probably caps at 60, which would be very odd but it's possible.
There is no external framerate limiter I'm using. There's nothing odd about it, as it's a fighting game and they all run at 60FPS (except in situations where you can use 3rd party applications to "unlock" the framerate higher, like 120FPS)

If you turn off vsync and/or whatever other frame cap you have enabled, the GPU usage will be at max all the time assuming you don't have some other bottleneck.
If my GPU usage is 30-45% and the CPU is hardly being utilized by a single core what other bottleneck is there? Games like Valorant don't even utilize 100% of a 4090.
 
There is no external framerate limiter I'm using. There's nothing odd about it, as it's a fighting game and they all run at 60FPS (except in situations where you can use 3rd party applications to "unlock" the framerate higher, like 120FPS)



If my GPU usage is 30-45% and the CPU is hardly being utilized by a single core what other bottleneck is there? Games like Valorant don't even utilize 100% of a 4090.
What game is it? If it uses a 60fps cap, performance would be the same as if vsync were enabled, so it changes nothing as far as the point I'm trying to get you to understand.

Valorant is a very simple game. It's no surprise that a 4090 could easily run it without maxing out the GPU utilization.

Try any modern and demanding game without an FPS cap and your GPU will be maxed out all the time, as it should be.

In any case, using a game with a built-in 60FPS cap to try to prove your argument that GPUs are not supposed to be maxed out under normal gaming is just so very wrong. Yes, when a frame cap is enabled a GPU should not be at max unless it's not able to reach the FPS cap. The OP is clearly not using a cap, so his GPU utilization being at or near max is how it is supposed to be.
 
What game is it? If it uses a 60fps cap, performance would be the same as if vsync were enabled, so it changes nothing as far as the point I'm trying to get you to understand.
Street Fighter 6. The game does have an option to enable Vsync if you want, but I have it disabled. And while SF6 does a great job of internally capping it at 60FPS, other fighting games can have poor implementations where you will see 1% lows down to 55FPS, or even your main FPS hovering around 58-59FPS (with hardly any GPU or CPU utilization). All these new modern fighting games are not demanding on my system, with the exception of Tekken 8 at 4K (if I use a graphics mod to increase the rendering), but that's a poorly optimized game.

Valorant is a very simple game. It's no surprise that a 4090 could easily run it without maxing out the GPU utilization.
Weren't you just arguing though that having your GPU run at 100% utilization is a good thing?

Try any modern and demanding game without an FPS cap and your GPU will be maxed out all the time, as it should be.
Depends on the game, settings, resolution, etc. And of course that should be expected depending on how demanding the game is on your GPU. The issue, though, is what comes along with running your GPU as fast as it can: you are going to have framerate dips, lower clock speeds, higher input latency, and so on. Take, for example, being in an empty world where nothing is happening on screen besides the environment moving around, but your GPU is at 100%; then all of a sudden there's an explosion in front of you that taxes the GPU, and you are going to see a huge drop in FPS. That's why 100% utilization is not a good thing if you are trying to maintain a particular FPS.

In any case, using a game with a built-in 60FPS cap to try to prove your argument that GPUs are not supposed to be maxed out under normal gaming is just so very wrong.
The thing is, even with the internal framerate limiter of 60FPS, the utilization was already high (70-80%) during normal gameplay, but as soon as I take a specific action in the game (a super move that transitions into an in-game cutscene) and the utilization hits 100%, the framerate dips under 60FPS. That's what the screenshot is showing.
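To put rough numbers on what that spike does (a purely illustrative back-of-the-envelope sketch; the baseline percentages and the 25% extra load are assumptions, not measurements from SF6):

# Rough illustration of GPU headroom under a 60FPS cap.
# All numbers here are hypothetical; real per-frame costs vary scene to scene.

CAP_FPS = 60
FRAME_BUDGET_MS = 1000 / CAP_FPS  # ~16.7 ms available per frame at 60FPS

def fps_after_spike(baseline_util, spike_factor):
    """Estimate FPS when a heavy moment multiplies the GPU's per-frame work."""
    baseline_render_ms = baseline_util * FRAME_BUDGET_MS   # time the GPU spends rendering each frame
    spiked_render_ms = baseline_render_ms * spike_factor   # same frame with extra load (cutscene, explosion)
    return min(CAP_FPS, round(1000 / spiked_render_ms, 1)) # the cap still sets the upper bound

print(fps_after_spike(0.80, 1.25))   # 60: at ~80% before the spike, the headroom absorbs it
print(fps_after_spike(1.00, 1.25))   # 48.0: already maxed, the same 25% spike dips below the cap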
 
Street Fighter 6. The game does have an option to enable Vsync if you want, but I have it disabled. And while SF6 does a great job of internally capping it at 60FPS, other fighting games can have poor implementations where you will see 1% lows down to 55FPS, or even your main FPS hovering around 58-59FPS (with hardly any GPU or CPU utilization). All these new modern fighting games are not demanding on my system, with the exception of Tekken 8 at 4K (if I use a graphics mod to increase the rendering), but that's a poorly optimized game.



Weren't you just arguing though that having your GPU run at 100% utilization is a good thing?



Depends on the game, settings, resolution, etc. And of course that should be expected depending on how demanding the game is on your GPU. The issue, though, is what comes along with running your GPU as fast as it can: you are going to have framerate dips, lower clock speeds, higher input latency, and so on. Take, for example, being in an empty world where nothing is happening on screen besides the environment moving around, but your GPU is at 100%; then all of a sudden there's an explosion in front of you that taxes the GPU, and you are going to see a huge drop in FPS. That's why 100% utilization is not a good thing if you are trying to maintain a particular FPS.


The thing is, even with the internal framerate limiter of 60FPS, the utilization was already high (70-80%) during normal gameplay, but as soon as I take a specific action in the game (a super move that transitions into an in-game cutscene) and the utilization hits 100%, the framerate dips under 60FPS. That's what the screenshot is showing.
It's like you're trying to miss the point. Rest assured, I and everyone else here know for absolute certain that you are misinformed here. The more you want to deny that and argue about it, the harder it will be for you to understand the reality of the situation.

So now that I know what game you're talking about, yes indeed Street Fighter 6 is locked to 60FPS, which makes it totally irrelevant here. Any game with a frame cap is not going to max out the GPU unless the GPU is too slow to hit the cap, as I have already stated.

Again, Valorant is a very simple game so it is expected that a 4090 would not be fully utilized with it simply because a 4090 is so fast that the CPU becomes the bottleneck, but the game will be running at around 400FPS or so at this point so it's also irrelevant. There are times when a game is just not demanding enough to max out a GPU, but you will never see that happen on a modern demanding game, which Valorant is not. You wouldn't see a 4090 maxed out running minecraft either, but I trust you can wrap your head around that.

Avatar is a modern and demanding game, so it should easily max out any GPU, even a 4090.


The last two paragraphs in your last comment are pure nonsense. I don't even know how to address that. You might as well be saying 2+2 equals 1. Your logic is completely unsound and so disjointed that there is nowhere to start trying to explain where you went wrong. What you are saying would make some sense if we were talking about CPUs, but not with GPUs.

When you do something in your game that causes the FPS to drop below 60, it's because your GPU is not fast enough to maintain 60FPS in that particular instance, and the GPU running at max in that instance is expected to happen when it can no longer hit the frame rate limit. It maxes out because it's running under the frame cap and trying to keep up. Not the other way around, as you seem to think it is.

What GPU do you have?


Think of it this way. If you have a GPU that isn't fast enough to run cp2077 at max settings but you try to do so anyway, what would you expect the GPU utilization to be?

Then, let's say you have a GPU that is more than fast enough to play a game locked to 60FPS. Would the GPU be maxed out if it's more than fast enough to keep up?
 
Someone help me here, can't get through to this guy.
 
If you have a car that is going 60 MPH because that is the speed limit, are you maxing out the engine? No. But if you have a really slow car that isn't able to reach 60 MPH, then yes you would be maxing out the engine by trying to go that fast. (frame cap)

Alternatively, if you have a fast car and no speed limit, and you drive as fast as you can, the engine will be maxed out. (no frame cap)
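If it helps, here are some rough numbers behind the analogy (the render times are made up for illustration, not taken from any particular game or GPU):

# Why a frame cap leaves a fast GPU below 100% utilization (illustrative numbers only).
# render_ms = how long the GPU needs to draw one frame; the cap sets the frame interval.

def capped_behavior(render_ms, cap_fps):
    """Approximate (GPU utilization, actual FPS) when a frame cap is active."""
    frame_interval_ms = 1000 / cap_fps
    if render_ms >= frame_interval_ms:
        # The GPU can't finish a frame inside the cap interval: it runs flat out
        # and the FPS falls below the cap (the slow car).
        return 1.0, round(1000 / render_ms, 1)
    # The GPU finishes early and idles until the next frame is due
    # (the fast car cruising at the speed limit).
    return round(render_ms / frame_interval_ms, 2), cap_fps

print(capped_behavior(8.0, 60))    # (0.48, 60)  -> fast GPU, ~48% usage, holds the cap
print(capped_behavior(20.0, 60))   # (1.0, 50.0) -> slow GPU, maxed out, can't reach 60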
 
Might be map/section loading. Monitor your SSD usage concurrently to see if there are usage spikes that coincide with your dips. More CPU cores might help with SSD data decompression while running the game.
You have no idea what ur talking about.
Just quoting you guys since you commented on this thread. Maybe you can help explain to RobPKG what he's missing.
 
So now that I know what game you're talking about, yes indeed Street Fighter 6 is locked to 60FPS, which makes it totally irrelevant here. Any game with a frame cap is not going to max out the GPU unless the GPU is too slow to hit the cap, as I have already stated.
But it does push the GPU to 100% utilization even with the internal framerate "locked" to 60FPS. (Although my screenshot shows 99%, there were instances during that scene that hit 100%).

So this particular example doesn't support your argument that a game with its own framerate limiter isn't going to max out the GPU.

There are plenty of games where having a framerate cap can still push your GPU to 100% utilization. For example, if I have a 240hz monitor and I cap the framerate at 240FPS, the game I'm playing might still be at 99-100% utilization, and if I try to cap the framerate to say 300FPS then that cap is irrelevant because it's not going to hit the cap when the GPU is at 100% utilization.

Now in a situation where you get 400FPS at 100% utilization but you cap it at 240FPS, of course that's going to drop the utilization massively but to suggest a framecap isn't going to max out the GPU isn't correct. It depends on various factors.

Again, Valorant is a very simple game so it is expected that a 4090 would not be fully utilized with it simply because a 4090 is so fast that the CPU becomes the bottleneck, but the game will be running at around 400FPS or so at this point so it's also irrelevant. There are times when a game is just not demanding enough to max out a GPU, but you will never see that happen on a modern demanding game, which Valorant is not. You wouldn't see a 4090 maxed out running minecraft either, but I trust you can wrap your head around that.
Of course because Valorant and I assume Minecraft are more CPU demanding games. And yes I agree that the CPU becomes the bottleneck in that scenario.

You previously stated your GPU should always be running as close to 100% as possible except when using Vsync or another FPS limiter. My point with Valorant was that's never going to happen.

Now if you had rephrased it to say you want your GPU to be utilized far more than your CPU for gaming, that I agree with and you should avoid CPU bottlenecks by having rendering etc. offloaded to the GPU.

If that is what you meant from the beginning, then there's no disagreement from me. The issue however is hitting 100% GPU utilization.

When you do something in your game that causes the FPS to drop below 60, it's because your GPU is not fast enough to maintain 60FPS in that particular instance, and the GPU running at max in that instance is expected to happen when it can no longer hit the frame rate limit. It maxes out because it's running under the frame cap and trying to keep up. Not the other way around, as you seem to think it is.
I think we are saying the same thing, are we not? If your GPU is 100% utilized and something happens in game that causes the FPS to drop, then it's simply a matter of the GPU not being fast enough.

But the reason I suggested the OP lower the utilization of the GPU was so that, if you are running a game at 80%, you still have enough headroom that when something happens on screen that pushes the GPU utilization higher, the framerate can still be maintained.

I personally have no problem lowering certain graphical settings if it means I can maintain a desired framerate with consistent frametimes. I do not like it when FPS goes under my monitor's refresh rate, which can happen with 100% GPU utilization depending on the situation in game.

What GPU do you have?
RTX4090

Think of it this way. If you have a GPU that isn't fast enough to run cp2077 at max settings but you try to do so anyway, what would you expect the GPU utilization to be?
Near or at 100%

Then, let's say you have a GPU that is more than fast enough to play a game locked to 60FPS. Would the GPU be maxed out if it's more than fast enough to keep up?
Nope the GPU would not be maxed out.

But again, the issue is the drawbacks that come when utilization is at 100%. That's where it is undesirable.
 
If you have a car that is going 60 MPH because that is the speed limit, are you maxing out the engine? No. But if you have a really slow car that isn't able to reach 60 MPH, then yes you would be maxing out the engine by trying to go that fast. (frame cap)

Alternatively, if you have a fast car and no speed limit, and you drive as fast as you can, the engine will be maxed out. (no frame cap)
Yes that is correct. The issue, again, is what happens when the engine is maxed out. That's what this argument revolves around.
 
But it does push the GPU to 100% utilization even with the internal framerate "locked" to 60FPS. (Although my screenshot shows 99%, there were instances during that scene that hit 100%).

So this particular example doesn't support your argument that a game with its own framerate limiter isn't going to max out the GPU.
You are wrong about this. It is literally impossible that a GPU is locked to 60 and also constantly maxed out. Check again.

Now in a situation where you get 400FPS at 100% utilization but you cap it at 240FPS, of course that's going to drop the utilization massively but to suggest a framecap isn't going to max out the GPU isn't correct. It depends on various factors.
Anytime a GPU hits a frame cap in a game, the GPU utilization will be less than 100% regardless of what the frame cap is.

Of course because Valorant and I assume Minecraft are more CPU demanding games. And yes I agree that the CPU becomes the bottleneck in that scenario.

You previously stated your GPU should always be running as close to 100% as possible except when using Vsync or another FPS limiter. My point with Valorant was that's never going to happen.
No, I said any time a modern and demanding game is being run without a frame cap, it will max out the GPU, even a 4090. Again, Valorant is not a demanding game. Bringing up Valorant does not serve your argument at all. It's totally irrelevant here.

But the reason I suggested the OP lower the utilization of the GPU was so that, if you are running a game at 80%, you still have enough headroom that when something happens on screen that pushes the GPU utilization higher, the framerate can still be maintained.
OK now I'm starting to see where you are getting confused. Sure, if you cap the FPS to a number that makes your GPU average at 80%, then you will have some headroom to even out dips, but you will also have a lower frame rate and thus higher latency.

The OP's FPS dips from like 145 to 135, so yea if he capped the FPS to 135 he would no longer have "dips" because instead he would just have lower overall performance.

The higher the frame rate, the lower the latency. I play competitive games with vsync and fps caps off so that I can get the best/lowest possible latency, as is the correct way to do things. The goal is to make latency as low as possible, and that means making the frame rate as high as possible. Avatar isn't a competitive game, but it's clear the OP is trying to play at a high frame rate with low latency anyway simply because it controls better.
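The rough frame-time math behind that (render time is only one slice of the total input-to-photon latency, so treat these as ballpark figures, not measurements):

# Frame time for a given FPS; a lower frame time means less render latency per frame.
for fps in (60, 135, 145, 240):
    print(fps, "FPS ->", round(1000 / fps, 2), "ms per frame")

# 60 FPS  -> 16.67 ms per frame
# 135 FPS -> 7.41 ms per frame
# 145 FPS -> 6.9 ms per frame
# 240 FPS -> 4.17 ms per frame
# Capping 145 down to 135 only adds ~0.5 ms per frame, but running at 145
# instead of locked to 60 saves nearly 10 ms per frame.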

Next, anti-lag/Nvidia Reflex are very nuanced things that are totally game specific and built into the games that support them (at least for Nvidia). It's not relevant to this discussion. To talk about how those things impact frame consistency, we would need to go into a deep discussion with many game and hardware specific variables, and they would all ideally be in relation to VRR displays. There are some very specific situations where a frame cap can improve the user experience for high frame rate competitive gaming, but that has to do with how the GPU and the monitor handle the frame buffer and displaying new frames. For example, there are times where lowering the frame cap to just below (like 1hz / 1fps below) the G-Sync/FreeSync/VRR upper limit may reduce tearing and smooth out the timing between the frame buffer and the screen, but this won't improve latency. Running the game at a frame rate above the screen's refresh rate and just allowing tearing to happen will reduce latency more than anything else.


More than likely, what Andrew said is correct. The Avatar game dips slightly because it's loading chunks of the map. It just has to do with how the game loads information and nothing to do with the GPU or any user settings at all. The OP just wanted to make sure it wasn't an issue with his setup, and it isn't.


When your FPS is well above 60 and you have a VRR display, with a demanding game like Avatar you would always want to be maxing out the GPU and thus also getting the highest frame rate possible, and lowest latency possible. There is no downside to a variable frame rate with a VRR display when running a game like Avatar on a 4090. The issue with variable frame rates is they cause frame tearing, and VRR fixes that.
 
Yes that is correct. The issue, again, is what happens when the engine is maxed out. That's what this argument revolves around.
What? No.

For a game like Valorant, sure the game engine is not designed to run at 400+FPS, as basically no games really are, and virtually no screens could keep up with that anyway. 360hz is pretty much the limit for consumer gaming screens, so anything above that is totally irrelevant. There is no reason to design games to run faster than any screen can display frames. That's not what we are talking about here. The OP's FPS is well within a normal refresh rate range for modern displays to handle.

And you don't need to limit the GPU with a frame cap or anything in this Valorant example you keep bringing up. It just won't max out the GPU because the game won't run faster than several hundred FPS. That has nothing to do with a frame cap or vsync or anything, it just is what it is. It's irrelevant here.
 
You are wrong about this. It is literally impossible that a GPU is locked to 60 and also constantly maxed out. Check again.
I have checked multiple times.

I don't think this is unusual behavior because I'm running the game at 4K, I have every graphics preset option set to Ultra, I have doubled the internal rendering, and so on. These settings put a heavy load on the GPU for this particular game. It's probably compounded by the game not being well optimized either, which happens all the time with fighting games on PC. The only modern game I've played on PC that was extremely well optimized was Doom Eternal.

Anytime a GPU hits a frame cap in a game, the GPU utilization will be less than 100% regardless of what the frame cap is.
Depends if the GPU can be utilized beyond the frame limiter. You can have a GPU that hits 100% utilization at 60FPS when the internal frame cap of the game is 60FPS. It could simply be a weak graphics card that can't keep up with demanding games beyond 60FPS. And let's say you removed that internal frame cap or set it higher: the GPU is still going to be at 100% utilization when it's rendering at 60FPS. The only time you don't see 100% utilization with a frame cap is when the GPU can easily hit that FPS cap.

No, I said any time a modern and demanding game is being run without a frame cap, it will max out the GPU, even a 4090. Again, Valorant is not a demanding game. Bringing up Valorant does not serve your argument at all. It's totally irrelevant here.
Yes, but you suggested that your GPU should be utilized at 100% all of the time regardless, and I'm pointing out there are games that come nowhere close to it. But I think there was a misunderstanding between us, which is why I agreed that under modern demanding games the GPU should have the lion's share of utilization.

OK now I'm starting to see where you are getting confused. Sure, if you cap the FPS to a number that makes your GPU average at 80%, then you will have some headroom to even out dips, but you will also have a lower frame rate and thus higher latency.
Depends on the goals you are trying to achieve. I normally play at uncapped FPS, but if your goal is to have your game run at 144FPS to match your monitor's refresh rate of 144hz, for example, and frame limiting it to 144FPS results in your GPU being utilized at 50%, then what is the issue here?

In fact it is probably the most desirable option, as using a frame limiter like RTSS will give you perfectly consistent frametimes, which means a smoother experience (you probably want to cap to the minimum FPS for even more smoothness).

The tradeoff is you won't get the absolute lowest latency possible.
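A crude way to picture that tradeoff (the uncapped frame times below are invented for the example, not a real capture):

# Compare frame-time consistency: uncapped vs capped at the minimum FPS.
import statistics

uncapped_ms = [5.2, 6.9, 5.6, 7.4, 5.1, 6.3, 7.0, 5.8]   # fluctuating, roughly 135-195 FPS
capped_ms = [1000 / 144] * len(uncapped_ms)               # a steady 144FPS cap, ~6.94 ms every frame

for label, frames in (("uncapped", uncapped_ms), ("capped", capped_ms)):
    print(label, "avg:", round(statistics.mean(frames), 2), "ms,",
          "stdev:", round(statistics.stdev(frames), 2), "ms")

# The capped run gives up a little average latency in exchange for near-zero
# frame-time variance, which is what reads as "smooth" on screen.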

The higher the frame rate, the lower the latency. I play competitive games with vsync and fps caps off so that I can get the best/lowest possible latency, as is the correct way to do things.
In terms of input responsiveness, yes, this is desirable and how I also play. But it's not good if you want consistent frame pacing or to eliminate possible screen tearing.

More than likely, what Andrew said is correct. The Avatar game dips slightly because it's loading chunks of the map. It just has to do with how the game loads information and nothing to do with the GPU or any user settings at all. The OP just wanted to make sure it wasn't an issue with his setup, and it isn't.
Yes, I suspect that is also the case, but it's interesting you point out that the OP wants the lowest latency experience with his game, when reducing the load on the GPU in order to generate more frames would achieve this goal, which was my suggestion to the OP. It's already been demonstrated that latency is impacted when the GPU hits 100% utilization.

When your FPS is well above 60 and you have a VRR display, with a demanding game like Avatar you would always want to be maxing out the GPU and thus also getting the highest frame rate possible, and lowest latency possible. There is no downside to a variable frame rate with a VRR display when running a game like Avatar on a 4090. The issue with variable frame rates is they cause frame tearing, and VRR fixes that.
Not exactly. There is always going to be an input latency penalty with the use of G-Sync/FreeSync. There's no free lunch. The use of strobing also adds latency.
 
Just quoting you guys since you commented on this thread. Maybe you can help explain to RobPKG what he's missing.
Going by his posts, he's a lost cause, mate. You explained it perfectly to him. Have a blessed day.
 
What? No.

For a game like Valorant, sure the game engine is not designed to run at 400+FPS, as basically no games really are, and virtually no screens could keep up with that anyway.
That is not true at all. I am not sure what the actual in-game engine cap is, but I can easily average 700FPS in that game. With the Source engine the limit was 999FPS (you could go beyond it, but weird behavior would occur). I suspect Source 2 is the same way. Rocket League is another game where I am averaging 1100-1200FPS.

And we have screens that refresh at 480hz to 540hz now and higher soon to come.

360hz is pretty much the limit for consumer gaming screens, so anything above that is totally irrelevant. There is no reason to design games to run faster than any screen can display frames. That's not what we are talking about here. The OP's FPS is well within a normal refresh rate range for modern displays to handle.

You seem to be contradicting yourself here. For competitive games, you turn off both vsync and any kind of frame limiter in order to achieve the lowest latency. Having the ability to render more frames than your display can output still provides a level of reduced input latency, which you've already said. There is no reason to cap a game's internal FPS if the engine can handle it, as I already said above with the provided examples. Even if the display does not have a fast enough refresh rate, there is still the subjective feeling of improved responsiveness.

And you don't need to limit the GPU with a frame cap or anything in this Valorant example you keep bringing up. It just won't max out the GPU because the game won't run faster than several hundred FPS. That has nothing to do with a frame cap or vsync or anything, it just is what it is. It's irrelevant here.
You can frame cap Valorant if you want an extremely smooth experience with locked-in frametimes but with higher latency. If you look at your 1% lows or even your 0.1% lows, and let's just say for 0.1% it's 265FPS, if you cap it at 240FPS because you have a 240hz monitor, you will have a perfectly smooth experience with the game, with no fluctuation of frametime or tearing.
 