But it does push the GPU to 100% utilization even with the internal framerate "locked" to 60FPS. (Although my screenshot shows 99%, there were instances during that scene that hit 100%).
So this particular example doesn't support your argument that a game with its own framerate limiter isn't going to max out the GPU.
You are wrong about this. It is literally impossible for a GPU to be locked to 60 FPS and also constantly maxed out. Check again.
Now, in a situation where you get 400 FPS at 100% utilization and cap it at 240 FPS, of course that's going to drop the utilization massively, but suggesting that a frame-capped game can never max out the GPU isn't correct. It depends on various factors.
Any time a game is actually hitting its frame cap, GPU utilization will be below 100%, regardless of what the cap is.
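To put rough numbers on this (a minimal sketch; the 4 ms render time is just an assumed figure, not from anyone's benchmark):

```python
# Back-of-the-envelope GPU utilization under a frame cap.
# render_ms is a hypothetical per-frame render time; real numbers vary per scene.
def capped_utilization(render_ms: float, cap_fps: float) -> float:
    frame_budget_ms = 1000.0 / cap_fps  # interval the cap allots per frame
    return min(render_ms / frame_budget_ms, 1.0)  # can't exceed 100%

# A GPU that renders a frame in 4 ms, capped at 240 FPS (~4.17 ms budget):
print(f"{capped_utilization(4.0, 240):.0%}")  # ~96% -- nearly maxed out
# The same GPU capped at 120 FPS (~8.33 ms budget):
print(f"{capped_utilization(4.0, 120):.0%}")  # ~48% -- lots of idle time
```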
Of course, because Valorant and (I assume) Minecraft are more CPU-demanding games. And yes, I agree that the CPU becomes the bottleneck in that scenario.
You previously stated your GPU should always be running as close to 100% as possible, except with Vsync or another FPS limiter. My point with Valorant was that's never going to happen.
No, I said any time a modern and demanding game is being run without a frame cap, it will max out the GPU, even a 4090. Again, Valorant is not a demanding game. Bringing up Valorant does not serve your argument at all; it's totally irrelevant here.
But the reason I suggested the OP lower the GPU utilization was so that, if you are running a game at 80%, you still have enough headroom that when something happens on screen that pushes the utilization higher, the framerate is maintained.
OK, now I'm starting to see where you're getting confused. Sure, if you cap the FPS at a number that puts your GPU at around 80% average utilization, you will have some headroom to even out dips, but you will also have a lower frame rate and thus higher latency.
The OP's FPS dips from about 145 to 135, so yeah, if he capped the FPS at 135 he would no longer have "dips," because instead he would just have lower overall performance.
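To put that latency cost in numbers (assuming, as the simplest model, that latency scales with frame time):

```python
# Frame time in milliseconds for a given frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# OP's case: capping 145 FPS down to 135 FPS trades occasional dips
# for a permanently longer frame time on every frame.
print(f"{frame_time_ms(145):.2f} ms")  # ~6.90 ms per frame
print(f"{frame_time_ms(135):.2f} ms")  # ~7.41 ms per frame
```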
The higher the frame rate, the lower the latency. I play competitive games with vsync and fps caps off so that I can get the best/lowest possible latency, as is the correct way to do things. The goal is to make latency as low as possible, and that means making the frame rate as high as possible. Avatar isn't a competitive game, but it's clear the OP is trying to play at a high frame rate with low latency anyway simply because it controls better.
Next, anti-lag and Nvidia Reflex are very nuanced, game-specific features built into the games that support them (at least for Nvidia). They're not relevant to this discussion. To talk about how those features impact frame consistency, we would need a deep discussion with many game- and hardware-specific variables, all ideally in relation to VRR displays. There are some very specific situations where a frame cap can improve the user experience in high-frame-rate competitive gaming, but that has to do with how the GPU and the monitor handle the frame buffer and the display of new frames. For example, lowering the frame cap to just below the G-sync/Freesync/VRR upper limit (say, 1 Hz / 1 FPS below) may reduce tearing and smooth out the timing between the frame buffer and the screen, but it won't improve latency. Running the game at a frame rate above the screen's refresh rate and just allowing tearing to happen will reduce latency more than anything else.
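As a sketch of that "just below the VRR ceiling" idea (the 1 FPS margin is just the figure from the example above; real tools pick their own margins):

```python
# Pick a frame cap just under the display's VRR upper limit so new frames
# stay inside the variable-refresh window instead of spilling past it.
def vrr_frame_cap(max_refresh_hz: int, margin_fps: int = 1) -> int:
    # margin_fps=1 mirrors the "1 FPS below" example above; utilities
    # like RTSS or Reflex choose their own (often larger) margins.
    return max_refresh_hz - margin_fps

print(vrr_frame_cap(144))  # 143 FPS cap for a 144 Hz G-sync/Freesync panel
```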
More than likely, what Andrew said is correct. The Avatar game dips slightly because it's loading chunks of the map. It has to do with how the game loads information, and nothing to do with the GPU or any user settings at all. The OP just wanted to make sure it wasn't an issue with his setup, and it isn't.
When your FPS is well above 60 and you have a VRR display, with a demanding game like Avatar you would always want to be maxing out the GPU, thus getting the highest frame rate and lowest latency possible. There is no downside to a variable frame rate with a VRR display when running a game like Avatar on a 4090. The issue with variable frame rates is that they cause frame tearing, and VRR fixes that.