

Registered · 3,607 Posts · Discussion Starter #1
First of all, sorry for another low-GPU usage thread, but the recent ones don't seem to help much, and my question/situation is a bit different, so I figured I would take a shot at a new thread. Any help would be much appreciated, thanks in advance.

My 680 SLI usage does not like staying up where it should. In many games, even though the usual suspects are not present (CPU cores are not maxing out, RAM isn't maxed, VRAM isn't, etc.), my framerate will drop below the framerate limit, which is usually set at 120, yet usage will not climb to keep the framerate at 120. If anything it goes down for some reason, and that seems to be why the framerate drops below 120, not a maxed-out core or anything else.

Although this is all too common for me, the example I'll use now, just because it's what I'm currently playing, is FEAR 3. The AA reference in the thread's title is because a steady 120fps happens much more often with FXAA enabled than with 4x AA enabled. And no, it's not simply that 4x AA is more demanding than FXAA and that's what brings down the framerate: GPU usage is between 50-70% in both scenarios, yet there's a drastic difference in the framerate, as you can see in the pics below. Even though usage is nowhere near 99%, 4x AA has a much harder time staying at 120fps.

FXAA enabled:
[screenshot]

4x AA enabled:
[screenshot]

Why does AA steal my fps? Memory isn't maxed either. For BF3, I've heard this low usage is common in multiplayer, the network bandwidth being the bottleneck, which makes sense. But this is singleplayer in FEAR 3. Is it because it's a Steam-activated game that the network is bottlenecking this game too? A good amount of my games are either downloaded through Steam or activated through it. Would the network affect these games the way people say BF3 multiplayer is affected?

Just frustrating, since these new 680s, though great, are not giving me my full money's worth. It's not specifically their fault, but I just feel sort of cheated since I don't see 99% usage often. Again, any help/info is greatly appreciated.
 

Registered · 3,607 Posts · Discussion Starter #3
I always thought it was mostly the VRAM that was affected. But no, not really much of a difference on the CPU; no cores maxing out.
 

Banned · 3,190 Posts
Unhook the internet. Does the game still play? No? Then your internet is still your bottleneck. Find a demanding game that works without internet and retest.
 

Registered · 3,607 Posts · Discussion Starter #5
I'll test as many games as I can find on my PC that don't require the internet. So every Steam-related game will be like this? Holding back my 680s?
 

Registered · 2,439 Posts
AA or any other GPU effect will increase usage/decrease FPS.

Have you checked if GPU usage goes up when adding AA?

The only time VRAM will limit your system is if it goes over what you have. AKA, as long as it's under 2GB of VRAM, you're never going to tell the difference.

Also, do you have vsync on? Looks like your FPS is capping.

If your GPU can handle 180 FPS and you're limiting it to 120 (which is what it looks like), obviously you're never going to hit 100% usage. It's like having a Ferrari and only driving it locally.

FXAA is easier on the system, so again it makes sense that 4x AA will make your FPS lower.

Good luck. You should really read up more on graphics options and how GPUs work.
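
To put rough numbers on the Ferrari point: with a frame cap in place, reported GPU usage tends to settle near the ratio of the cap to whatever the card could manage uncapped. A minimal sketch of that arithmetic, treating the 180 FPS figure purely as an assumed example:

Code:
# Rough illustration only: if a card could render ~180 fps uncapped,
# a 120 fps cap leaves it idle for part of every frame interval,
# so reported usage settles somewhere near cap / uncapped.
def expected_usage(cap_fps, uncapped_fps):
    """Approximate GPU usage (%) while a frame limiter holds the cap."""
    return min(100.0, 100.0 * cap_fps / uncapped_fps)

print(expected_usage(120, 180))  # ~66.7% -- never 100% while the cap holds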
 

Iconoclast · 30,664 Posts
4x AA is more demanding than FXAA in many titles and low GPU usage means there is a bottleneck somewhere, but it doesn't necessarily reveal where.

The GTX 680 has an abundance of shader power and FXAA is shader based. Standard MSAA is limited by your raw fill-rate and memory bandwidth, both of which are much less impressive on the 680. In general, shader power is increasing exponentially, while fill rate has become pretty stagnant.

FXAA may also be less hungry for PCI-E/SLI bandwidth as there would be less to synchronize between the cards.
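
For a rough sense of why MSAA leans on fill rate and memory bandwidth while FXAA leans on shaders, here is a back-of-envelope sketch, assuming a plain RGBA8 colour target plus a 32-bit depth/stencil buffer at 1080p; real drivers compress these buffers, so treat the figures as a worst case:

Code:
# Back-of-envelope render-target sizes at 1920x1080 (uncompressed worst case).
W, H = 1920, 1080
BYTES_COLOR = 4   # RGBA8
BYTES_DEPTH = 4   # 24-bit depth + 8-bit stencil

def target_mib(samples):
    """Storage for colour + depth targets with the given MSAA sample count."""
    return W * H * samples * (BYTES_COLOR + BYTES_DEPTH) / 2**20

print(f"no MSAA: {target_mib(1):.0f} MiB of render targets")
print(f"4x MSAA: {target_mib(4):.0f} MiB of render targets")
# Every overdrawn or blended pixel touches all four samples, so the bandwidth
# and fill-rate cost scales similarly, whereas FXAA is one full-screen shader
# pass run on the already-resolved image.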
 

Registered · 3,607 Posts · Discussion Starter #8
Quote:
Originally Posted by evilferret

AA or any other GPU effect will increase usage/decrease FPS.
Have you checked if GPU usage goes up when adding AA?
The only time VRAM will limit your system is if it goes over what you have. AKA, as long as it's under 2GB of VRAM, you're never going to tell the difference.
Also, do you have vsync on? Looks like your FPS is capping.
If your GPU can handle 180 FPS and you're limiting it to 120 (which is what it looks like), obviously you're never going to hit 100% usage. It's like having a Ferrari and only driving it locally.
FXAA is easier on the system, so again it makes sense that 4x AA will make your FPS lower.
Good luck. You should really read up more on graphics options and how GPUs work.
Thanks for the suggestions, but I'm already aware of all of those issues. I use Precision's framerate limiter, which isn't v-sync, and I stated in the original post that it's not just that 4x AA is more demanding: in the second picture, you can see that when the framerate drops below 120fps, usage isn't increasing to meet demand and keep the framerate at the 120 cap. Usage just drops as the framerate does.

I appreciate the help, but I'm familiar with all the basics. I wouldn't have v-sync on for an issue like this. As you can see from the low usage in both pics, there should be no problem keeping the framerate at 120; it's just that usage isn't as 'up there' as it should be.
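
The practical difference between a limiter like Precision's and v-sync is roughly this: the limiter just sleeps away the remainder of each 1/120 s slot instead of waiting on the monitor's refresh. A minimal sketch of the idea (made-up names, not Precision's actual implementation):

Code:
import time

TARGET_FPS = 120
FRAME_TIME = 1.0 / TARGET_FPS

def render_frame():
    # Stand-in for the game's real CPU/GPU work for one frame.
    time.sleep(0.003)

deadline = time.perf_counter()
for _ in range(10 * TARGET_FPS):        # run for ~10 seconds
    render_frame()
    deadline += FRAME_TIME
    remaining = deadline - time.perf_counter()
    if remaining > 0:
        time.sleep(remaining)           # hardware sits idle here, which is why usage stays below 100%
    else:
        deadline = time.perf_counter()  # running behind the cap; resync instead of spiralling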
 

Registered · 3,607 Posts · Discussion Starter #9
Quote:
Originally Posted by Blameless

4x AA is more demanding than FXAA in many titles and low GPU usage means there is a bottleneck somewhere, but it doesn't necessarily reveal where.
The GTX 680 has an abundance of shader power and FXAA is shader based. Standard MSAA is limited by your raw fill-rate and memory bandwidth, both of which are much less impressive on the 680. In general, shader power is increasing exponentially, while fill rate has become pretty stagnant.
FXAA may also be less hungry for PCI-E/SLI bandwidth as there would be less to synchronize between the cards.
So the low framerate is not caused by the load on the core going down... the load on the core is going down because the framerate is going down, and the framerate is going down because memory bandwidth is the bottleneck? Less than half the VRAM is in use, though, so could memory still be the issue if only half of it is being used?
 

Registered · 2,439 Posts
Quote:
Originally Posted by dph314

Thanks for the suggestions, but I'm already aware of all of those issues. I use Precision's framerate limiter, which isn't v-sync, and I stated in the original post that it's not just that 4x AA is more demanding: in the second picture, you can see that when the framerate drops below 120fps, usage isn't increasing to meet demand and keep the framerate at the 120 cap. Usage just drops as the framerate does.
I appreciate the help, but I'm familiar with all the basics. I wouldn't have v-sync on for an issue like this. As you can see from the low usage in both pics, there should be no problem keeping the framerate at 120; it's just that usage isn't as 'up there' as it should be.
A framerate limiter limits your FPS. Just try it off to see if your GPU usage goes up.

Limiting your FPS in any way doesn't really help find any problems with your setup. It's like trying to find an engine problem by driving 5mph around the block.

Again, 4x AA is pretty intensive in certain games. Did you set up Precision to read the right amount of VRAM? I've run into a few people who didn't set it up right and weren't getting the right VRAM readings.

Not sure what you're looking for? It kinda looks right to me: 4x AA has higher GPU usage/lower FPS than FXAA.

The only thing that looks weird is GPU2 dropping off with 4x AA, but you might have alt-tabbed out or something.

Are those graphs from a benchmark? If not, run the same benchmark with both settings and see what it looks like. If it's from in-game, you're comparing apples to oranges.
 

Iconoclast · 30,664 Posts
Quote:
Originally Posted by dph314

So the low framerate is not caused by the load on the core going down... the load on the core is going down because the framerate is going down, and the framerate is going down because memory bandwidth is the bottleneck? Less than half the VRAM is in use, though, so could memory still be the issue if only half of it is being used?
Could be fill rate, could be memory bandwidth, could be PCI-E bandwidth, could be a bizarre shift in driver overhead.

Also, the amount of memory in use doesn't necessarily have anything to do with a memory bandwidth bottleneck. You can easily be close to running out of VRAM in totally shader-limited scenarios, or you can be memory bandwidth limited while only using 30MiB of it.

I would try to find a reproducible situation in FEAR and disable any frame rate limiters to get a better idea of what's really going on. Being able to see CPU utilization at the same time might help as well.

Also, what resolution are you running the game at?
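
One low-effort way to log GPU and CPU utilization side by side while reproducing the same spot, assuming nvidia-smi is on the PATH and the psutil package is installed; a sketch, not the only way to do it:

Code:
import subprocess, time, psutil

QUERY = "utilization.gpu,clocks.sm,memory.used"

for _ in range(60):                       # sample once a second for a minute
    # One CSV line per GPU: usage %, SM clock, VRAM in use.
    gpu = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        text=True).strip().replace("\n", " | ")
    cpu = psutil.cpu_percent(percpu=True)  # per-core load since the last call
    print(time.strftime("%H:%M:%S"), "GPU:", gpu, "CPU:", cpu)
    time.sleep(1)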
 

Registered · 3,607 Posts · Discussion Starter #13
Quote:
Originally Posted by evilferret

A framerate limiter limits your FPS. Just try it off to see if your GPU usage goes up.
Limiting your FPS in any way doesn't really help find any problems with your setup. It's like trying to find an engine problem by driving 5mph around the block.
Again, 4x AA is pretty intensive in certain games. Did you set up Precision to read the right amount of VRAM? I've run into a few people who didn't set it up right and weren't getting the right VRAM readings.
Not sure what you're looking for? It kinda looks right to me: 4x AA has higher GPU usage/lower FPS than FXAA.
The only thing that looks weird is GPU2 dropping off with 4x AA, but you might have alt-tabbed out or something.
Are those graphs from a benchmark? If not, run the same benchmark with both settings and see what it looks like. If it's from in-game, you're comparing apples to oranges.
Yeah, I kept the 120fps limit on because it didn't really seem like it would matter: even though 4x AA is more demanding, usage has way more than enough headroom to go up and put out 120fps. I don't see how usage can be at 70% when the framerate is anything less than 120, let alone all the way down in the 60s. Yeah, it's not a benchmark, but it's from the same spot in the same area, and still, it's clear usage has room to go up at sub-120 framerates, yet it doesn't. The sudden drops are from loading or alt-tabbing out.

Um... can someone tell me what this means? I think I know the basics, but as far as the SLI AA section of the Nvidia Control Panel goes, I never really use it, since the one time I tried it a while ago it ended up being super-demanding. I just tried it now, on a whim, and look at what happened to the framerate...

SLI 8x enabled in NVCP:
[screenshot]

Could someone explain how the SLI AA setting in the NVCP fixed the framerate? Not only are the GPUs now putting out a constant 120fps, they're doing so at a damn low core clock. All of a sudden usage, even though there's barely any load (the 680s downclocked themselves to 700MHz core clocks because of the light load), is where it needs to be to put out a steady 120fps? How did SLI 8x fix this and also lighten the load?
 

Registered · 3,607 Posts · Discussion Starter #14
Quote:
Originally Posted by Blameless

Could be fill rate, could be memory bandwidth, could be PCI-E bandwidth, could be a bizarre shift in driver overhead.
Also, the amount of memory in use doesn't necessarily have anything to do with a memory bandwidth bottleneck. You can easily be close to running out of VRAM in totally shader-limited scenarios, or you can be memory bandwidth limited while only using 30MiB of it.
I would try to find a reproducible situation in FEAR and disable any frame rate limiters to get a better idea of what's really going on. Being able to see CPU utilization at the same time might help as well.
Also, what resolution are you running the game at?
1080p. Yeah, I think I see what you mean about the memory. The CPU cores remain pretty steady, though, none near a full load.

Quote:
Originally Posted by JTHMfreak

What slots have you got those cards in? On my MSI mobo, if I use 1&3 I don't get anywhere near full usage, but slots 1&2 are a much different story.
Mine just has 2 slots. I do wish it had a third though. Someday.
 

Registered · 2,439 Posts
If you're talking about the SLI AA thingy, then it makes sense.

AA kills our GPUs. Certain games work better when I force one GPU to do all the AA and the other to run the game.

One 680 should handle the game fine, and now you're using the second to concentrate on AA and other effects.

This mode only works for me with older titles. For newer titles I just use normal SLI mode.
 

Registered · 3,607 Posts · Discussion Starter #16
Quote:
Originally Posted by evilferret

If you're talking about the SLI AA thingy, then it makes sense.
AA kills our GPUs. Certain games work better when I force one GPU to do all the AA and the other to run the game.
One 680 should handle the game fine, and now you're using the second to concentrate on AA and other effects.
This mode only works for me with older titles. For newer titles I just use normal SLI mode.
So SLI AA in the NVCP separates the functions of each GPU? If one renders the game and one does the AA, do they still show an equal usage % like in the above pic?

And how come normal AA with the in-game settings, and also normal AA in the NVCP which I tried earlier, cripples the framerate and keeps usage down, yet SLI AA does the complete opposite? It's a newer game, so normal AA wasn't working because of something with SLI? The memory bandwidth thing again? But if SLI AA separates the workloads, then there's no bottleneck between the two cards sharing information, and everything is fine?
 

Banned · 3,190 Posts
Are you hungry? No? Then why would you go to an all-you-can-eat buffet if you're on a diet? Same thing for a GPU. AA used to be useful; nowadays FXAA or AMD's MLAA will do, and they anti-alias transparency, which MSAA doesn't and adaptive AA rarely does. SSAA is a no-no. Just use FXAA (Nvidia) or MLAA (AMD), see if you can notice the difference in a PvP fight against 64 other maniacs, and play the game. Send us the video so we can count how many times you died, lol.
 

Iconoclast · 30,664 Posts
Quote:
Originally Posted by dph314

Could someone explain how the SLI AA setting in the NVCP fixed the framerate? Not only are the GPUs now putting out a constant 120fps, they're doing so at a damn low core clock. All of a sudden usage, even though there's barely any load (the 680s downclocked themselves to 700MHz core clocks because of the light load), is where it needs to be to put out a steady 120fps? How did SLI 8x fix this and also lighten the load?
http://www.nvidia.com/object/slizone_sliAA_howto1.html

"SLI Antialiasing is a new standalone SLI rendering mode that offers up to double the antialiasing performance by splitting the antialiasing workload between the two graphics cards."

This, combined with the performance results you are seeing, implies to me that PCI-E bandwidth, or driver overhead, eaten up by coherency/synchronization was likely your main issue.

If each card works on AA independently of the other, they don't have to send as much data back and forth.
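
A back-of-envelope sketch of why that inter-card traffic can matter, assuming, purely for illustration, that something on the order of a full 1080p colour buffer has to move between the cards each frame (the real amount SLI sends varies by mode and game):

Code:
# Rough numbers only: inter-GPU sync traffic vs. PCIe 2.0 link bandwidth.
FRAME_MB = 1920 * 1080 * 4 / 1e6   # ~8.3 MB for one 1080p RGBA8 buffer
FPS = 120

sync_gbs = FRAME_MB * FPS / 1000   # ~1 GB/s of coherency traffic at 120 fps
pcie2_x8, pcie2_x16 = 4.0, 8.0     # approx. usable GB/s per direction

print(f"~{sync_gbs:.1f} GB/s of sync traffic vs ~{pcie2_x8:.0f} GB/s (x8) "
      f"or ~{pcie2_x16:.0f} GB/s (x16) of PCIe 2.0 bandwidth")
# Not huge on paper, but it competes with textures, geometry and driver
# transfers, so anything that reduces what the cards must share (like SLI AA)
# can remove a stall.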
 

Registered · 3,607 Posts · Discussion Starter #19
Quote:
Originally Posted by drbaltazar

Are you hungry? No? Then why would you go to an all-you-can-eat buffet if you're on a diet? Same thing for a GPU. AA used to be useful; nowadays FXAA or AMD's MLAA will do, and they anti-alias transparency, which MSAA doesn't and adaptive AA rarely does. SSAA is a no-no. Just use FXAA (Nvidia) or MLAA (AMD), see if you can notice the difference in a PvP fight against 64 other maniacs, and play the game. Send us the video so we can count how many times you died, lol.
I'm not sure I completely understand the all-you-can-eat buffet analogy, but I think I know what you mean. I'm just trying to figure out why a simple in-game setting is killing the usage on my $500 cards. I'll use FXAA if I have to, though.

Well, I just turned the in-game 4x AA on after starting the game with SLI 8x enabled in the NVCP, as in the last pic. Now not only did the framerate stay at 120, but usage is also up and staying where it should be. You can see the spot in the graph where the load increased when in-game 4x AA was enabled...

[screenshot]

Also, the core clock now needs to run at full speed. So is this SLI AA thing a common low-GPU-usage fix? I'd never heard of it as a treatment before.
 

Registered · 3,607 Posts · Discussion Starter #20
Quote:
Originally Posted by Blameless

http://www.nvidia.com/object/slizone_sliAA_howto1.html
"SLI Antialiasing is a new standalone SLI rendering mode that offers up to double the antialiasing performance by splitting the antialiasing workload between the two graphics cards."
This, combined with the performance results you are seeing, implies to me that PCI-E bandwidth, or driver overhead, eaten up by coherency/synchronization was likely your main issue.
If each card works on AA independently of the other, they don't have to send as much data back and forth.
Ah, makes sense. So you could say that I'd benefit from a PCIe 3.0 CPU and motherboard?
 