
SLI GPU Usage Question!

post #1 of 8
Thread Starter 

Question: Is there any reason why my GPU1 fluctuates between 94-99% usage while GPU2 sits at a constant 99%?

This only happens in The Witcher 2; I haven't noticed it in any other game (meaning games that actually need 99% of both GPUs).

I'm not sure whether it's reducing my performance. I'm getting great performance regardless, just curious now (30-60 FPS maxed out with Ubersampling ON; 60-90 FPS maxed out with Ubersampling OFF).

My sig rig is accurate and up to date. My 1100T is running at 3923 MHz CPU / 2808 MHz NB / 1596 MHz RAM.
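
For reference, a minimal sketch (not from anyone's post here, and assuming the pynvml/NVML Python bindings are installed) that polls per-GPU utilization once a second; it reads roughly the same counter Afterburner graphs, so you can line the dips up against your FPS log:

Code:
# Poll per-GPU utilization via NVML (pip install nvidia-ml-py).
# Illustrative only: the sample count and interval are arbitrary.
import time
import pynvml

pynvml.nvmlInit()
try:
    handles = [pynvml.nvmlDeviceGetHandleByIndex(i)
               for i in range(pynvml.nvmlDeviceGetCount())]
    for _ in range(60):  # one sample per second for a minute while the game runs
        usage = [pynvml.nvmlDeviceGetUtilizationRates(h).gpu for h in handles]
        print(" | ".join(f"GPU{i}: {u:3d}%" for i, u in enumerate(usage)))
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()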
    
post #2 of 8
You should never expect games to always push your hardware to 99-100% usage (especially with SLI). If this is only happening in The Witcher 2, consider yourself lucky.
post #3 of 8
Thread Starter 
Me lucky?

You're the one with the 680 =) Wow, those 680s hit 1200+ core clocks?

Also I've found that basically ALL of the many, many games that don't properly utilize SLI (mostly console ports, of course) can be forced to 99% usage very easily.

Manage 3D Settings -- Antialiasing Setting -- and set it to SLI 8x or SLI 16x supersampling.

It might not be giving you the raw rendering power of proper SLI usage, but it WILL force 99% usage on both cards and simultaneously increase FPS (often by significant amounts, i.e. 20-30+) and visual quality.
Edited by LesPaulLover - 4/9/12 at 8:34pm
    
post #4 of 8
Quote:
Originally Posted by LesPaulLover View Post

Me lucky?
You're the one with the 680 =) Wow, those 680s hit 1200+ core clocks?
Also I've found that basically ALL of the many, many games that don't properly utilize SLI (mostly console ports, of course) can be forced to 99% usage very easily.
Manage 3D Settings -- Antialiasing Setting -- and set it to SLI 8x or SLI 16x supersampling.
It might not be giving you the raw rendering power of proper SLI usage, but it WILL force 99% usage on both cards and simultaneously increase FPS (often by significant amounts, i.e. 20-30+) and visual quality.

It's funny ... I've been running SLI for years, and I've never seen one game that couldn't properly use SLI, unless it was too new for the driver to have an SLI profile for it yet. Once there's an SLI profile (or I've made one myself using nVInspector), SLI has always worked perfectly for me.

As a matter of fact, for many years now, every game that's come out 'supports SLI', so it's really not a matter of a 'game' having trouble 'utilizing' SLI ... it's a matter of how well the driver uses SLI when it runs that particular game ... which is in turn determined by the presence of a properly configured SLI profile for that game.

The game itself doesn't care or even know that you're running in SLI; it doesn't 'adapt' to it or anything like that. I mean, it has to be coded in such a way that SLI is 'supported', but beyond that, the operation of SLI is up to nVidia and their driver team, not the game.

BTW, your usage looks great, esp. in that game. I can't tell you for sure why one card is doing that dipping thing (I'd just be guessing), but I've seen that kind of thing plenty of times, so I wouldn't worry about it.

You should also be aware that turning on SLI AA could very well be showing this big FPS increase because, paradoxically, it's actually turning off the AA that you have set in the game altogether. Not always, mind you, but in many cases that will be what happens. 'Forced' AA of any kind is extremely scattershot in terms of it actually 'working'; going that route can cause AA to not work at all even when AA is turned on in the game ... plus forced SLI AA is the least likely to actually 'work' ...

Lastly, paradoxically, turning on SLI AA disables SLI, and basically makes all the rendering happen on one card. IOW, the reason you're seeing 99% all of a sudden with it is because you're running the game on one card. However, in this mode, your GPU usage measurement for both cards in a tool like AB will actually just be the usage of that one card, duplicated into both graphs.

SLI AA has its uses, but it's primarily only advantageous when a game flat out has no SLI support, because it's either too old, or there's no profile for it in the driver yet.
Edited by brettjv - 4/9/12 at 11:37pm
    
post #5 of 8
Thread Starter 
What I've found is that, like I said ... it's always console ports (especially poor ones, of course) that don't properly utilize my SLI. I'm sure you know how terrible Skyrim was when it released. Not just in terms of GPU usage, of course, but on my system it was **PISS POOR**

I'm getting literally DOUBLE the frames now compared to when Skyrim released -- most of this gain coming in the past 3 months or so. When Skyrim released I was getting 30-60 fps outdoors and in cities (40-120 fps indoors). Now I get 60-120 frames in the wild, never dip below 45 even on the infamous FPS-crushing steps of Dragonsreach, and I get a constant 120+ fps indoors.

It was a joke that Bethesda released their game in such a condition. I only had 20 hours played by February of this year, when I tried it again. It's finally the game it should have been, thanks mostly to Nvidia. Their next driver offers further increased Skyrim performance as well -- the beauty of SLI, I'm finding. When Nvidia releases a driver that offers 5% increased performance in a game, that often translates to a 10% increase with my dual cards.

In terms of SLI AA, a good example would be The Last Remnant from Square Enix. It, and many other shoddy console ports, only require 50% usage from my cards to max them out completely. Running SLI AA -- again, we're talking 8x and **16x supersampling** (which translates to 16x and 32x supersampling, since both cards render every frame separately in SLI AA mode) -- I can run this level of AA and get either the same framerate or often EVEN HIGHER, because rendering that much AA forces both of my cards to 99% regardless of the game I'm playing.

Interestingly, another easy way to do this is by enabling 3D Vision with just about any title. It almost always forces 99% usage on both GPUs if it's a fully 3D game. Again, I wish Nvidia would make this happen all the time. I can run 3D Vision in many games with, again, the same framerate or higher.
    
post #6 of 8
Thread Starter 
TL;DR: I couldn't be happier with my setup. A lot of people told me not to go dual-graphics, "especially with an AMD CPU."

At stock this CPU definitely bottlenecks me, especially, again, in shoddy console ports that like to run poorly on just 2 cores. But I can hit 3.8 GHz with almost no vcore increase, and it's enough to push the cards to max regardless. BF3 is particularly amazing, using all 6 cores at around 70-90% most of the time. I really hope Microsoft announces their new console at E3 this year, and that its CPU has at least 4 physical cores.

This would really mean great things for us PC gamers.
    
post #7 of 8
Quote:
Originally Posted by LesPaulLover View Post

What I've found is that, like I said ... it's always console ports (especially poor ones, of course) that don't properly utilize my SLI. I'm sure you know how terrible Skyrim was when it released. Not just in terms of GPU usage, of course, but on my system it was **PISS POOR**
I'm getting literally DOUBLE the frames now compared to when Skyrim released -- most of this gain coming in the past 3 months or so. When Skyrim released I was getting 30-60 fps outdoors and in cities (40-120 fps indoors). Now I get 60-120 frames in the wild, never dip below 45 even on the infamous FPS-crushing steps of Dragonsreach, and I get a constant 120+ fps indoors.
It was a joke that Bethesda released their game in such a condition. I only had 20 hours played by February of this year, when I tried it again. It's finally the game it should have been, thanks mostly to Nvidia. Their next driver offers further increased Skyrim performance as well -- the beauty of SLI, I'm finding. When Nvidia releases a driver that offers 5% increased performance in a game, that often translates to a 10% increase with my dual cards.
In terms of SLI AA, a good example would be The Last Remnant from Square Enix. It, and many other shoddy console ports, only require 50% usage from my cards to max them out completely. Running SLI AA -- again, we're talking 8x and **16x supersampling** (which translates to 16x and 32x supersampling, since both cards render every frame separately in SLI AA mode) -- I can run this level of AA and get either the same framerate or often EVEN HIGHER, because rendering that much AA forces both of my cards to 99% regardless of the game I'm playing.
Interestingly, another easy way to do this is by enabling 3D Vision with just about any title. It almost always forces 99% usage on both GPUs if it's a fully 3D game. Again, I wish Nvidia would make this happen all the time. I can run 3D Vision in many games with, again, the same framerate or higher.


Oh, I know exactly what you're talking about w/Skyrim.

However, it wasn't nV that fixed it, it was Bethesda ... the problem was a lack of CPU optimizations. The game was getting badly CPU bottlenecked (BN'd) in many areas. I proved it in the thread above, and by the time the fix came from Bethesda, it was pretty well known what the issue was.

CPU BN is actually very commonly the reason why SLI doesn't reach 99% usage on both cards, in part because SLI adds significant CPU overhead by its nature ... but of course it's also possible for the combination of game and drivers (esp. the SLI profile) to not really 'get along' all that well ... sometimes it takes nV a while to get the profile right for a game, and sometimes they never bother.

But aside from problems with the game code requiring excessive CPU usage (as with Skyrim), the 'problem' with low GPU usage in SLI isn't 'shoddy ports', per se. Games basically either 'support AFR', or they don't. Beyond that, it's up to nV to make it work right, not the game maker.

Games played over the internet are also notorious for GPU usage issues, particularly in SLI ... again, typically it's CPU BN there as well, ultimately.

Any game that uses PhysX will wreak havoc on your GPU usage, at least what you see in the AB graphs, in part because the 'usage' from PhysX will not show up in the graphs (it's not 'counted') unless you're using a dedicated card. Also, PhysX only runs on one GPU, which causes the other card in your SLI set to have to slow down to match the rendering speed of the card that's also doing PhysX.

So there are really a number of causes of <99% GPU usage in SLI, and it's rare that the issue is because the game maker made a 'shoddy port'. They may not have made it 'difficult enough to run' to be able to max your GPUs without maxing your CPU first, but that's usually the extent of their culpability.


Now, as I said above, SLI AA has its occasional uses, and if you can max out a game without ever seeing over 50% usage on either card, that might be a scenario where it has some benefit.

But what's really happening in SLI AA is that you're (effectively) just running the game on one card, because both cards are working on the exact same frame, in sync, rather than alternating like regular SLI. So each card is getting maxed out because they're both just working as single cards would work ... with the exception being that each is doing 1/2 of the AA load that you've set (if AA even works for the game, at the level you've set it at), and the two images are blended prior to display.

Like I say, it's also very likely that SLI AA isn't even doing anything (as far as actually applying any AA) in many of the games you're 'forcing' it on, because 'forced' AA is not guaranteed to work, nor is transparency SS or MSAA ... in fact, they DO NOT work ... much more often than they DO.

So if you get an FPS increase by turning on SLI AA, it's because either 1) you've actually turned off the game's AA by forcing a mode it doesn't actually support, or 2) you've helped alleviate a CPU BN by taking the cards out of SLI, which reduces CPU overhead. The latter is the more likely case with an old, easy-to-run game like Last Remnant. If you're 100% CPU BN'd, then SLI can be slower than single-card, due to the SLI CPU overhead, IOW.

One way to check if the 'AA' part of SLI AA is working is to run the game with 32x SLI AA and no TRSSAA, and note your FPS. Then try with 64x SLI AA and 16x SLI TRSSAA, and again note FPS. Did the FPS stay the same? If so, then enabling SLI AA isn't even doing anything aside from basically running the game on one card, and making you think both cards are 'maxed out' because they are both just doing the same frame in sync, not alternating like SLI/AFR ... in which case, it's not performing the magic that you think it is ... you can accomplish the same thing by running the game on just one of your two cards.
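
A rough sketch of that check (the helper name and the noise threshold are made up for illustration, not from this post): if the jump from 32x SLI AA to 64x + TRSSAA doesn't cost more FPS than run-to-run noise, the forced AA probably isn't being applied.

Code:
# Hypothetical helper for the A/B test described above; the 3% noise
# tolerance is an assumption, not a measured figure.
def aa_probably_applied(fps_low_aa: float, fps_high_aa: float,
                        noise_tolerance: float = 0.03) -> bool:
    """True if the heavier AA setting costs measurably more than noise."""
    drop = (fps_low_aa - fps_high_aa) / fps_low_aa
    return drop > noise_tolerance

# e.g. with the later demo numbers: 66 fps at 32x SLI AA vs 64 fps at 64x SLI AA
print(aa_probably_applied(66, 64))  # ~3% drop, right at the noise threshold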

Edit: just did a bit of research on The Last Remnant, and it seems forcing AA only works if you rename the .exe to UT3 ... that may be old info, but give it a shot. See if you get the same magic from SLI AA ... when AA actually works in the game.

Both these reviews say you can't force AA in this game through regular means, but the 2nd says you can by renaming the .exe (in which case you'll need to force the SLI AA for UT3 profile, not The Last Remnant profile):
http://www.tomshardware.com/reviews/radeon-5770-overclocking,2473-11.html
http://www.pcgameshardware.com/aid,679961/Unreal-Engine-3-tested-The-Last-Remnant-graphics-card-review/Practice/
Edited by brettjv - 4/10/12 at 9:43am
    
post #8 of 8
Okay, so I got curious about the phenomenon of which you speak, and decided to run some tests on the Last Remnant Demo, which I downloaded from Steam. I couldn't be arsed to run full benches, but took measurements at the beginning of the demo that I thought I'd share with you ... even though nobody cares.

First off, every configuration I tried ran at a minimum of 95% usage on both cards, as you'll see in the screenies. Here are the configurations I tried and their resulting FPS. Note that the 32x readings are duplicated, as I'm trying to make certain specific comparisons below. Also, it's clear from my results that forcing AA (SLI or otherwise) does indeed work, at least in the demo of this game.

Note that all in-game settings were maxed out, and tests were run at 1920x1200.

No AA, SLI Scaling Tests
Single Card, No AA: 139fps
Two Cards (in SLI), No AA: 267fps

Turns out that for a 'shoddy' port, the SLI scaling looks pretty dang good in this game.


32xAA + 8xTRSSAA SLI Scaling Tests
Single Card, 32xAA + 8xTRSSAA: 72fps
Two Cards (in SLI), 32xAA + 8xTRSSAA: 140fps

And once you crank up the good old-fashioned AA, the SLI scaling looks phenomenal in this game.
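
Restating those two pairs of numbers as a quick scaling calculation (just arithmetic on the results above, nothing measured beyond them):

Code:
# SLI scaling factor = SLI fps / single-card fps, from the demo results above.
results = {
    "No AA":            {"single": 139, "sli": 267},
    "32xAA + 8xTRSSAA": {"single": 72,  "sli": 140},
}
for name, r in results.items():
    scale = r["sli"] / r["single"]
    print(f"{name}: {scale:.2f}x scaling ({(scale - 1) * 100:.0f}% gain over one card)")
# No AA:            1.92x scaling (92% gain over one card)
# 32xAA + 8xTRSSAA: 1.94x scaling (94% gain over one card)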


SLI AA vs Regular AA Tests
Single Card, 32xAA + 8xTRSSAA: 72fps
Two Cards (using SLI AA), 32x(SLI AA) + 8x(SLI)TRSSAA: 66fps EDIT: Oops, I did 'Q' level AA on this test ... that's probably why it's lower
Two Cards (in SLI), 32xAA + 8xTRSSAA: 140fps

In this comparison we look at EQUAL levels of AA, accomplished either on a single card, or by breaking my SLI set and instead using 'SLI AA', or by just using SLI. As you can see, the same level of AA is actually more efficiently accomplished using one card, as opposed to using SLI AA on two cards. Edit: I accidentally did 32xQAA on the SLI AA test. Single card performance is very close to SLI AA, but using 'real SLI' blows both the single-card and the SLI AA solutions away.

Final Test with maxed out SLI AA, just to see what happens:
Two Cards (using SLI AA), 64x(SLI AA) + 16x(SLI)TRSSAA: 64fps

Looks like we can double our highest level of AA using SLI AA without much of a hit vs. the 2nd highest level ... basically that is the only thing that SLI AA has 'going for it'.


So, in closing, SLI actually scales wonderfully in this game, and SLI AA performs pathetically compared to running actual SLI ... in fact, it doesn't even work better than a single card in terms of FPS while achieving the same level of AA.

[Screenshots attached for each of the six configurations listed above.]
Edited by brettjv - 4/10/12 at 11:38pm
    