Originally Posted by infodump
Thanks, I guess I'll leave 'em low unless I fail WUs or start getting artifacting, then I'll set them back to the same as the new one. Played Crysis for a while and it ran perfectly.
A little hot though...
So far 2 out of the 4 games I've played don't correctly support SLI... is this a problem you run into much? The Witcher 2 runs both cards at like 50-60% usage, and Rage just leaves the 2nd card at 0%. (Yeah, I know, consoles don't support 2 video cards, so why should Rage?!)
Rage being a console game actually has nothing to do with SLI not working. I play TONS of console ports and SLI works perfectly on all of them.
The actual issue is that the drivers you're using just don't have an SLI profile for Rage yet, because it's such a new game. If you need the extra SLI performance, I'd check for beta drivers (read the release notes to see if there's anything about Rage support), or just wait until the next WHQL set, which I'm certain will add an SLI profile for Rage; then SLI will work.
As far as The Witcher 2 goes, SLI is fully supported; it just has unusually low GPU usage in certain areas when running SLI. If it weren't supported, you'd see 0% on the 2nd card, like you do in Rage.
It's gotten better over time with patches and driver improvements, though, and generally the game runs really well at max settings (no ubersampling), only occasionally dipping into the 30s in 'camps' with tons of NPCs, firepits, and the like. At those times GPU usage also falls.
I'm pretty certain it's a CPU bottleneck causing the low GPU usage in these specific areas. Lots of NPCs = lots of extra calculations, and fire/smoke also takes a lot of computation. I also believe The Witcher 2 only makes full use of 2 CPU cores, which doesn't help the situation.
But any time you see <99%, it can also be a bottleneck that's internal to the card: memory bandwidth or pixel fillrate could be getting tapped out, for example. Remember that GPU usage measures just one subsystem of the card, not the whole thing. So lower-than-99% GPU usage can be caused by bottlenecks that are either external (CPU/platform) or internal (memory bandwidth, fillrate, etc.). It can also be caused by driver glitches with a particular game (perhaps causing excessive texture swapping within VRAM, or back and forth with system memory).
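To make that concrete, here's a toy model (my own illustration, not real driver telemetry): assume the "GPU usage" counter only tracks how long the shader core is busy each frame, while the slowest stage (CPU, shader, or memory) sets the frame time. Then any non-shader bottleneck, external or internal, shows up as <99% usage:

```python
def reported_gpu_usage(cpu_ms, shader_ms, memory_ms):
    """Per-frame time (ms) spent in each stage. The slowest stage sets
    the frame time; 'usage' is shader-busy time over total frame time.
    (Hypothetical numbers for illustration only.)"""
    frame_ms = max(cpu_ms, shader_ms, memory_ms)
    return round(100 * shader_ms / frame_ms)

# Shader-bound frame: counter reads ~100%.
print(reported_gpu_usage(cpu_ms=5, shader_ms=16, memory_ms=10))   # 100

# External bottleneck: CPU-heavy camp scene drags usage down.
print(reported_gpu_usage(cpu_ms=30, shader_ms=16, memory_ms=10))  # 53

# Internal bottleneck: memory bandwidth tapped out, same effect.
print(reported_gpu_usage(cpu_ms=5, shader_ms=16, memory_ms=25))   # 64
```

In both of the last two cases *something* is maxed out, it's just not the part the usage counter is watching.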
Bottom line: you can drive yourself batty worrying about why you're not getting 99% GPU usage... and you can save yourself many, many hours of wasted life by NOT worrying about it, beyond OC'ing your CPU as much as possible (to remove the most common source) and keeping your game patched and your drivers up to date. Using the highest possible settings usually helps as well, but not always, due to the possible internal bottlenecks I mentioned above.
Also remember that any type of framerate cap (such as v-sync, or caps built into the game itself) can cause <99% usage... the reason for this should be obvious if you think about it.

Edited by brettjv - 10/6/11 at 9:41am
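The cap case is simple arithmetic. A quick back-of-envelope sketch (hypothetical numbers, assuming usage roughly tracks the fraction of each second the GPU actually works):

```python
def capped_usage(uncapped_fps, cap_fps):
    """Approximate GPU usage when a framerate cap is active:
    the GPU only works for the capped share of what it could render.
    (Hypothetical illustration, not measured data.)"""
    return round(100 * min(cap_fps, uncapped_fps) / uncapped_fps)

# GPU could do 120 fps, v-sync holds it to 60 -> it idles half the time.
print(capped_usage(uncapped_fps=120, cap_fps=60))  # 50

# Barely faster than the cap -> usage sits just under full.
print(capped_usage(uncapped_fps=70, cap_fps=60))   # 86

# Slower than the cap -> cap never engages, usage pegged.
print(capped_usage(uncapped_fps=50, cap_fps=60))   # 100
```

So a card sitting at 50% usage with v-sync on may simply be twice as fast as it needs to be for that scene.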