Originally Posted by Bit_reaper
Uhh, I've got to ask: are you 100% positive that's the GPU's VRAM and not main system memory (aka RAM)? As far as I know, any time you exceed the VRAM available on your graphics card, your FPS should drop like a rock.
Yes, I am 100% sure. To clarify, though: that screenie was taken with the game running on my GTX 670, not my old 470s in SLI.
And no, that is not true.
Well ... let me clarify. Like I said in my post, the number people think of as 'usage' when they look at Afterburner or whatnot is actually a report of the memory reserved by the driver for use by the application. The actual amount of VRAM 'used' to construct the frame you're looking at is never calculated, and hence never reported to you as the user ... but if it were, it would be far lower (and WAY more variable) than the figure you see in Afterburner.
The more memory you have available, the more memory the driver tends to 'reserve'. That basically means the 'local pool' of quickly-available textures (those presently held in the card's VRAM) is larger, i.e. contains more textures. More 'local' textures in turn reduce the LIKELIHOOD that any given texture that's 'needed' (i.e. one that is part of the frame currently being rendered) will have to be fetched from system RAM in real time, while the frame is actually being constructed. Those fetches are what tank your performance, especially when there are a lot of them at once.
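To put a number on that intuition, here's a quick toy model (all figures invented, and real games have way more access locality than this uniform-random model, so don't read too much into the exact rates):

```python
import random

# Toy model, all numbers invented: a level with 2000 distinct textures,
# where each frame happens to sample 150 of them.
TOTAL_TEXTURES = 2000
TEXTURES_PER_FRAME = 150

def miss_rate(pool_size, frames=200):
    """Fraction of texture lookups that fall outside the resident VRAM
    pool and would force a real-time fetch from system RAM."""
    resident = set(random.sample(range(TOTAL_TEXTURES), pool_size))
    hits = misses = 0
    for _ in range(frames):
        for tex in random.sample(range(TOTAL_TEXTURES), TEXTURES_PER_FRAME):
            if tex in resident:
                hits += 1
            else:
                misses += 1
    return misses / (hits + misses)

for pool in (500, 1000, 1500, 2000):
    print(f"resident pool {pool:4d} textures -> miss rate {miss_rate(pool):.0%}")
```

Bigger resident pool, fewer misses ... which is the entire reason drivers reserve aggressively when the VRAM is there to reserve.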
Actually, a lot of what determines how much VRAM you need for a given game comes down to the efficiency of its texture caching system ... i.e. how well the driver pre-calculates which textures are likely to be needed soon, fetches them in the background before they're needed, and drops textures that are no longer imminently needed (if it has to ... if you have a plethora of VRAM, that eviction doesn't need to happen at all). There's a rough sketch of that kind of mechanism below.
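Heavily simplified sketch, to picture the mechanism: this is a toy LRU residency manager, not any actual driver's code, and every name in it is made up for illustration:

```python
from collections import OrderedDict

class TextureCache:
    """Toy driver-side residency manager: a fixed-size 'local pool' in
    VRAM, background prefetch of textures predicted to be needed soon,
    and least-recently-used eviction to make room. Every name here is
    invented, not any real driver's API."""

    def __init__(self, vram_slots):
        self.vram_slots = vram_slots
        self.resident = OrderedDict()  # texture id -> True, in LRU order

    def touch(self, tex):
        """A frame actually samples this texture. A miss here is the
        expensive case: a synchronous fetch from system RAM mid-frame."""
        if tex in self.resident:
            self.resident.move_to_end(tex)  # mark as recently used
            return "hit"
        self._make_room()
        self.resident[tex] = True
        return "MISS: real-time fetch from system RAM"

    def prefetch(self, predicted):
        """Runs in the background, ahead of need, so later touches hit."""
        for tex in predicted:
            if tex not in self.resident:
                self._make_room()
                self.resident[tex] = True

    def _make_room(self):
        # Evict least-recently-used textures; with a plethora of VRAM
        # (many slots), this simply never fires.
        while len(self.resident) >= self.vram_slots:
            self.resident.popitem(last=False)

cache = TextureCache(vram_slots=4)
cache.prefetch([1, 2, 3])   # predicted and loaded before the frame needs them
print(cache.touch(2))       # hit
print(cache.touch(9))       # never predicted -> the frame-stalling path
```

The thing to notice is the two paths: prefetch() runs in the background and costs you nothing at render time, while a touch() that misses is the synchronous, frame-stalling fetch I described above.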
Some games really have no such pre-caching system beyond what happens when the level is first loaded. In that case, if you lack sufficient VRAM to store all the textures the level needs, there's a much higher chance of real-time fetches from system RAM. But most games do have such a mechanism.
Basically, the whole question is WAAAAAY more complex than "once a game 'uses' more VRAM than what you have, your perf tanks". That's especially true since you never actually know your 'VRAM usage', because what you're looking at in Afterburner and thinking of as usage ... is really NOT usage.
If it were, then when you looked at a single-texture wall on a rooftop in a game, your 'usage' would sit at some really low number, and when you turned around to face a massive city-wide battle scene, your usage would skyrocket due to the scene's additional complexity. But that doesn't happen. And that's because current real-time VRAM usage is not what's actually being reported. It's more like a measure of 'what is the collective size of all the textures I anticipate needing in the next 5 minutes of gameplay?'
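To make that concrete (texture names and sizes below are entirely invented):

```python
# Invented texture names and sizes (MB), purely for illustration.
texture_sizes = {"wall": 8, "roof": 4, "skyline": 128,
                 "crowd": 96, "tank": 64, "smoke": 32}

# The driver keeps resident everything it thinks you might need soon.
reserved_pool = set(texture_sizes)

def mb(textures):
    return sum(texture_sizes[t] for t in textures)

frame_wall_view = {"wall", "roof"}                         # staring at a wall
frame_city_battle = {"skyline", "crowd", "tank", "smoke"}  # turn around

print("reported ('reserved'):", mb(reserved_pool), "MB")   # stays flat
print("actual use, wall view:", mb(frame_wall_view), "MB")
print("actual use, battle view:", mb(frame_city_battle), "MB")
```

The reported figure never moves, while the 'real' per-frame usage swings wildly with what's on screen ... which is exactly why the two shouldn't be conflated.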
The only way to know for sure whether your perf is being hindered by VRAM capacity is to bench with a card that's otherwise identical to yours ... but has a lot more VRAM.