Originally Posted by Darren9
Is there any evidence to support this? I *thought* VRAM was paged out when filled (with performance loss) rather than the software adapting somehow to "fit" into available memory.
What you think of as a measurement of vram 'usage' (such as the one that's visible in Afterburner) is not a 'bare-metal' HW-level measurement; it's actually a software-derived measurement of the amount of vram 'reserved' by the driver for the application. Think of it as the driver saying 'this is how much vram I think I need for this app right now'. It is NOT necessarily 'right', and in fact it will often over-estimate to err on the side of caution.
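A toy sketch of that idea (purely illustrative — the function name, padding factor, and numbers are all made up, this is not how any real driver is written):

```python
# Toy model: the figure a tool like Afterburner shows is the driver's
# *reservation* for the app, which pads the app's actual allocations
# to err on the side of caution.

def driver_reserved_mb(app_allocations_mb, safety_factor=1.25):
    """Return a padded 'reserved' figure, the way a driver might over-estimate."""
    return sum(app_allocations_mb) * safety_factor

allocations = [512, 256, 128]  # MB actually requested by the game (made-up numbers)
print(driver_reserved_mb(allocations))  # reports 1120.0 MB 'used', though only 896 MB was requested
```

The point is just that the reported number sits above what the app asked for, let alone what any single frame needs.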
It's a common misperception that 'once you use up your vram, as seen in Afterburner, you start using system ram in a paging capacity and your fps takes a big hit'. That's not exactly how it works.
Your perf really doesn't tank until some decent-sized chunk of data that needs to be in your vram/framebuffer at a particular moment in order to render the frame is NOT actually present there, and must hence be fetched from your system ram in real time.
However, depending on the game, the textures (which make up a fair amount of what's stored in your vram) can be streaming between your system memory and your vram all the time ... although hopefully it's going on in the background, such that the right textures are pre-fetched aka cached in the vram to be readily available when they're actually needed.
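You can model that streaming behaviour as vram acting like an LRU cache of textures — a simplifying assumption on my part, not a description of any actual driver. A frame renders 'fast' only if every texture in its working set is already resident; each miss stands in for a real-time fetch from system ram:

```python
from collections import OrderedDict

class VramCache:
    """Toy LRU model of vram holding textures streamed from system ram."""

    def __init__(self, capacity):
        self.capacity = capacity       # how many textures fit in vram
        self.resident = OrderedDict()  # texture id -> True, kept in LRU order
        self.misses = 0

    def touch(self, tex):
        if tex in self.resident:
            self.resident.move_to_end(tex)         # hit: mark as recently used
        else:
            self.misses += 1                       # miss: fetch from system ram
            if len(self.resident) >= self.capacity:
                self.resident.popitem(last=False)  # evict least recently used
            self.resident[tex] = True

cache = VramCache(capacity=4)
frames = [["a", "b", "c"], ["a", "b", "d"], ["a", "c", "d"]]  # per-frame working sets
for frame in frames:
    for tex in frame:
        cache.touch(tex)
print(cache.misses)  # 4: only cold-start misses, since the working set stays resident
```

As long as consecutive frames share most of their working set (which they usually do), the cache stays warm and almost nothing has to be fetched mid-frame.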
IOW, just because Afterburner says you are 'using' 1.5GB of vram, that does NOT mean that the entire 1.5GB of data is needed to render every single frame at that moment. There's all kinds of cached data occupying memory that's just 'expected to be used soon'.
So an app can be coded in a way where it just says 'hey, let's reserve ALL the available memory and store as many textures as possible in vram all the time!' ... but that amount of vram usage may not strictly be necessary in order for the game to run perfectly. As long as your frame buffer itself can fit entirely in your vram, whether or not your perf tanks depends on whether the caching mechanism between system ram and vram can keep the immediately needed textures available for each frame.
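Boiled down to a one-liner (again a deliberate over-simplification, with invented numbers): what matters is whether the per-frame working set fits in vram, not whether the driver's reservation does.

```python
VRAM_MB = 1280
reserved_mb = 1536           # what an Afterburner-style tool would report
frame_working_set_mb = 900   # data actually needed to render this frame (made-up)

def frame_is_fast(working_set_mb, vram_mb):
    # If the frame's working set fits, cached extras beyond vram don't hurt.
    return working_set_mb <= vram_mb

print(frame_is_fast(frame_working_set_mb, VRAM_MB))  # True: 1536 MB 'reserved', yet no stall
```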
Soooo ... the total picture is not NEEEEEARLY as simple as "AB says I'm 'using' 1.5GB, therefore a card with only 1280MB of vram is going to fall on its face in this scenario". Whether or not this happens very much depends on the interaction between driver and game code, how close the approximation between what's 'reserved' and what's 'needed right now' actually is, and the efficiency of the caching that's going on.

Edited by brettjv - 3/27/12 at 2:30pm