Originally Posted by Thracks
In an ideal world, the GPU will swap data out to system memory at a significant performance penalty. That's the best outcome you can hope for. If the game is not designed to handle this correctly, you will crash to desktop.
Or drivers not catching it either...
What constitutes a video card "running out of memory" nowadays is mostly the driver precaching assets and textures. While some games can physically store large amounts of data in VRAM, not all of it is essential at any given moment, i.e. it's not that you can't play the game without that much memory. Past AA and texture settings, the majority of VRAM is filled with data you will be rendering soon.
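To picture what "swapping out at a performance penalty" looks like, here's a toy Python sketch of VRAM over-commitment: a fixed-size cache that evicts least-recently-used assets to system memory when full, and fails outright when a single asset can't fit at all. All names and sizes are hypothetical; real drivers are vastly more complex than this.

```python
# Toy model of VRAM over-commitment. Assets are spilled to "system
# memory" (the slow path) when VRAM fills up, mirroring the penalty
# described above. Purely illustrative -- not a real driver.
from collections import OrderedDict

class VramCache:
    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.used = 0
        self.vram = OrderedDict()      # asset -> size, in LRU order
        self.system_memory = {}        # spilled assets (slow path)

    def load(self, asset, size_mb):
        """Bring an asset into VRAM, spilling LRU assets if needed."""
        if asset in self.vram:
            self.vram.move_to_end(asset)   # mark as recently used
            return "vram_hit"
        hit = "sysmem_hit" if self.system_memory.pop(asset, None) else "miss"
        # Evict until the new asset fits -- the performance-penalty path.
        while self.used + size_mb > self.capacity and self.vram:
            victim, vsize = self.vram.popitem(last=False)
            self.system_memory[victim] = vsize
            self.used -= vsize
        if size_mb > self.capacity:
            # Nothing we evict will help: this is where a game that
            # doesn't handle over-commitment may crash to desktop.
            raise MemoryError(f"{asset} ({size_mb} MB) exceeds VRAM")
        self.vram[asset] = size_mb
        self.used += size_mb
        return hit

cache = VramCache(capacity_mb=1024)
cache.load("level_geometry", 512)
cache.load("texture_pack_a", 400)
cache.load("texture_pack_b", 300)      # forces level_geometry out
print("texture_pack_b" in cache.vram)          # True
print("level_geometry" in cache.system_memory) # True
```

The point of the sketch: nothing "breaks" when the cache overflows, the oldest data just moves to a slower tier, and the only hard failure is an asset that can't fit in VRAM at all.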
I can run Windows 7 on 1GB of DDR3 (the minimum recommended) or on the 32GB I actually have; Windows 7 runs either way. Obviously though, just as with an operating system, switching between tasks is that much smoother when memory constraints are not in place. Does that make sense?
Edited by RagingCain - 3/3/13 at 2:41pm