Originally Posted by TopicClocker
I see exactly what you mean!
I'm going to try to upload a video today. When I was testing I was also recording with Shadowplay whilst running MSI Afterburner to monitor VRAM usage, RAM usage, frame-rate and frame-time in the Rivatuner on-screen display.
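For anyone who wants to cross-check what the Rivatuner OSD shows, something like this would log the same NVML counter that Afterburner reads. Just a rough sketch: it assumes Python with the pynvml package installed and the card at GPU index 0.
```
# Rough sketch of a standalone VRAM logger using NVML, the same
# counter Afterburner reads. Assumes the pynvml package is installed
# ("pip install pynvml") and the card is GPU index 0 - adjust if not.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        # mem.used and mem.total are in bytes; convert to MiB
        print(f"VRAM: {mem.used / 2**20:.0f} / {mem.total / 2**20:.0f} MiB")
        time.sleep(1.0)  # poll once per second until stopped with Ctrl+C
finally:
    pynvml.nvmlShutdown()
```
Note that NVML reports total VRAM in use across the whole device, so it includes whatever the desktop and other apps are holding, same as the OSD number.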
I ran Ultra settings at 1920x1080, 2560x1620 and 3840x2160.
I ran the benchmark each time I changed the resolution and noticed that the VRAM usage wasn't willing to budge above 3.5-3.6GB even at 4K, so I thought "huh, that's kinda weird?"
I know how VRAM works: a card will sometimes allocate more VRAM than it actually needs, or not use all that much at all. Still, it seemed quite odd that the VRAM usage barely increased, if at all, even when I ran resolutions much higher than 1080p.
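To put a rough number on that: the render targets are the main thing that actually grows with resolution, and they're a small slice of 4GB. A quick back-of-the-envelope sketch (the bytes-per-pixel and buffer count are guesses, it varies per engine):
```
# Back-of-the-envelope: VRAM taken by render targets alone at each
# resolution. 4 bytes per pixel and ~6 buffers in flight (back buffer,
# depth, G-buffer targets etc.) are guesses - it varies per engine.
BYTES_PER_PIXEL = 4
BUFFERS = 6

for w, h in [(1920, 1080), (2560, 1620), (3840, 2160)]:
    mib = w * h * BYTES_PER_PIXEL * BUFFERS / 2**20
    print(f"{w}x{h}: ~{mib:.0f} MiB of render targets")
```
On those assumptions, going from 1080p to 4K only adds about 140MB of render targets, so if a game's VRAM use is dominated by textures, which don't scale with resolution, the total barely moving isn't crazy by itself; it's the ceiling at 3.5-3.6GB in the benchmark that stood out.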
From what I've read, other people have tested games on both cards, and the GTX 980 or other 4GB cards will allocate or use over 3.9GB, or even the full 4GB.
But then when I got into the game itself it was a lot different.
The frame-rate at 4K was between 20-30fps, at 2560x1620 it was around 30-40fps+, and at 1920x1080 it was 50-60+fps. All of the tests were done with the GPU at stock, with a hint of CPU bottlenecking past 60fps.
At 4K I saw about 3.8-4GB of VRAM usage, at 2560x1620 around 3.8-3.9GB, and at 1920x1080 about 3.8-3.9GB as well, IIRC.
4K wasn't smooth, understandably, given it's an insane resolution to run on a single card at Ultra settings; 2560x1620 was a lot smoother, and 1920x1080 was flawless. I ran the test twice, so I have two sets of gameplay.
I see what you mean about "extreme conditions", as I suppose you could say 4K is kind of pushing it on a single card, and 2560x1620 sort of is too. I would have run 2560x1440, but I'm not sure why it wasn't showing up in the resolution options.
I'm going to play Dead Rising 3 and Assassin's Creed Unity.