
1 - 14 of 14 Posts

Registered · 1,992 Posts · Discussion Starter #1
Hi all, this is just something I'm curious about. Do manufacturers generally build a video card with the maximum amount of VRAM the GPU can make use of? Or is it like most prebuilt computers, where the CPU could almost always handle more RAM? It's something I always eventually run into, so I've been thinking about making sure I always buy the card with the maximum possible amount of VRAM from now on.
 

Vermin Supreme 2020 · 25,774 Posts
hmmm...

I've yet to see a game use all the VRAM on my 2080 Ti at 3440x1440 @ 120 Hz, or even go over 8 GB, but you could easily use it all if you start doing any sort of heavy lifting with it outside of gaming.

In my experience, once you get into the high-capacity models (8 GB+), GPUs tend to run out of memory bandwidth before they run out of core processing power or memory allocation.
 

Registered · 1,536 Posts
No, AMD and Nvidia do not put the maximum amount of VRAM on the board. See the attachment below for a 2080 Ti, which has 11 GDDR chips of 1 GB each. Look at the empty pad where they could have added 1 more GB.

Gaming graphics cards top out around 8 GB because that is all games need. The 2080 Ti has 11 GB, which is great for special use cases like modded Skyrim when I pile on the 8K texture packs. Do you run mods? If not, 8 GB is plenty. The AMD Radeon VII has 16 GB, which is unnecessary for games, but that is really a compute card that AMD is trying to pass off as a gaming card.

Nvidia Tesla cards can have over 20 GB, but they are mainly for compute.
 

Attachments

Graphics Junkie · 2,468 Posts
Years ago I met a guy who worked at Nvidia, and he had a GeForce 460 with 14 GB of RAM on it for some specific task. That said, most GPUs won't run out of VRAM until they are doing something the GPU probably couldn't keep up with anyway, at least as far as gaming goes. A 970 would easily run out of VRAM in certain games today if you're playing at a high resolution, but if you had, say, a 1070, there is really nothing you could do in any game to run out of VRAM unless you pushed the resolution beyond 4K, at which point the GPU wouldn't keep up anyway.
 

Registered · 1,992 Posts · Discussion Starter #5
Yeah, I noticed it when playing Doom Eternal. The 970 just didn't have enough VRAM at 3.5 GB. The settings were only on medium for that game; they could've gone much higher with the right card(s). But I wanted to ask, since it looked like those settings could be doubled. This is usually the time I buy a new card: when it runs out of VRAM. Up until that point, I've noticed that I can run most things on max settings with a few things turned off that I don't like. My previous card was a GTX 570, which ran out of VRAM when I tried to play Grand Theft Auto IV. Had to replace it then.

Should I wait for the RTX 3070 or whatever replaces the 2070?
 

9 Cans of Ravioli · 19,192 Posts
lol

you need to look at the memory bus width to figure out what the card could handle.

1050 Ti = 128-bit bus, 4x 32-bit channels. With 1 GB modules you'd get 4 GB; with 0.5 GB modules you'd get 2 GB. No configuration would get you 7 GB (since AFAIK nobody makes 1792 MB DRAM ICs).
(OG) 1060 = 192-bit bus, 6x 32-bit channels. With 1 GB modules you get 6 GB; with 0.5 GB modules you'd get 3 GB. No configuration would get you 11 GB (11264 MB doesn't even divide evenly across 6 chips, and AFAIK nobody makes ~1877 MB DRAM ICs anyway).
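The bus-width arithmetic above can be sketched in a few lines of Python (a hypothetical helper, assuming one 32-bit channel per DRAM chip and the commonly produced 0.5/1/2 GB chip densities):

```python
# Possible total VRAM for a given memory bus width, assuming one DRAM
# chip per 32-bit channel and commonly produced chip densities.
COMMON_CHIP_SIZES_MB = [512, 1024, 2048]  # 0.5 GB, 1 GB, 2 GB ICs

def possible_vram_gb(bus_width_bits):
    chips = bus_width_bits // 32  # one chip per 32-bit channel
    return [chips * size / 1024 for size in COMMON_CHIP_SIZES_MB]

print(possible_vram_gb(128))  # 1050 Ti:  [2.0, 4.0, 8.0]
print(possible_vram_gb(192))  # OG 1060:  [3.0, 6.0, 12.0]
print(possible_vram_gb(352))  # 2080 Ti:  [5.5, 11.0, 22.0]
```

Note how the 2080 Ti's 352-bit bus (11 chips) lands on 11 GB with 1 GB ICs, matching the board photo earlier in the thread.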
 

Registered · 3,745 Posts
I've seen one game use nearly all of the 11 GB of memory on my 1080 Ti: Resident Evil Biohazard, a sometimes disgusting game that turns my stomach and has lousy PC controls for a shooter.
 

Registered · 1,992 Posts · Discussion Starter #8
hmmm...

I've yet to see a game use all the VRAM on my 2080 Ti at 3440x1440 @ 120 Hz, or even go over 8 GB, but you could easily use it all if you start doing any sort of heavy lifting with it outside of gaming.

In my experience, once you get into the high-capacity models (8 GB+), GPUs tend to run out of memory bandwidth before they run out of core processing power or memory allocation.
Do you think it would be pointless to upgrade to an RTX 2070 when I'm still using an i5-3570K?
 

9 Cans of Ravioli · 19,192 Posts
are you GPU bottlenecked? if not, then yes, it'd be a pretty pointless upgrade. if so, then go for it. but I'd probably track down a 3770K for cheap along with it; surely they can be had for reasonable prices nowadays.
 

Registered · 1,992 Posts · Discussion Starter #10
I'm not sure. It seems like a 2080 Ti would run Doom Eternal a lot better than a GTX 970, even when paired with an old 3570K. But I guess there's no way to know until I try.
 

9 Cans of Ravioli · 19,192 Posts
:confused:

there is a way to tell: look at CPU utilization and GPU utilization on your current hardware while playing games. is one pegged near 100% while the other is not? then whatever is pegged at 100% is the bottleneck.

if you are GPU bound, then a GPU upgrade will help. if you're CPU bound and the GPU is more or less just sitting there, then no, it'd be a waste.
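That rule of thumb can be sketched as a tiny helper (hypothetical function name and threshold; the utilization numbers themselves would come from a monitor like Task Manager, MSI Afterburner, or nvidia-smi while the game is running):

```python
# Rough bottleneck heuristic from sampled utilization percentages.
# The 95% "pegged" threshold is an illustrative assumption, not a standard.
def bottleneck(cpu_util, gpu_util, pegged=95):
    cpu_pegged = cpu_util >= pegged
    gpu_pegged = gpu_util >= pegged
    if gpu_pegged and not cpu_pegged:
        return "GPU-bound: a GPU upgrade should help"
    if cpu_pegged and not gpu_pegged:
        return "CPU-bound: a faster GPU would mostly sit idle"
    return "no clear single bottleneck from these samples"

print(bottleneck(cpu_util=60, gpu_util=99))  # GPU-bound
print(bottleneck(cpu_util=98, gpu_util=40))  # CPU-bound
```

In practice you'd average several samples during actual gameplay, since utilization spikes from moment to moment.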
 

Registered · 1,536 Posts
I'm not sure. It seems like a 2080 Ti would run Doom Eternal a lot better than a GTX 970, even when paired with an old 3570K. But I guess there's no way to know until I try.
Doom Eternal is not CPU intensive, so your 3570K would pair well with a 2070 for that specific type of game: single-player FPS. If you are gaming at 1080p, I wouldn't get anything above a 1660 Super. Have you heard of the EVGA Step-Up program? You can buy a 1660 Super or 2060 Super today and upgrade to any superior graphics card within 3 months; you only pay the difference in cost for the more expensive card. Since new cards will be released within 45 days, you will be able to upgrade to a 3080.
 

Registered · 2,723 Posts
Not sure about Doom Eternal, but presuming it's similar enough to Doom (2016), my Core i5 2500K, clocked lower than your CPU, handled it fantastically with a GeForce GTX 1060 6 GB.
I've seen one game use nearly all of the 11 GB of memory on my 1080 Ti: Resident Evil Biohazard, a sometimes disgusting game that turns my stomach and has lousy PC controls for a shooter.
Resident Evil and Biohazard are more or less interchangeable localization differences in the naming. Which specific Resident Evil game was it (or is there actually one called "Resident Evil Biohazard" that I'm not aware of)? If you're referring to one of the recent remakes, like Resident Evil 2 and 3, those are known to vastly overestimate your VRAM use in the settings menu. So much so that it's often recommended to set your texture settings very high even if it puts you over your VRAM budget, as the texture setting also seems to act as a cache, which counteracts the stuttering the game sometimes suffers from while streaming assets in as you move around. I have it set to where it guesstimates I'll use 8 or 9 GB, but my 6 GB GPU uses around 3 or 4 GB (I haven't seen it over 5 GB yet). If you actually used a monitoring program and it did use that much, I'd be interested in knowing which Resident Evil game it was. If it was one of the two recent remakes I mentioned, it probably just uses as much as it can as a cache, but it doesn't NEED that much.

Also, I loved the controls in the remakes (if one of those is what you're referring to). They did well to take an old game design with tank controls and upgrade it to a modern feel while still feeling as close as possible to what it was.
 

Registered · 1,992 Posts · Discussion Starter #14
Doom Eternal is not CPU intensive, so your 3570K would pair well with a 2070 for that specific type of game: single-player FPS. If you are gaming at 1080p, I wouldn't get anything above a 1660 Super. Have you heard of the EVGA Step-Up program? You can buy a 1660 Super or 2060 Super today and upgrade to any superior graphics card within 3 months; you only pay the difference in cost for the more expensive card. Since new cards will be released within 45 days, you will be able to upgrade to a 3080.
Thanks, this reply was very helpful. I'll definitely check that out. I also wasn't aware that Doom wasn't very CPU intensive, or that the new cards would be released soon. I've been out of the loop for quite a while.
 