
3GB VRAM not enough anymore?

#1 ·
EDIT: I'm reviving this thread to see if opinions have changed. What do you think now, guys? Are 8GB cards going to be the norm soon? Will upcoming games start requiring 6GB+ of memory?

With next-gen games like Titanfall and Watch Dogs requiring 3GB or more of VRAM, is this going to be the norm? Do we really have to upgrade to 6GB GPUs soon in order to max out textures on next-gen games?

I'm kinda worried because I just bought dual GTX 780s, which I thought would be future-proof.
 
#2 ·
The most recent rumor is that the GTX 880 will have 8GB of VRAM. The new consoles have made it so current GPUs will likely go obsolete quickly. On a positive note, this likely means we'll see a good boost in performance from the next line of GPUs.

I actually did the same as you a week ago and bought 2x 780 Tis. I ended up returning one because of this. I personally don't see 3GB of VRAM being enough to max games at 1080p, let alone 1440p, a year from now.
 
#3 ·
Quote:
Originally Posted by ghostrider85 View Post

With next-gen games like Titanfall and Watch Dogs requiring 3GB or more of VRAM, is this going to be the norm? Do we really have to upgrade to 6GB GPUs soon in order to max out textures on next-gen games?

I'm kinda worried because I just bought dual GTX 780s, which I thought would be future-proof.
There is nothing that is truly "future-proof", but 3GB should be sufficient for years to come.
 
#5 ·
Quote:
Originally Posted by Murlocke View Post

The most recent rumor is that the GTX 880 will have 8GB of VRAM. The new consoles have made it so current GPUs will likely go obsolete quickly. On a positive note, this likely means we'll see a good boost in performance from the next line of GPUs.

I actually did the same as you a week ago and bought 2x 780 Tis. I ended up returning one because of this. I personally don't see 3GB of VRAM being enough to max games at 1080p, let alone 1440p, a year from now.
The PS4 only has 3-4GB of RAM available for games, so no new-gen games will be made with the ability to use more than 3-4GB of VRAM at 1080p. They may make 8GB cards for the purpose of 4K gaming.
 
#7 ·
I was playing at 1440p and 3GB was enough for almost every game. I'm playing at 4K now and 3GB is still enough, except for a couple of games or if I turn on more than 4x anti-aliasing. So yeah, 3GB will be enough, especially at 1080p, for another couple of years, I'm guessing. If you're really worried you can always get a 290X on the cheap now.
 
#8 ·
Quote:
Originally Posted by Murlocke View Post

The most recent rumor is that the GTX 880 will have 8GB of VRAM. The new consoles have made it so current GPUs will likely go obsolete quickly. On a positive note, this likely means we'll see a good boost in performance from the next line of GPUs.

I actually did the same as you a week ago and bought 2x 780 Tis. I ended up returning one because of this. I personally don't see 3GB of VRAM being enough to max games at 1080p, let alone 1440p, a year from now.
Quote:
Originally Posted by ghostrider85 View Post

Will the GTX 880 come with 8GB of VRAM for sure? Or is it just speculation?

Oh well, I guess I'll just sell these cards and get the 8GB GTX 880 when it comes out. I should have plenty of time to save up.
Speculation. Rumors also point to new AMD cards with 3-4GB of VRAM at most due to the limits of HBM, and the GTX 880 will likely be a mid-range Maxwell card (like the 680/770), so I can't imagine it shipping with over 4GB. Big Maxwell will probably ship with 6GB, or maybe 8GB if it uses a very wide bus... but that's coming later.

Though some games will use over 3GB of memory at 4K at the moment, that doesn't mean they need it... Ideally, the memory manager should automatically scale VRAM usage to the GPU's available memory.
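
To make "scale with available memory" concrete, here's a minimal sketch of the idea in C++ with DXGI: ask the adapter how much dedicated VRAM it has and pick a texture tier from that. The PickTextureTier helper and its cutoffs are made up for illustration, not taken from any real engine.

Code:
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

// Hypothetical tier cutoffs -- illustrative only, not from a real engine.
static int PickTextureTier(unsigned long long vramBytes)
{
    const unsigned long long GiB = 1ull << 30;
    if (vramBytes >= 6 * GiB) return 3; // "Ultra" textures
    if (vramBytes >= 3 * GiB) return 2; // "High"
    if (vramBytes >= 2 * GiB) return 1; // "Medium"
    return 0;                           // "Low"
}

int main()
{
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    if (factory->EnumAdapters1(0, &adapter) == S_OK) // first (primary) GPU
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        unsigned long long vram = desc.DedicatedVideoMemory;
        printf("Dedicated VRAM: %llu MB -> texture tier %d\n",
               vram / (1024 * 1024), PickTextureTier(vram));
        adapter->Release();
    }
    factory->Release();
    return 0;
}

A real engine would also watch runtime usage and stream textures in and out, but the point stands: the game should fit itself to the card, not the other way around.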
 
#11 ·
Guys, comparing against games that came out before the new consoles is totally different. Those games were coded and optimized around much lower VRAM levels because the 360/PS3 was the limiting factor. The next-gen titles are coded and optimized to utilize massive amounts of VRAM; developers are lazy, and I wouldn't expect them to change this for PC ports.
Quote:
Originally Posted by vlps5122 View Post

The PS4 only has 3-4GB of RAM available for games, so no new-gen games will be made with the ability to use more than 3-4GB of VRAM at 1080p. They may make 8GB cards for the purpose of 4K gaming.
Sorry, but you are missing key points.

- Many PC gamers here are not going to run 1080p; they'll be at 1440p or higher.
- Many PC gamers will crank up AA levels.
- Many console games don't run at full 1080p.
- Many console games run with much lower-resolution textures.

They are hardcoding games to use 3-4GB of VRAM on consoles with reduced resolution, lower-resolution textures, and minimal to no AA. Throw in 1440p, 4x MSAA, and higher-resolution textures and you easily break 3-4GB of VRAM. (See: Watch Dogs.)
 
#12 ·
Quote:
Originally Posted by Murlocke View Post

Sorry, but you are missing key points.

- Many PC gamers here are not going to run 1080p; they'll be at 1440p or higher.
- Many PC gamers will crank up AA levels.
- Many console games don't run at full 1080p.
- Many console games run with much lower-resolution textures.

They are hardcoding games to use 3-4GB of VRAM on consoles with reduced resolution, lower-resolution textures, and minimal to no AA. Throw in 1440p, 4x MSAA, and higher-resolution textures and you easily break 3-4GB of VRAM. (See: Watch Dogs.)
I will stick to 1080p for the time being, and I almost never use AA.
 
#13 ·
Quote:
Originally Posted by ghostrider85 View Post

Quote:
Originally Posted by Murlocke View Post

Sorry, but you are missing key points.

- Many PC gamers here are not going to run 1080p; they'll be at 1440p or higher.
- Many PC gamers will crank up AA levels.
- Many console games don't run at full 1080p.
- Many console games run with much lower-resolution textures.

They are hardcoding games to use 3-4GB of VRAM on consoles with reduced resolution, lower-resolution textures, and minimal to no AA. Throw in 1440p, 4x MSAA, and higher-resolution textures and you easily break 3-4GB of VRAM. (See: Watch Dogs.)
I will stick to 1080p for the time being, and I almost never use AA.
Then 3GB is plenty, but 1080p with no AA is kinda lame. At least use some SMAA.

Most new games (like Watch Dogs) still have to run on the 360/PS3 and older DX11 PC hardware. Future games may eat up a lot of VRAM at absolute max settings, but they will scale down nicely to lesser cards.
 
#14 ·
Quote:
Originally Posted by ghostrider85 View Post

I will stick to 1080p for the time being, and I almost never use AA.
Then you likely won't have VRAM issues with 3GB, but I don't see why you'd buy 2x 780s and run 1080p with no AA. That seems like a sin to me.
 
#15 ·
Quote:
Originally Posted by Murlocke View Post

Quote:
Originally Posted by ghostrider85 View Post

I will stick to 1080p for the time being, and I almost never use AA.
Then you likely won't have VRAM issues with 3GB, but I don't see why you'd buy 2x 780s and run 1080p with no AA. That seems like a sin to me.
That's what I thought; dual 780s deserve 1440p with some light AA.
 
#16 ·
I don't think we will be seeing consumer-priced 8GB VRAM cards soon.
An NVIDIA Tesla card easily costs $3K+ and it has 12GB of VRAM.

There is no way the average consumer is going to consistently pay $800 for a new card. Only a small percentage of people buy the top-priced cards.
Even if a mid-range card had 6GB of VRAM on it, the card would not be powerful enough to use it, as we see with the GTX 670 and 4GB of VRAM: you need SLI to have enough power to come close to using 4GB.

Watch Dogs is just another case of a dev porting a game without proper optimization.
Packaging a game with a bunch of uncompressed textures to inflate the memory requirement is just lazy.
I think it's insane to expect huge-VRAM cards considering the costs and usage.
 
#17 ·
Quote:
Originally Posted by Aparition View Post

I don't think we will be seeing consumer-priced 8GB VRAM cards soon.
An NVIDIA Tesla card easily costs $3K+ and it has 12GB of VRAM.

There is no way the average consumer is going to consistently pay $800 for a new card. Only a small percentage of people buy the top-priced cards.
Even if a mid-range card had 6GB of VRAM on it, the card would not be powerful enough to use it, as we see with the GTX 670 and 4GB of VRAM: you need SLI to have enough power to come close to using 4GB.

Watch Dogs is just another case of a dev porting a game without proper optimization.
Packaging a game with a bunch of uncompressed textures to inflate the memory requirement is just lazy.
I think it's insane to expect huge-VRAM cards considering the costs and usage.
Sadly, I think unoptimized PC games are the norm. Sure, we can tweak some .ini files, inject custom AA, and run texture optimizers, but most people don't put that much effort into every game they play, which is why a lot of people think they need so much VRAM.
 
#19 ·
I have found 2GB at 1080p with 4x AA to be the bare minimum for today's games. Move up to 1440p with, say, 2x AA (you don't really need more than that) and 3GB is the sweet spot. But I can't really speak for the future; things could be a lot different with the way 4K panels are being priced at $600-800 now.

I have a feeling a lot of people running 1080p right now will opt to jump straight to 4K instead of going to 1440p for their next monitor upgrade. That is when I can see 4GB of memory becoming the bare minimum.
 
#20 ·
Quote:
Originally Posted by brucethemoose View Post

Sadly, I think unoptimized PC games are the norm. Sure, we can tweak some .ini files, inject custom AA, and run texture optimizers, but most people don't put that much effort into every game they play, which is why a lot of people think they need so much VRAM.
This is true only for some games. There are many other games out there that look fantastic and don't require 3GB+ of VRAM.


We should not have to tweak our games to play them; we should tweak our games because we want to customize them.


Watch Dogs requiring a VRAM check for its Ultra setting should not be the basis for buying a video card, nor set the foundation for an expected memory amount.
3GB should last a long while yet, especially at 1080p.
Sure, in six years when 4K resolutions are normal we will probably see 8GB of VRAM, but we won't see the average consumer buying such a card in the next couple of years. It just isn't necessary yet and would only make GPUs more expensive for no reason.

There is also the difference between memory allocation and actual usage. I'd argue that right now 100% of memory tests measure allocation.
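
If you want to see what those tools are actually reading, here's a small C++ sketch using DXGI 1.4's QueryVideoMemoryInfo (a newer, Windows 10-era API, so treat it purely as illustration): the OS can report your process's budget and how much it has allocated (CurrentUsage), but nothing in there says how much of that allocation the game actually touches each frame.

Code:
#include <dxgi1_4.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory4* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory4), (void**)&factory)))
        return 1; // DXGI 1.4 unavailable on this OS

    IDXGIAdapter1* adapter = nullptr;
    IDXGIAdapter3* adapter3 = nullptr;
    if (factory->EnumAdapters1(0, &adapter) == S_OK &&
        SUCCEEDED(adapter->QueryInterface(__uuidof(IDXGIAdapter3),
                                          (void**)&adapter3)))
    {
        DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
        // "Local" segment group = on-board VRAM, as opposed to shared system RAM.
        if (SUCCEEDED(adapter3->QueryVideoMemoryInfo(
                0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info)))
        {
            printf("OS budget for this process: %llu MB\n",
                   (unsigned long long)(info.Budget / (1024 * 1024)));
            printf("Currently ALLOCATED:        %llu MB\n",
                   (unsigned long long)(info.CurrentUsage / (1024 * 1024)));
        }
        adapter3->Release();
        adapter->Release();
    }
    factory->Release();
    return 0;
}

Every overlay reporting "VRAM used" is reading something like CurrentUsage, which is allocation, not need.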
 
#21 ·
Quote:
Originally Posted by Murlocke View Post

Then you likely won't have VRAM issues with 3GB, but I don't see why you'd buy 2x 780s and run 1080p with no AA. That seems like a sin to me.
Well, apparently The Witcher 3 only runs at 35-45 FPS at 1080p with a 780 Ti and everything maxed. I plan on keeping my 1080p monitor even with 3x 780 Ti Kingpins; I like maintaining 100+ FPS and G-Sync, and there are lots of high-end titles incoming. I'll upgrade to 4K 120Hz G-Sync and the next overvoltable flagship cards when the time comes.
 
#24 ·
3GB is still enough when lowering AA. The only game filling it up and stuttering my system is Watch Dogs, and we all know that's because it's not optimized properly...
 
#25 ·
Upcoming titles will probably require more VRAM, and it is possible that Ubisoft repeats Watch Dogs' Ultra texture requirement in its next titles as well. With games like Assassin's Creed Unity, Far Cry 4, The Division, Rainbow Six Siege, and The Crew, it would really be a shame not to be able to enjoy these titles at their best. Other titles such as The Witcher 3 and Batman: Arkham Knight could also have larger VRAM requirements. It would be great to have the next GPUs featuring 6-8GB of VRAM, but unfortunately in NVIDIA's case that would limit the appeal of the Titan line.
 
#26 ·
Using examples like "8x MSAA @ 1440p" or 3x3 SSAA @ 4K as the logic behind advising someone to spend a SUBSTANTIAL amount of money on GPUs, much less someone who has stated that 1080p is the highest resolution they are using and will be using for the foreseeable future, is like telling them that they need a Ferrari 599 GTB for their daily commute on 65mph highways instead of a BMW M3, because the Ferrari can reach over 200mph while the M3 can "only" go 185mph. It's the kind of advice I see, both in the enthusiast PC world and in the semi-pro and pro auto racing world, given by people who don't have experience with the equipment, be it using 3GB 780 Tis @ 1440p for gaming or sitting behind the wheel of a full-race-dress Italian exotic at 170mph on the track.

While you can learn a lot from reading, there comes a point where you are unable to learn anything more without firsthand experience.

That said, I think the combination of the first affordable 4K displays and the "new" consoles (specifically, their being far closer to a PC in components than many previous console generations) has caused a tremendous amount of FUD.

I game primarily on either a 21.5" 1080p panel or a higher-end 27" 1440p panel, and have a number of different GPU setups (670 FTW / 680 LTG / 780 Ti Kingpin, all 2-way SLI, plus 3x 580 3GB Classified Ultra; in the past year I have also had 2x 7970 LTG, 2x R9 290X LTG, and a few others).

There are fewer than half a dozen games that are truly limited by 2GB of VRAM @ 1440p, and only two (Wolfenstein + WD) for reasons unrelated to NORMAL levels of AA; frankly, they're just garbage optimization. Everything else I've played is fine with 2x MSAA + FXAA/SMAA on 2 gigs, which visually (at 1440p) is superior to 4x MSAA.

Running 8x MSAA @ QHD or higher is a waste of resources and has almost zero visual benefit over 4x, which is not even necessary in most cases. Intelligent allocation of resources to image quality goes a long way, for example combining low (2x) MSAA with shader-based SMAA: the way each works is so different (plus the near-zero cost of FX/SMAA) that the end result is much more than the sum of the parts.

Excluding Fail Dogs, I have yet to see anything come close to really pushing 2x 780 Tis (Kingpin) @ "just" 1398/8100, but again, I don't use more than 4x MSAA + FX/SMAA (rarely more than 2x), as I have not found a single benefit from doing so. It is a pure epeen thing.

VRAM allocated DOES NOT EQUAL VRAM required/used.

My friend has 3x Titan Blacks, and at the same settings in the same game they'll indicate 3.8-5GB "in use" (1440p); yet when he runs two of the three cards (same "in use", i.e. allocated, VRAM), my "inadequate" 3-gig cards (2.3-2.5GB allocated) consistently perform 8-14 percent higher.
I saw the same thing before I ditched the 290X Lightnings: higher indicated "used" memory, but with the 290Xs at the highest clocks they could hit, they were still 9-14 percent slower than the lowly 3 gigs I have. Oh, and I was running "just" 1328/7800. At higher, but still not max, clocks of 1480/8200, the gap grows by a further 6-10 percent.

This is all from first-hand experience, with equipment I spent my hard-earned money on. Why would I have sold the (one worthwhile model of) 290Xs and stuck with the Kingpins if the Kingpins were worse in any way? I wouldn't have, because on the exact same system the Kingpins are significantly faster.

But if you require 8x MSAA (or the manly AA: 3x3 SSAA) even at QHD+ resolutions, I can't stop you. Just realize that 4 gigs is exactly as future-proof as 3... which is ZERO!
 