

·
PC Evangelist
Joined
·
46,706 Posts
8GB cards have no problem keeping up with 11GB/16GB cards in that game at 1440p/4K:

[Attachment 369532: 1440p/4K benchmark chart]

Now you can stop the FUD.
The 2080 and 2080S are faster than the Radeon VII at stock, but that lead does not hold at 4K. We are talking about huge performance drops, down to single digits in some of these games, and single digits can make or break a champion. Also, we have not even started this generation. A 3070 user is not the same as a 2080 Ti user or even a 3080 user; they are not going to upgrade in 2 years. Think GTX 970. Will 8GB be good enough in 6 years? Yes, once all games have DLSS in performance mode.
 

·
sudo apt install sl
Joined
·
7,305 Posts
The 2080 and 2080S are faster than the Radeon VII at stock, but that lead does not hold at 4K. We are talking about huge performance drops, down to single digits in some of these games, and single digits can make or break a champion. Also, we have not even started this generation. A 3070 user is not the same as a 2080 Ti user or even a 3080 user; they are not going to upgrade in 2 years. Think GTX 970. Will 8GB be good enough in 6 years? Yes, once all games have DLSS in performance mode.
The Radeon VII's memory bandwidth is double that of the 2080/Super, and the 1080 Ti also has higher memory bandwidth than the 2080. I couldn't get Shadow of the Tomb Raider (4K) to use above 7.4GB unless I turned on FidelityFX, so I'm pretty certain memory bandwidth was the deciding factor in that game.
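
To put numbers on that: peak bandwidth is just the per-pin data rate times the bus width. A quick back-of-the-envelope sketch in Python; the per-pin rates are the commonly published figures for each card, quoted from memory, so treat them as assumptions:

```python
# Peak memory bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8.
# The (data rate, bus width) pairs below are the commonly published specs.
cards = {
    "GTX 1080 Ti (GDDR5X, 352-bit)":   (11.0, 352),
    "RTX 2080 (GDDR6, 256-bit)":       (14.0, 256),
    "RTX 2080 Super (GDDR6, 256-bit)": (15.5, 256),
    "Radeon VII (HBM2, 4096-bit)":     (2.0, 4096),
}

for name, (gbps_per_pin, bus_bits) in cards.items():
    bandwidth = gbps_per_pin * bus_bits / 8  # GB/s
    print(f"{name}: {bandwidth:.0f} GB/s")
```

That works out to 484 GB/s for the 1080 Ti versus 448 GB/s for the 2080, and the Radeon VII's 4096-bit HBM2 is what gets it to roughly double the 2080/Super at about 1 TB/s.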

Regarding HUB's video.

The best way to test this is a stock RX Vega 64 (8GB) against a Frontier Edition (16GB): same GPU, different amounts of VRAM.
 

·
Overclocker
Joined
·
3,626 Posts
Are people actually claiming that this current generation of games is equivalent to next-generation console-ported, open-world games using RT? Really?

And when you see Fortnite simulate an open-world, ray-traced game, it's just a "demanding game"? Let me add proper context to that term for you: it's a demanding game that requires more VRAM! So we use DLSS to lower the render resolution and create a smaller memory footprint so you can (among other things) increase the frame rate.

Here is an example of the settings in current-gen console-ported games:
Textures, Shadows, Depth of Field, LOD, Ambient Occlusion, and Volumetric Lighting. There are also settings for Screen Space Reflections, VFX Quality, Enhanced Water Simulation, and Enhanced Destruction. Additionally, you can enable Screen Space Contact Shadows, Tessellation, Lens Flares, Motion Blur, etc. That's not always open world, or it's limited to a "per section" basis. Notice: no RT elements.

Am I the only one not on the hype train? Chooo Chooo
 

·
Official Luddite of OCN
Joined
·
5,675 Posts
The Radeon VII's memory bandwidth is double that of the 2080/Super, and the 1080 Ti also has higher memory bandwidth than the 2080. I couldn't get Shadow of the Tomb Raider (4K) to use above 7.4GB unless I turned on FidelityFX, so I'm pretty certain memory bandwidth was the deciding factor in that game.
So a game that launched literally just shy of 2 years ago was already at or near 8GB... This isn't exactly helping the whole 'new consoles are dropping / the $699 GPU has less VRAM than the $699 GPU from 2017' argument.
 

·
Registered
Joined
·
38 Posts
And when you see Fortnite simulate an open-world, ray-traced game, it's just a "demanding game"? Let me add proper context to that term for you: it's a demanding game that requires more VRAM! So we use DLSS to lower the render resolution and create a smaller memory footprint so you can (among other things) increase the frame rate.
I would say that the Fortnite presentation is a bad example. A 2.6x performance increase with DLSS performance mode is in line with what you can achieve today without being over the video memory limit. It might or might not be memory starved; there's no way to tell from that picture.
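
For what it's worth, the pixel math backs this up. A minimal sketch; the fixed-cost explanation in the comments is my own reading, not anything from the presentation:

```python
# DLSS performance mode renders at half the output resolution per axis,
# so the GPU shades a quarter of the output pixels before upscaling.
native = 3840 * 2160    # 4K output pixels
internal = 1920 * 1080  # performance-mode internal render resolution
print(f"pixel reduction: {native / internal:.1f}x")  # 4.0x

# An observed ~2.6x speedup sits below the 4x pixel reduction, which fits
# per-frame costs that don't scale with resolution (BVH updates, the
# upscale pass itself, CPU work). It doesn't require a VRAM bottleneck
# to explain.
```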
 

·
Overclocker
Joined
·
3,626 Posts
Nvidia clearly stated:
the goal of the 3080 is to give great performance at up to 4K resolution with all the settings maxed out at the best possible price
extra memory is always nice but it would have increased the price of the graphics card, so we need to find the right balance
So the 3070 8GB and 3080 10GB are cost-saving SKUs to keep prices low, for 3 reasons:
1. RDNA 2 is competing in this tier of GPU performance.
2. Nvidia is also competing against $500 consoles.
3. The world economy isn't favorable for these kinds of products.

IMO:
But most importantly, Nvidia is getting pressure from AMD on both the GPU and console fronts! So they will keep the 3070 Ti and 3080 Ti in reserve until it's clear they have to bring them forward, at prices AMD can't compete against, if RDNA 2 happens to be on par with those Ti variants. Which is why you are seeing rumors of more than just 3 SKUs from Ampere. Oopsie!
 

·
sudo apt install sl
Joined
·
7,305 Posts
So a game that launched literally just shy of 2 years ago was already at or near 8GB... This isn't exactly helping the whole 'new consoles are dropping / the $699 GPU has less VRAM than the $699 GPU from 2017' argument.
Yes, a discrete video card with 8GB of dedicated VRAM is enough for 4K gaming. The new consoles share their VRAM with the rest of the system. Not to mention developers are already annoyed that they have to learn how to utilize the two memory pools on the Series X.

Edit: Simple solution: compare a stock RX Vega 64 against a Frontier Edition.
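
For anyone unfamiliar with those two pools, a rough sketch of the Series X memory layout using Microsoft's published figures (the OS reservation number is approximate):

```python
# Xbox Series X: 16GB of GDDR6 split into two pools with different
# bandwidths (Microsoft's published spec; OS reservation approximate).
pools = {
    "GPU-optimal": (10.0, 560),  # (size GB, bandwidth GB/s)
    "standard":    (6.0, 336),
}
os_reserved_gb = 2.5  # carved out of the standard pool for the OS

total = sum(size for size, _ in pools.values())
usable = total - os_reserved_gb
print(f"total: {total} GB, usable for games: ~{usable} GB (shared CPU+GPU)")
# So a "16GB console" leaves roughly 13.5GB for the entire game, CPU data
# included, and the fast 10GB pool is where GPU data needs to live --
# hence the developer grumbling about managing two pools.
```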
 

·
PC Evangelist
Joined
·
46,706 Posts
Yes, a discrete video card with 8GB of dedicated VRAM is enough for 4K gaming. The new consoles share their VRAM with the rest of the system. Not to mention developers are already annoyed that they have to learn how to utilize the two memory pools on the Series X.

Edit: Simple solution: compare a stock RX Vega 64 against a Frontier Edition.
Vega is not fast enough to use over 8GB. Also, doesn't Vega have HBCC, which lets it use up to 20GB? There was a test in one game where Vega got to 20GB and beat the 11GB 1080 Ti.

As Nvidia put it, they are the Apple of GPUs. They will find a way to make a GPU fast now and plan its retirement, unlike Pascal, which took them 4 years to kill.

Enough vRAM.
Working with developers; in short, telling them to limit texture sizes.
 

·
professional curmudgeon
Joined
·
10,391 Posts
Discussion Starter #149
The 2080 and 2080S are faster than the Radeon VII at stock, but that lead does not hold at 4K. We are talking about huge performance drops, down to single digits in some of these games, and single digits can make or break a champion. Also, we have not even started this generation. A 3070 user is not the same as a 2080 Ti user or even a 3080 user; they are not going to upgrade in 2 years. Think GTX 970. Will 8GB be good enough in 6 years? Yes, once all games have DLSS in performance mode.
You have shifted the goal posts so much I cannot keep track. Just where did it start?
Games like this make it almost impossible to recommend 8GB cards.
Now you're qualifying it with champions, where single digits matter, as opposed to the general audience you made your sweeping generalization to.

If you want to talk 3070 buyers, why ignore the $500 price tag? You really think it's reasonable to demand this gen's performance for half the cost of last gen's? It's funny you mention the 970, because that's when golden tiger (remember, he used to be here) found out it was going to cost more than $700 to beat the Maxwell Titan's $1000 4K performance.

Yeah, he kept telling Titan and 980 Ti owners they were getting ripped off until reality bit him in the behind, so much so that he had to run to HardOCP to complain. :yessir:

So I'm just too dizzy from all your goal posts. Have a good day.
 

·
mfw
Joined
·
8,621 Posts
Not at all. Your "funny sarcasm" wasn't funny; nobody laughed... Move on.
I had moved on until you just reminded me of the sad exchange you created by not being able to detect very obvious sarcasm.

But, hey, cheer up. At least you're not EastCoast.
 

·
The Factory of the Cell
Joined
·
88 Posts
And when you see Fortnite simulate a open world, ray traced game it's "demanding game"? Let me add proper context to that term for you. It's a demanding game that requires more vram!!!! So we use DLSS to lower the resolution to create a smaller memory footprint you can (among other things) increase the frame rate.
But that's not why DLSS improves framerate. It reduces the number of pixels the GPU has to render in the first place. It probably doesn't have a significant impact on VRAM usage. And that image, again, doesn't tell us that 10GB of VRAM is insufficient for Fortnite at 4K max settings + RT; it just tells us that the game is really difficult to render.
 

·
mfw
Joined
·
8,621 Posts
But that's not why DLSS improves framerate. It reduces the number of pixels the GPU has to render in the first place. It probably doesn't have a significant impact on VRAM usage. And that image, again, doesn't tell us that 10GB of VRAM is insufficient for Fortnite at 4K max settings + RT; it just tells us that the game is really difficult to render.
I wouldn't bother with that one if I were you. It's like talking to a really dumb brick wall that loves AMD.
 

·
sudo apt install sl
Joined
·
7,305 Posts
But that's not why DLSS improves framerate. It reduces the number of pixels the GPU has to render in the first place. It probably doesn't have a significant impact on VRAM usage. And that image, again, doesn't tell us that 10GB of VRAM is insufficient for Fortnite at 4K max settings + RT; it just tells us that the game is really difficult to render.
Not agreeing with that troll on the purpose of DLSS.

DLSS does reduce VRAM usage by a good chunk at 4K, since it renders at a lower internal resolution and reconstructs the image in real time. I've noticed VRAM drops of between 1 and 3GB depending on the game.
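
A rough sketch of where those drops could come from; the target count and bytes per pixel here are illustrative assumptions, since every engine's buffer layout differs:

```python
# Rough estimate of resolution-dependent render-target memory. DLSS shrinks
# the *internal* resolution, so G-buffers and intermediate targets shrink
# with it, while the final output and textures stay the same size.
def targets_mb(width, height, num_targets=12, bytes_per_pixel=8):
    """Total MB for num_targets full-screen buffers (assumed sizes)."""
    return width * height * num_targets * bytes_per_pixel / 2**20

for label, (w, h) in {
    "native 4K":                (3840, 2160),
    "DLSS quality (1440p)":     (2560, 1440),
    "DLSS performance (1080p)": (1920, 1080),
}.items():
    print(f"{label}: {targets_mb(w, h):.0f} MB of render targets")
# Native 4K comes out near 760 MB under these assumptions versus ~190 MB in
# performance mode; with history buffers, post-processing chains, and RT
# targets on top, total drops of 1-3GB at 4K are believable.
```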
 

·
Registered
Joined
·
784 Posts
Did anyone notice how many spatulas LJM has in his kitchen?
Referring to the official launch event video.

I knew when buying this 2080 Ti ~6 months ago that new stuff was around the corner, but I just couldn't wait any longer. 980 Ti SLI was holding me back at 4K 60Hz and 3440x1440 100Hz.
First time having a single GPU in my main system since... Fermi. It's awesome. I do not miss mGPU one bit, except for the fact that it does look cool in the system.
 

·
I Love this Hobby!
Joined
·
7,786 Posts
If this is true, 3070 owners with 8GB of VRAM will feel a gut punch. It would also mean that 3080 owners get a beating, not only because of the 3070 Ti but because a 3080 Ti would be imminent with at least 16GB of VRAM, if not more. Really, who would buy the first models knowing all this? Nvidia is creating a big mess where a 3070 Ti is more desirable than a 3080. It should not work that way.
 

·
Graphics Junkie
Joined
·
2,468 Posts
If this is true, 3070 owners with 8GB of VRAM will feel a gut punch. It would also mean that 3080 owners get a beating, not only because of the 3070 Ti but because a 3080 Ti would be imminent with at least 16GB of VRAM, if not more. Really, who would buy the first models knowing all this? Nvidia is creating a big mess where a 3070 Ti is more desirable than a 3080. It should not work that way.
Yea, a big mess indeed.
 

·
Registered
Joined
·
293 Posts
https://www.youtube.com/watch?v=agcbwgrLhOg&feature=emb_logo

Roughly a 10% difference with a 2080 Ti OC. But it's a fair comparison when the 3080 hovers around 80C.
This is kind of a big deal, IMO. It makes the 3080 look barely better than an overclocked 2080 Ti. It now depends on how well the 3080 can overclock; if it can't, there will be a lot of people feeling sick about slinging their 2080 Tis on eBay for $400. Between the OC revelation and the paltry 10GB frame buffer, I am definitely not buying on release day. I'm waiting for reviews, and if I buy, it will be a higher-VRAM model. Making 1080 Ti people choose between a VRAM downgrade and a $1,500 card is pretty typical of Nvidia at this point, I must say. At first I was excited about this launch, but now I'm just disappointed and a little pissed. I don't want to wait, but there's no choice now.
 
