
41 - 60 of 364 Posts

I <3 narcissists · 6,174 Posts
3090 reviews next week?
Will it be that soon? They don't go on sale for months, so I assumed it would be months before we saw any benchmarks.
 

Registered · 382 Posts
I get the CPU bottleneck side, but this is different. The 3080 needs more load for the CUDA cores, which only comes at 4K. This looks similar to Vega.
That doesn't make any sense to me. Something has to be bottlenecking it at resolutions lower than 4K, and if it's not the CUDA cores or the CPU, what is it?
 

Registered · 59 Posts
The two worries about getting the 3080 for 4K gaming are a new card coming out later with more VRAM, and an AMD card matching it in performance.

20GB of GDDR6X is likely very expensive, so the only card with more VRAM at this price point would probably be a 16GB GDDR6 card.

The 3080 looks like 2x a 5700. A new competitor card could be made faster by scaling up the processing, which requires more power and better perf/W.

A new card matching the 3080's power draw might be 1.7x a 5700, and in addition it might have 1.5x the perf/W.

So the new AMD card could be 2.5x a 5700 at 3080 power draw. But is it really likely that AMD can make a 2-2.5x generational leap? Even Nvidia can't manage that.
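The back-of-the-envelope math above can be sketched in a few lines. To be clear, every number here is the post's own speculation (2x, 1.7x, 1.5x), not a measurement:

```python
# Speculative scaling math from the post above. Baseline: RX 5700 = 1.0x.
rtx_3080_perf = 2.0   # assumption: 3080 ~ 2x a 5700

# Suppose a wider competitor chip reaches ~1.7x a 5700 at 3080-class power...
scaled_perf = 1.7
# ...and a new architecture/node adds ~1.5x performance per watt on top.
perf_per_watt_gain = 1.5

# At a fixed power budget, a perf/W gain translates directly into performance.
hypothetical_amd = scaled_perf * perf_per_watt_gain
print(f"hypothetical card: {hypothetical_amd:.2f}x a 5700")           # 2.55x
print(f"relative to a 3080: {hypothetical_amd / rtx_3080_perf:.2f}x")
```

Which is how the post gets to "~2.5x a 5700 at 3080 power draw": the multipliers compound, so the claim really does amount to a 2-2.5x generational jump.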
 

Registered · 369 Posts
Read a couple of reviews so far: Digital Foundry, Guru3D, TT and more.

All point to the same conclusion: 25-30% faster at 4K, but also ~30% more power draw.

What I disliked in most of the reviews is that they compare it most prominently with the 2080 (not even the Super) because of the price of the two. That's TU104 vs. GA102... Irrespective of price, the 3080 should be compared with the 2080 Ti, since both use the flagship 102-class chip.

It's a good card, and for me, coming from a 1080 Ti, it will be a great boost, but I had hoped for a bit more oomph from it, to be honest. We'll have to see what the partner cards are like, when the 20GB model is out, and what Navi brings.
This card IS a replacement for the 2080 and 2080 Super. Putting it against a 2080 Ti is a cool comparison, but they are not in the same ballpark price-wise or tier-wise.
 

Vandelay Industries · 1,924 Posts
This card IS a replacement for the 2080 and 2080 Super. Putting it against a 2080 Ti is a cool comparison, but they are not in the same ballpark price-wise or tier-wise.
Nvidia got clever with stack shifting and last gen's pricing. That muddied the water, and cases can be made for both comparisons.
 

Registered · 762 Posts
It's a far cry from Nvidia's marketing claims of 2x the 2080, though, outside of a couple of best-case-scenario games like Doom Eternal.

Also, it seems the architecture definitely works to boost performance at 4K, while 1440p isn't seeing gains as large. Lots of power draw, and I saw one review video (Hardware Unboxed) complaining of coil whine.

Will be curious to see where the 3070 ends up.
Nvidia replied to Linus Tech Tips to specify that the "twice as fast as RTX 2080" claim was specifically "only for Quake 2 RTX and Minecraft RTX." So again: truthful, but very misleading (what a shocker).
 
Reactions: Hueristic

Registered · 762 Posts
That doesn't make any sense to me. Something has to be bottlenecking it at resolutions lower than 4K, and if it's not the CUDA cores or the CPU, what is it?
Hardware Unboxed spoke a little about this. To sum it up:

"The reason is mostly down to the Ampere Architecture and the change to the SM configuration......the 2x FP32 design can only be fully utilized at 4K and beyond. At 4K+ resolution there's a higher portion of the render time per frame calculating FP32 shaders. At lower resolutions, the triangle/vertices calculations are identical to 4K., while @ higher resolutions pixel/compute shader calculations are more intensive, and can more-greatly fill the 2x FP32 pipeline in the Ampere SMs"

Here's a link to their review, at that timestamp:
 
Reactions: Hueristic
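The mechanism Hardware Unboxed describes can be illustrated with a toy model: per-frame work splits into resolution-independent geometry and pixel/compute shading that scales with pixel count. The cost constants below are made up purely for illustration, not taken from any benchmark:

```python
# Toy model of the explanation above (illustrative numbers only).
GEOMETRY_COST = 1.0          # arbitrary units; same at every resolution
SHADING_COST_PER_MPIX = 1.0  # arbitrary units per million pixels

def fp32_bound_fraction(width, height):
    """Fraction of frame time spent in pixel/compute (FP32) shading."""
    mpix = width * height / 1e6
    shading = SHADING_COST_PER_MPIX * mpix
    return shading / (GEOMETRY_COST + shading)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: {fp32_bound_fraction(w, h):.0%} of frame time in FP32 shading")
```

With these made-up costs, the FP32-bound share rises from roughly two-thirds of the frame at 1080p to nearly 90% at 4K, and the higher that share, the more Ampere's doubled FP32 throughput per SM can actually show up in the frame rate.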

Registered · 3,845 Posts
So from the looks of it, the 3090 may be just 20-30% faster than the 3080 (and that's being generous), and you're just paying for extra unneeded VRAM, because 24GB, like, seriously...

So the 3090 would become the new 2080 Ti in price/perf.

And seeing as it's already a 102 chip, I don't expect too much magic sauce. Maybe a card to fit in between the 3080/3090, probably $1,000-1,200, which would still be looked at the same way we look at the 2080 Ti. Nvidia never learns. XD
I have little doubt that they will release a 3080 Ti for around $800 in a few months, with almost as many CUDA cores as the 3090 (clocked higher to negate the slightly lower core count) and ~16GB of VRAM.

Buying a 3090 now is simply paying a premium to get 3080 Ti-level performance six months early.
 

Registered · 6,178 Posts
I have little doubt that they will release a 3080 Ti for around $800 in a few months, with almost as many CUDA cores as the 3090 (clocked higher to negate the slightly lower core count) and ~16GB of VRAM.

Buying a 3090 now is simply paying a premium to get 3080 Ti-level performance six months early.
$800? Yeah, right. Why would they do that when the 2080 Ti was selling for $1,200? With the clamor about the 3090 being the 3080 Ti, and no confirmation from Nvidia, Nvidia has leeway to price the 3080 Ti wherever it wants, likely $1,200. The 3080 20GB is more likely to be $800.
 

WaterCooler · 3,445 Posts
Nvidia was slotted to use Samsung and TSMC this arch, right? I'm starting to wonder if they aren't planning on rolling the TI's out on TSMC's 7nm considering the lack of O/C headroom on the Samsung node.
Doubt it. You can't just take the 8nm process design (which is a refinement of Samsung's 10nm) and slap it onto TSMC's 7nm.

Nvidia replied to Linus Tech Tips to specify that the "twice as fast as RTX 2080" claim was specifically "only for Quake 2 RTX and Minecraft RTX." So again: truthful, but very misleading (what a shocker).
I guess that is a clarification, but it's certainly not how they presented it in the announcement video, hence the post-announcement hype. The marketing worked.
 

Jedi Knight · 661 Posts
I guess that is a clarification, but it's certainly not how they presented it in the announcement video, hence the post-announcement hype. The marketing worked.
??

[attached Nvidia slide, highlighting the words "UP TO"]

For as long as we've been doing this same song and dance with marketing in general, being misled by this is like me asking you to pull my finger and you being shocked when I make a fart noise...
 

Registered · 3,304 Posts
Curious as to why they killed off NVLink completely on the 3080. That indicates the new drivers most likely will not have CFR, which is quite sad.

A 3090 paired with an OC'd 10900K would be a real beast. Can't fit 2x 3-slot cards in the Z490 rig, so a single 3090 it is for that one. Meh... :rolleyes:
 

WaterCooler · 3,445 Posts
??

[attached Nvidia slide, highlighting the words "UP TO"]

For as long as we've been doing this same song and dance with marketing in general, being misled by this is like me asking you to pull my finger and you being shocked when I make a fart noise...
Yeah, OK, that's fair, but someone watching might feel the implication is that the "up to" is the rule, not the exception; as in you'll get close to that figure, even if you don't quite meet it.

At best, this seems to be true only in a couple of games with RTX/DLSS.
 

Registered · 1,657 Posts
AMD can definitely compete with this. It isn't out of reach; it's only about 30% faster than a reference 2080 Ti at 4K. There is still hope, folks.
The 3090 will probably be 10-15% faster than this card, making it 45-50% faster than the 2080 Ti.
When is AMD supposed to release their stuff?

I honestly don't see AMD competing with Nvidia this gen. Maybe they will get close to the 3080 in traditional rasterization performance, but RT and DLSS make Nvidia a clear winner.
 

Registered · 2,369 Posts
When is AMD supposed to release their stuff?

I honestly don't see AMD competing with Nvidia this gen. Maybe they will get close to the 3080 in traditional rasterization performance, but RT and DLSS make Nvidia a clear winner.
You've seen the AMD software stack before release? Tell us!
 

Always Learning · 1,160 Posts
I have little doubt that they will release a 3080 Ti for around $800 in a few months, with almost as many CUDA cores as the 3090 (clocked higher to negate the slightly lower core count) and ~16GB of VRAM.

Buying a 3090 now is simply paying a premium to get 3080 Ti-level performance six months early.
$800? Yeah, right. Why would they do that when the 2080 Ti was selling for $1,200? With the clamor about the 3090 being the 3080 Ti, and no confirmation from Nvidia, Nvidia has leeway to price the 3080 Ti wherever it wants, likely $1,200. The 3080 20GB is more likely to be $800.
Does anyone know which is more expensive:

10GB of GDDR6X, or 16GB of GDDR6?
 

Rig Advisor · 1,938 Posts
Can't fit 2x 3-slot cards in the Z490 rig so a single 3090 it is for that one.
You can if you put them under water.
 
Reactions: SwitchFX

Registered · 846 Posts
Overall I'm disappointed with the 3080, which I'd preemptively chosen as the GPU for my coming build. Like most people, I'd already realized Nvidia's tests during Jen Hsun's video stream a few weeks ago were based on cherry-picked data. I didn't buy into the hype that the fan and heatsink design was terrible; it's been done before, albeit with slight changes, and it works. I was expecting more performance and not so much variance, given how much Nvidia hyped this card up for over a month. Across tests spanning a wide variety of game genres and engines, the swing is impressive. The power usage is not. I'm curious just how much Nvidia can crank these cards up for a future higher-performance variant, like a 3080 or 3090 Ti. We might see the inverse of Pascal here, with refreshed or later cards moving to a better fabrication node.

To me, it feels as if Nvidia went balls-out to pump these cards with as much juice as possible and send clocks soaring. We'll see the whole picture in the coming weeks and months as the entire series goes on sale. Hopefully the 3090 turns out better in energy use, thermals and performance. I'm sure AMD has something wild in the works, given that Nvidia's release today seems half-assed in areas, not to mention the need to justify the higher power draw and larger cooler. I do not expect AIB cards to run as cool as the FE cards.

AMD may have some winner that forced Nvidia's hand, but the reality is that AMD's software needs a lifetime of buffing to get it up to par. Using AMD's software is akin to dragging your tongue across a garbage bin festering with rotten food.
 