
21 - 40 of 123 Posts

·
Vandelay Industries
Joined
·
1,924 Posts
Can we get a 3080 review thread (various sources) instead of a thread titled as a question about whether there are reviews today?

That aside... kinda the expected performance gains. The genius move by Nvidia here is last gen's pricing. Using that as the point of comparison, the price/performance looks fantastic. Take away last gen's absurd pricing and the p/p looks more in line with what we're used to.
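For scale, a minimal sketch using the FE launch prices ($1,199 for the 2080 Ti, $699 for the 3080) and a ballpark ~30% uplift at 4K; the exact uplift varies by review and game:

```python
# Rough perf-per-dollar comparison at Founders Edition launch MSRPs.
price_2080ti, price_3080 = 1199, 699
perf_uplift = 1.30  # ballpark 4K gain over the 2080 Ti, varies by review

ppd_ratio = perf_uplift / (price_3080 / price_2080ti)
print(f"3080 perf/$ vs 2080 Ti: {ppd_ratio:.2f}x")  # ~2.23x
```

Run the same math against the 2080's $699 MSRP instead and the ratio collapses to roughly the raw uplift itself, which is exactly the point.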

I think the most impressive thing Nvidia has done lately is not this arch but DLSS 2.0. That has me more excited than the 3000 series.
 

·
waifu for lifu
Joined
·
11,082 Posts

·
mfw
Joined
·
8,621 Posts
Good job buddy, you deserve a gold star!

Even though there is nothing exponential about it, yes, obviously this card aims to be a good solution for 4K. In other news, water is wet.
If you knew how to interpret graphs, you'd see a distinct pattern whereby the 3080 is virtually as performant as a 2080 Ti at 1080p and considerably better at 4K. The performance increase is not even close to uniform across resolutions, which is a direct result of the way the new architecture was designed.
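A toy sketch with made-up FPS figures (not from any specific review) showing how the relative uplift grows with pixel count:

```python
# Hypothetical frame rates, purely illustrative: at low resolutions the
# CPU limits both cards, so the 3080's extra shader width goes unused.
results = {
    # resolution: (2080 Ti fps, 3080 fps)
    "1080p": (160.0, 165.0),
    "1440p": (120.0, 135.0),
    "4K":    (60.0, 78.0),
}

for res, (fps_ti, fps_3080) in results.items():
    uplift = (fps_3080 / fps_ti - 1) * 100
    print(f"{res}: {uplift:+.1f}% over 2080 Ti")
# 1080p: +3.1%   (CPU-bound, the GPU barely matters)
# 1440p: +12.5%
# 4K:    +30.0%  (the wider GPU is finally fed enough pixels)
```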

But, sure, keep being the excellent person you've always been. Everyone here loves you.

The ironic part is that you just tried to insult the only user here who read your post.
 

·
waifu for lifu
Joined
·
11,082 Posts

TL;DR: PCI-E 3.0 will not bottleneck a 3080.
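The raw link-bandwidth math supports that; a rough sketch (x16 link, 128b/130b encoding on both generations):

```python
# Approximate one-direction PCIe x16 bandwidth.
# Gen 3 runs 8 GT/s per lane, gen 4 runs 16 GT/s, both 128b/130b encoded.
def pcie_x16_gb_per_s(gt_per_s: float) -> float:
    lanes = 16
    encoding = 128 / 130  # usable payload fraction
    return gt_per_s * lanes * encoding / 8  # bits -> bytes

print(f"PCIe 3.0 x16: {pcie_x16_gb_per_s(8):.2f} GB/s")   # ~15.75 GB/s
print(f"PCIe 4.0 x16: {pcie_x16_gb_per_s(16):.2f} GB/s")  # ~31.51 GB/s
```

Games stream assets in bursts and rarely saturate even the gen 3 link, which lines up with reviews measuring only a couple of frames between the two.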
 

·
Registered
Joined
·
72 Posts
Discussion Starter #26 (Edited)
Gamers Nexus suggests the FE version is power limited. Would the Strix, with its three 8-pin connectors, solve that issue if the implementation is done well? I can't find any more info on the Strix.
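On paper the connector budget gives a 3x 8-pin board plenty of headroom; a minimal sketch using the PCIe spec limits (150 W per 8-pin, 75 W from the slot):

```python
# Theoretical board power budget per the PCIe spec:
# 150 W per 8-pin connector plus 75 W drawn through the x16 slot.
def board_power_budget_w(eight_pin_count: int) -> int:
    return eight_pin_count * 150 + 75

print(board_power_budget_w(2))  # 375 W, a typical 2x 8-pin design
print(board_power_budget_w(3))  # 525 W, a Strix-style 3x 8-pin design
```

Whether the card's BIOS power limit actually lets you use that headroom is a separate question.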
 

·
Covfefe
Joined
·
2,110 Posts
Made me laugh - PCIe 3.0 is a couple of frames slower than PCIe 4.0...

But 90% of the reviews are on an AMD 3950X, which is 10% slower than the 10900K in all the games tested.
 

·
PCMR
Joined
·
1,489 Posts
I'm a bit underwhelmed by the 4K results; not much of an upgrade from the 2080 Ti, tbh, unless DLSS is implemented properly, and that's only available in a few games.
I think they overhyped this quite a bit. People were jumping on the Nvidia love train, forgetting that Nvidia didn't do anything special.
All they did was sell a next-gen card that is faster at roughly the same price as last generation. Sure, it gives you a bigger performance uplift than last gen did, but why all the hype? That's what is supposed to happen.
I'm going to wait for the RTX 3090 to see how fast that one is, because the 3080 is not that impressive.
This also highlights Nvidia's hype marketing: pick a few selected games with RTX and DLSS to tout massive improvements, while in reality the vast majority of games will see around +10 FPS going from a 2080 Ti to a 3080.
This strengthens my previous suspicion that the RTX 3070 will not really be faster than the 2080 Ti except in RTX/DLSS games.
 

·
Registered
Joined
·
2,369 Posts
Made me laugh - PCIe 3.0 is a couple of frames slower than PCIe 4.0...

But 90% of the reviews are on an AMD 3950X, which is 10% slower than the 10900K in all the games tested.
The AMD 3950X isn't 10% slower than the 10900K in all games tested.

According to Techspot (Hardware Unboxed), across 14 games at 1440p the 10900K is 5.8% faster in average frame rates and 4.4% faster in 1% lows, and at 4K it is 0% faster in both averages and 1% lows.

They also mention that the margin is even smaller with the 2080 Ti: "Looking at how the 3950X and 10900K compare with the RTX 3080 and 2080 Ti at 1440p across our 14 game sample, the 10900K was 4% faster with the 2080 Ti."

Source: Nvidia GeForce RTX 3080 Review
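To put those margins into frame-rate terms, a quick sketch with hypothetical 3950X baselines (the percentages are Techspot's, the baseline FPS values are made up):

```python
# What a ~5.8% average-FPS CPU margin means in absolute frames.
margin = 0.058  # 10900K over 3950X at 1440p, per Techspot's 14-game average

for baseline_fps in (80, 120, 160):
    faster = baseline_fps * (1 + margin)
    print(f"{baseline_fps} fps on the 3950X -> {faster:.1f} fps on the 10900K")
# 80 -> 84.6, 120 -> 127.0, 160 -> 169.3
```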
 

·
waifu for lifu
Joined
·
11,082 Posts
I'm a bit underwhelmed by the 4K results; not much of an upgrade from the 2080 Ti, tbh, unless DLSS is implemented properly, and that's only available in a few games.
I think they overhyped this quite a bit. People were jumping on the Nvidia love train, forgetting that Nvidia didn't do anything special.
All they did was sell a next-gen card that is faster at roughly the same price as last generation. Sure, it gives you a bigger performance uplift than last gen did, but why all the hype? That's what is supposed to happen.
I'm going to wait for the RTX 3090 to see how fast that one is, because the 3080 is not that impressive.
This also highlights Nvidia's hype marketing: pick a few selected games with RTX and DLSS to tout massive improvements, while in reality the vast majority of games will see around +10 FPS going from a 2080 Ti to a 3080.
This strengthens my previous suspicion that the RTX 3070 will not really be faster than the 2080 Ti except in RTX/DLSS games.
The "50% faster than 2080ti" remark will be just as misleading in regards to the 3090 performance. I suspect the possible new Ti will have both memory speed & core bumps. I dont see it matching the 24GB of memory though. If Nvidia wants current Ti owners to upgrade, the 3080 aint enough.
 

·
Newb to Overclock.net
Joined
·
4,123 Posts
The "50% faster than 2080ti" remark will be just as misleading in regards to the 3090 performance. I suspect the possible new Ti will have both memory speed & core bumps. I dont see it matching the 24GB of memory though. If Nvidia wants current Ti owners to upgrade, the 3080 aint enough.
Jensen was pretty clear that he wanted Pascal Ti owners to upgrade. The non-mention of the Turing Ti was definitely intentional, as they likely understood it to be a hard battle not worth fighting.
 

·
Overclocking Enthusiast
Joined
·
5,959 Posts
Lovely space heaters.
 

·
Registered
Joined
·
1,154 Posts
My take on it is a 20-30% increase over the 2080 Ti for ~30% more power draw. The new node allowed them this.
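A quick perf-per-watt sanity check on that claim, assuming roughly 250 W board power for the 2080 Ti and 320 W for the 3080:

```python
# ~20-30% more performance for ~28% more board power (250 W -> 320 W)
# works out to roughly flat efficiency at the low end of the range.
tdp_2080ti, tdp_3080 = 250, 320
power_gain = tdp_3080 / tdp_2080ti  # 1.28

for perf_gain in (1.20, 1.30):
    eff = perf_gain / power_gain
    print(f"+{(perf_gain - 1) * 100:.0f}% perf -> {eff:.2f}x perf/W vs 2080 Ti")
# +20% perf -> 0.94x perf/W (slightly worse)
# +30% perf -> 1.02x perf/W (roughly a wash)
```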

Like I posted in another thread, it's kinda misleading comparing it to the 2080. One is a 104 GPU, the other is a 102. It's a direct upgrade to the Ti.

I'll wait until Navi before upgrading from my 1080 Ti, but I'm still going for the 3080 unless Navi is tempting. I'd rather stay with Nvidia if all things are equal.
 

·
The Factory of the Cell
Joined
·
88 Posts
I think for the time being I'm keeping my 2080 Ti. Not convinced that the 3080 is a significant enough upgrade.
 

·
Graphics Junkie
Joined
·
2,468 Posts
So, the obvious takeaway here is that these cards' performance increase scales exponentially with pixel count. Clearly, they wanted to be ready for the 4K revolution.
Exponential means the rate of increase itself grows as the value increases. No graphics card will ever show exponential performance growth, bud; that's why performance graphs don't look like parabolas.

Also, it seems unlikely that you would have a tally of how many people read my post. It seems a lot more likely that this is just another example of your super friendly demeanor.
 

·
Overclocker
Joined
·
11,339 Posts
As expected, overhyped and fairly disappointing so far. The only reason some people seem to "like it" is that they got used to the crazy GPU pricing of Pascal, and of Turing even more.

There are some definite good things about the Ampere GA102 and the cooler design, as expected. Performance gains... pretty standard, mediocre/average stuff. Efficiency-wise, value-wise, power-wise... nothing spectacular: double the power, double the performance, so the efficiency over Turing looks barely improved so far. Some reviewers got a magical 66°C under full load while others are cooking at 78°C, neither at particularly quiet fan speeds. We'll see after more reviews whether the card can actually stay under 70°C inside a case, or whether it will be the usual hot cooker running close to 80°C, which at 320+ W is what I would expect even from a flow-through cooler design.

The power constraints/limits look to be serious, same as in recent generations. With the power limits removed, these Ampere cards are bound to be power hogs: some gain in performance, but efficiency goes to hell and even stability may go out the window. Turing already hates it once all parts of the GPU are in use (RT + DLSS); power density starts to become the limit.

Availability right now? Absolute zero. Not from shops, not from NV here. Tomorrow, maybe, and for how long? Who knows: 10 seconds? 10 minutes? 10 hours? Days? An RTX 3080 for $839... no thanks, pass. Great chip, nice card, good performance, too small an improvement, very high price.

For all the 4K, gotta-have-my-4K people: sure, go get it if you didn't already waste twice that on a 2080 Ti.

Can't wait for Nvidia's breakup, now that they're swallowing ARM on top of the endless other companies they've absorbed so far.

---
@ribosome I don't think you ever needed to worry with a 2080 Ti. The initial marketing is always a best-case show-off; reviews then paint a slightly more realistic picture.
 

·
Newb to Overclock.net
Joined
·
4,123 Posts
My take on it is a 20-30% increase over the 2080 Ti for ~30% more power draw. The new node allowed them this.

Like I posted in another thread, it's kinda misleading comparing it to the 2080. One is a 104 GPU, the other is a 102. It's a direct upgrade to the Ti.

I'll wait until Navi before upgrading from my 1080 Ti, but I'm still going for the 3080 unless Navi is tempting. I'd rather stay with Nvidia if all things are equal.
So many people are stuck in the price and performance bubbles, just as Nvidia and AMD would like them to be. An enthusiast should also appreciate the true technical struggle of either producer. I've said the same thing you're saying now: Nvidia is using the pricing to mask the technical uplift (cheat) from a 256-bit SKU to a 320-bit one in order to compete with its own Turing (and potentially the upcoming RDNA2) in this bracket. They even had to raise the TDP to reach the performance gains they needed. There is a smell here, especially since Nvidia was able to get away with a smaller 256-bit SKU for several generations due to a lack of competition.
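The bus-width point in numbers, using the reference memory configurations (256-bit/14 Gbps GDDR6 on the 2080, 352-bit/14 Gbps on the 2080 Ti, 320-bit/19 Gbps GDDR6X on the 3080):

```python
# Peak memory bandwidth (GB/s) = bus width (bits) * per-pin rate (Gbps) / 8.
def bandwidth_gb_per_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8

cards = [("2080", 256, 14), ("2080 Ti", 352, 14), ("3080", 320, 19)]
for name, bus_bits, rate in cards:
    print(f"{name}: {bandwidth_gb_per_s(bus_bits, rate):.0f} GB/s")
# 2080: 448 GB/s, 2080 Ti: 616 GB/s, 3080: 760 GB/s
```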

As much as I would like the 3080 as an upgrade for WQHD gains, even my 1080 Ti at 300-330 W already dumps a lot of heat into my very efficient water loop. A 1-2 month wait might not be too much, if RDNA2 turns out to be much more efficient.
 
