

·
Registered
Joined
·
11 Posts
Try some more robust and higher level benchmarking. I love LTT but he can be silly and biased sometimes.

- Gamer's Nexus

- Digital Foundry
 

·
Registered
Joined
·
59 Posts
I like that channel, but nothing NV does is dumb - they're very clever people.

But it's possible to overthink a simple hardware purchase. If you're in the market for a GPU upgrade to use at 4K, then this card clearly provides those fps regardless of the tech used to create it - it's 70-100% over my 1080 Ti.

I'm ITX and wish the power consumption had gone down and the GPU memory had gone up - but for now this is the best card actually available, if you're seriously in the market for one.
 

·
Registered
Joined
·
830 Posts
Discussion Starter #63
Try some more robust and higher level benchmarking. I love LTT but he can be silly and biased sometimes.

- Gamer's Nexus

- Digital Foundry
Yeah no, the hierarchy goes like this:

LTT > GN, with DF (clearly a bought-and-paid-for NV propaganda outlet, hence the exclusive ability to publish carefully curated benchmarks and content from NV two weeks early) way down at the bottom, tied with Tom's Hardware and their "Just buy it" Turing fiasco.

YOU might not like what Linus has to say this morning, shattering your misperception, but you're trying to lump LTT in with MLID (I can't even say AdoredTV, because Jim is actually fairly good at analysis; speculation is always a gamble irrespective of who's doing it) just because you don't like the fact that the 3080 isn't as fast as you thought it would be, or that it runs 100W more on average than the 2080 Ti. Yeah no, time to look in the mirror if you want to start talking about "bias".
 

·
Registered
Joined
·
830 Posts
Discussion Starter #64
I'm not going to watch anything from Digital Foundry. Hopefully NV compensated them well enough for their paid sponsorship and hyping of Ampere, because I no longer hold them as a trustworthy source of information.
 

·
mfw
Joined
·
8,621 Posts
I'm not going to watch anything from Digital Foundry. Hopefully NV compensated them well enough for their paid sponsorship and hyping of Ampere, because I no longer hold them as a trustworthy source of information.
Only watching the reviews that confirm your pre-existing bias is THE EXACT DEFINITION of confirmation bias.
 

·
Registered
Joined
·
11 Posts
Only watching the reviews that confirm your pre-existing bias is THE EXACT DEFINITION of confirmation bias.
Indeed. Cherry-picking is a keystone of confirmation bias.
When the reviewers that burst his bubble get mentioned, he tries to dismiss their well-established track record.
 

·
Registered
Joined
·
830 Posts
Discussion Starter #67
Only watching the reviews that confirm your pre-existing bias is THE EXACT DEFINITION of confirmation bias.
Oh that's an interesting attempt at a personal attack.

Oh wait, so when scientists have a hypothesis and they set out to test said hypothesis and wind up proving it, are they castigated by their colleagues for having "confirmation bias"?
 

·
Registered
Joined
·
1,522 Posts
Well, well, well: the 3080 does not clock higher, just like I stated. And those leaked benches? Yeah, those were legit, making the 3080 only 25% faster than the 2080 Ti in rasterization.
On average, the 3080 is 31.7% faster in 4K gaming compared to the 2080 Ti according to TechPowerUp, and several games are closer to 35% than 30%
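If anyone wants to sanity-check that kind of headline figure against their own numbers, here's a minimal sketch of how a per-game average uplift is computed; the FPS values are made up for illustration, not TechPowerUp's actual data:

```python
# Average per-game uplift of one GPU over another.
# FPS values are illustrative placeholders, not review data.
fps_2080ti = {"Doom Eternal": 120, "Control": 62, "Horizon Zero Dawn": 71}
fps_3080   = {"Doom Eternal": 158, "Control": 82, "Horizon Zero Dawn": 93}

uplifts = [(fps_3080[g] / fps_2080ti[g] - 1.0) * 100 for g in fps_2080ti]
for game, pct in zip(fps_2080ti, uplifts):
    print(f"{game}: +{pct:.1f}%")
print(f"Average uplift: +{sum(uplifts) / len(uplifts):.1f}%")
```

Whether an outlet averages the per-game ratios arithmetically or geometrically (and which games it includes) shifts the headline number by a couple of points, which is part of why GN and TPU don't land on exactly the same figure.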
 

·
Registered
Joined
·
1,522 Posts
Oh that's an interesting attempt at a personal attack.

Oh wait, so when scientists have a hypothesis and they set out to test said hypothesis and wind up proving it, are they castigated by their colleagues for having "confirmation bias"?
Scientists don't set out to prove a hypothesis, they set out to disprove it. If they (and several other scientists) fail to disprove it, it's accepted as a theory.
 

·
Registered
Joined
·
830 Posts
Discussion Starter #70
"With that 330W on the 2080 Ti OC the 3080 @ 330w was 10-15% ahead" - Steve @ GN


Basically what I stated here yesterday, that overclocked 3080 would only be ~15% faster than overclocked 2080 Ti.
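Rough math behind that claim, as a sketch; the 30% stock-vs-stock gap and the ~12% 2080 Ti overclock are assumed round numbers, not GN's measured data:

```python
# Back-of-the-envelope check of the "watt-for-watt ~15%" claim.
# Inputs are assumed round numbers, not measured results.
stock_gap_4k   = 0.30   # 3080 over a stock 2080 Ti at 4K
oc_gain_2080ti = 0.12   # typical 2080 Ti overclock at ~330 W

watt_matched_gap = (1 + stock_gap_4k) / (1 + oc_gain_2080ti) - 1
print(f"3080 vs overclocked 2080 Ti: +{watt_matched_gap * 100:.0f}%")  # ~16%
```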
 

·
Registered
Joined
·
11 Posts
Scientists don't set out to prove a hypothesis, they set out to disprove it. If they (and several other scientists) fail to disprove it, it's accepted as a theory.
Roger that.

Repeatable results are also highly important. If one reviewer is not congruent with the others, that makes that reviewer suspect, not the other way around.

Gamers Nexus has the most robust methodology and explains it openly, which is largely why they are regarded as the most trusted reviewer on these things.

GN has made jokes and mentions many times that whenever they speak favorably or unfavorably of a brand, or disagree with a commonly held opinion, they get a lot of false accusations from people on social media of being bought. People get stuck on their bias and don't want to budge from it. It is fascinating to observe.

Also as a side note: Pointing out bias is NOT an ad hominem fallacy. Saying Digital Foundry is bought by Nvidia without proof of such is an ad hominem for example.
 

·
Registered
Joined
·
11 Posts
Scientists don't set out to prove a hypothesis, they set out to disprove it. If they (and several other scientists) fail to disprove it, it's accepted as a theory.

There is a metaphor about confirmation bias called the "Law of Fives." It states that you can derive the number 5 out of literally anything if you try hard enough, and that's the point: the more you look, the more you find it. For example, 8 is 2^3, and 2+3 = 5. 4 is 5-1, so there's another 5. For 16, take the digits 1 and 6: 1-6 = -5, and -5 × -1 = 5. I could go on forever, getting increasingly convoluted about finding a 5 in anything.

If someone wants the 3000 series to not be that good they can find all sorts of reasons to believe that.
If someone wants to find a reason to justify a previous purchase they only need to put the effort in to convince themselves.

"Humans are not rational animals. Humans are rationalizing animals." -Robert A Heinlein
 

·
Registered
Joined
·
690 Posts
Proof? NV gave them the exclusive ability to post carefully curated benchmarks and a pseudo-review showing the 3080 some 70-90% faster than the 2080.
But it is 70-90% faster than a 2080 non-Ti, non-Super. I believe the DF benches were at 4K; that seems to line up with what others are getting as well.
 

·
mfw
Joined
·
8,621 Posts
Oh that's an interesting attempt at a personal attack.
It's not an attempt at anything. I'm just stating what is observable. You refuse to watch reviews from publications that might have a different perspective from Jim's, claiming they're shills, even though their track records are far cleaner than AdoredTV's.

Nice blog, though.
 

·
Performance is the bible
Joined
·
7,043 Posts
+1

It's kind of ironic that these cards are effectively what AMD have been criticized for - and rightly so - over the past few generations:

Power hungry, hot, limited OC, and clocked aggressively out of the box to squeeze every bit of performance out.

NVidia are the new AMD?

I'm confused.

But +1 to your post as you're absolutely spot on.
Read your post wrong a bit.

But, we don't know yet.
Turing had limited OCing as well, but people eventually figured out how to get more out of them.
So far most don't know how to OC these well. We will see new versions of OCing tools as well as better drivers, and more people getting into it. And we'll see how the AIB cards turn out.
 

·
Registered
Joined
·
386 Posts
On average, the 3080 is 31.7% faster in 4K gaming compared to the 2080 Ti according to TechPowerUp, and several games are closer to 35% than 30%
Gamers Nexus showed the 3080 was around 24-26% faster at 4K than a stock 2080 Ti.

Then factoring in power consumption and the fact the 2080 Ti is superior at overclocking, the 3080 doesn't look very impressive to me. Quite the opposite.

Especially when you consider how well most 2080 Tis can overclock versus the <100 MHz overclocks many reviewers were able to achieve on their 3080s, and the limited gains those overclocks delivered.
 

·
Registered
Joined
·
830 Posts
Discussion Starter #77 (Edited)
Relevant to the discussion, and a conclusion of sorts, I originally posted this over at:

MowTin said:
vulcan1978 said:
I found Hardware Unboxed to have the most comprehensive review, and that's after watching GamersNexus' and LTT's reviews. They showed that the performance difference between the 3080 and 2080 Ti is only 20% at 1440p (30% at 4K) and they showed why this is the case (27:28 mark: at 4K the render time per frame leans more heavily on FP32 shaders than at lower resolutions). They also showed that 8GB of video memory is…

Also, EKWB has Strix 3080 / 3090 blocks and back-plates up for pre-order. Can anyone confirm whether or not the Strix will be priced at $1800?

Consider that I am currently at 3440x1440, which is closer to 2560x1440 than 3840x2160 (25% and 67% difference respectively), and that the 3080 is only 10-15% faster than the 2080 Ti when both cards are at the same power draw of 330W. That means I would be looking at maybe a 23% uplift at 3440x1440 with an overclocked 2080 Ti at the same power draw? (I believe the 10-15% cited by Steve means a 10% difference at 1440p and a 15% difference at 4K with the 2080 Ti overclocked at the same 330W power draw, but I could be mistaken.)

This means the 3080 is not a viable upgrade path for those of us with a 2080 Ti, and if the estimated 20% performance uplift of the 3090 is accurate, the 3090 may only be ~35% faster at 1440p vs the 2080 Ti. For $1500. The only way this would make any sense economically is that I have also pre-ordered HP's Reverb G2, whose resolution (2160x2160 per eye) would see something more like a ~45-50% increase in performance, but even then it's still a tough pill to swallow. For anyone not at 4K with a 2080 Ti, I would advise them to save their money. Basically, an overclocked 2080 Ti is roughly 10% slower than the 3080 at 1440p at the same power draw. This is Turing all over again; might as well replace the 2080 and 2080 Ti with the 3080 and 3090 if you are a 2080 Ti owner.

This is what a 3090 upgrade looks like it's going to cost me if I want a card with over 375W and a water-block available quickly: $1800 + $230 = $2030; add 8% sales tax, which brings that up to about $2200; add shipping and that's around $2240 or so. So $2240 for a 35% bump in frame-rate on my 3440x1440 ultrawide before the G2 arrives.

Yeah, no thanks. I think I'm going to sit this one out. $900 for the 2080 Ti XC2 + waterblock was a lot of money for a ~50% increase up and over the 1080 Ti. Another $2200 for another 35-50%? This is insane. And bear in mind, this is the only upgrade path if you have a 2080 Ti. How do people cheer-lead for this? Related:
This is basically my inner dialogue. I keep oscillating between "I should get a 3080," "I should get a 3090," and "I should just chill out and wait for the G2 and VR benchmarks."


I'm looking forward to MSFS 2020 in VR on the G2. But if it's CPU bottlenecked then it's not worth buying the 3090.

I would also like to see Assetto Corsa Competizione benchmarks in VR on the 3090.
For the G2 there will probably be a ~40% bump up and over the 2080 Ti, watt for watt: although the 3080 is on average 30% faster at 4K, this disparity decreases by about 10-15% when you run the 2080 Ti at the same power draw (see the Gamers Nexus link in the previous comment). The 3090 is the same chip with roughly 20% more SMs / CUDA cores, and rough math puts it at potentially 20% faster than the 3080.

The problem is that it has a 350W base TDP, which only goes up to 375W (FE and reference).

High-end AIB SKUs may do more, say 450W or so, but what we've learned with GA-102 is that the 3080 is heavily pre-overclocked from the factory at 320W out of 370W, with maybe another 7% of performance attainable at 370W, because they clocked GA-102 right at the point where further performance begins to require a non-linear increase in voltage and power.

It may not matter much if you can keep the card cool under a water block; take TU-102 for example. Allow me to elaborate: with my chip I can get away with 2040-2055 MHz at ~43C on the core at a 1.025V undervolt, drawing around 330W, going up to 350-373W on occasion. This card can do 2100 MHz, but not with an undervolt on the frequency curve (+100mV on the slider in MSI AB). Out of curiosity, I stood against a static background the other day while playing Middle Earth: Shadow of War, flipping back and forth between the two OC profiles via hotkey, and found that the 2100 MHz profile required an additional 35W or so (from 311W to 346W) for only 60 MHz more on the core! That means I'm probably at the very edge of the efficiency threshold for this chip at ~2000 MHz, and adding more frequency requires a non-linear increase in power from here on. Temps also went up by ~3C, and although running at 2100 MHz is good for around 400 points in Timespy GPU (16,700 vs 16,300), it only yielded a 1% gain in performance in Shadow of War.
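To put that in perf-per-watt terms, here's a small sketch using the power figures above; the framerates are illustrative, assuming the extra 60 MHz is worth only about 1%:

```python
# Perf-per-watt of the two 2080 Ti profiles described above.
# Power draw comes from the post; fps values are illustrative.
profiles = {
    "2040 MHz @ 1.025 V undervolt": {"watts": 311, "fps": 100.0},
    "2100 MHz @ +100 mV":           {"watts": 346, "fps": 101.0},
}
for name, p in profiles.items():
    print(f"{name}: {p['fps'] / p['watts']:.3f} fps/W")
# Result: the extra ~35 W buys ~1% more fps, i.e. roughly a 9% efficiency loss.
```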

Anyhow, that's TU-102 at 2100 MHz, roughly a 25% overclock, and some samples can do 2200 MHz with a little more voltage, or none at all depending on how good the sample is, making for a 30% overclock (13,600 Timespy GPU to 17,700-ish). The 3080 FE can only do a ~5-7% overclock, even with the fan maxed out and load temps not exceeding 60C at 370W. Meaning, they've clocked it 20% into its 30% overclock from the factory, because Samsung 8nm is underwhelming compared to TSMC 7nm and they had to do this to make it attractive from a performance perspective for those who skipped Turing or are upgrading from something other than a 2080 Ti.

Here, have a look, I'll post it again; here's the 2080 Ti @ 330W compared to the 3080 @ 330W:


"3080 @ 324w = 10-15% uplift versus 2080 Ti @ 330w" - Steve from GN

What has me worried is that it's the same story with the 3090.

So if you have a 2080 Ti, just deduct about 10% from the published performance metrics: running TU-102 at the same power draw shrinks that 20% gap at 1440p and 30% gap at 4K to roughly 10% and 20% respectively.

Add a theoretical 20% performance improvement between the 3080 and 3090 and the figures are:

+ ~30% @ 1440p and + ~40% @ 4K
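For transparency, here's the arithmetic those two estimates come from, stacked multiplicatively rather than just added; the 3090 uplift is the assumed 20%, not a measured number:

```python
# Stacking the watt-matched 2080 Ti adjustment with an assumed 3090 uplift.
# All inputs are this post's assumptions, not measured 3090 data.
gap_1440p   = 0.20   # 3080 over a stock 2080 Ti at 1440p
gap_4k      = 0.30   # 3080 over a stock 2080 Ti at 4K
oc_2080ti   = 0.10   # 2080 Ti overclocked to the same ~330 W power draw
uplift_3090 = 0.20   # assumed 3090 gain over the 3080

for res, gap in (("1440p", gap_1440p), ("4K", gap_4k)):
    est = (1 + gap) / (1 + oc_2080ti) * (1 + uplift_3090) - 1
    print(f"3090 vs overclocked 2080 Ti @ {res}: +{est * 100:.0f}%")
# Prints roughly +31% at 1440p and +42% at 4K.
```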

What is crazy is that, with a few exceptions, ray tracing isn't that much faster on the 3080 than on the 2080 Ti.

My overclocked Port Royal is only 10% shy of the 3080 = https://www.3dmark.com/pr/251502

If you can only really overclock the 3080 by another 7% or so, that widens the gap to maybe 20%? Certainly more in certain titles, but realistically no more than 30%.

AdoredTV is right: Nvidia was dumb to go with Samsung 8nm, and anyone who buys this mediocrity is also not thinking clearly.

It's not that much faster than a card on a node that is supposedly 50% larger (12nm).

Look at the gains going from Maxwell to Pascal, basically same architecture just dropping the node from 28 to 16nm (~40%).

Overclocked 1080 Ti was easily 50-60% faster than overclocked 980 Ti.

Hell my 2080 Ti is around 40-50% faster than my 1080 Ti and the node only shrunk 33% between Pascal and Turing!

These cards run way too hot and are already well into the power-efficiency curve; what you see is what you get, and not much more can be extracted from GA-102. We may see 20% overclocks on top-tier AIB SKUs under water (Kingpin @ 2.2 GHz @ 45C), but the days of getting a 30% overclock are over.

Honestly I'm over it, I've decided to completely sit this one out. I'm looking at spending nearly all of my savings for maybe a 30% bump @ 3440x1440 and maybe a 40% bump in VR with the G2.

I'm looking at at least $1700 for a FTW3 and then sitting around waiting a year for a water block (it took EKWB a year to make a block for the FTW3 after it released)? Or do I pay $1800 for the Strix and get a block immediately?

I've done the math and Strix + WB works out to $2230 after taxes and $40 worth of shipping.
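For anyone checking my math, here it is spelled out; the 8% tax rate and $40 shipping are my own local assumptions:

```python
# Strix 3090 + EKWB block upgrade cost, per the figures above.
# Tax rate and shipping are local assumptions.
card     = 1800.00   # rumored Strix 3090 price
block    = 230.00    # EKWB block + backplate
tax_rate = 0.08
shipping = 40.00

total = (card + block) * (1 + tax_rate) + shipping
print(f"Total: ${total:,.2f}")   # about $2,232
```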

For 30%?

No thanks.

I will wait. Hell at this point I will just turn settings down and wait 2 years for whatever comes after Fermi 2.0.

You may not like MLID, but I agree with his post-release conclusion: Ampere is underwhelming.

And for those saying that NV turned some kind of corner and lowered prices.

No they didn't.

They got those Samsung 8nm wafers for less than half the price of TSMC 7nm. They tried to bluff TSMC early in the year, TSMC called it, and they were stuck with Samsung. They passed the cost of that gamble onto you, the consumer, in the form of a card that, watt for watt, is only 10-15% faster than the outgoing 80 Ti card and has next-to-zero overclocking headroom. And wait until everyone sees how much their electricity bill goes up moving to a 370W component from whatever they had; if you had a 2080, your GPU's power draw under load just went up by roughly 50%. How much do you save over 2-3 years of ownership if you're paying more every month in electricity? But boy, that cooler shroud sure is pretty! Don't think about any of this, just focus on the cooler shroud.
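How much the power bill actually moves depends entirely on hours of use and local rates; here's a minimal sketch for plugging in your own numbers (the wattages are typical load draw for the cards, the usage and $/kWh are assumptions):

```python
# Ballpark the monthly electricity cost of a higher-wattage GPU.
# Card wattages are typical load figures; usage and rate are assumptions.
old_card_watts = 215      # e.g. RTX 2080 under load
new_card_watts = 320      # e.g. RTX 3080 under load
hours_per_day  = 4        # assumed gaming time
usd_per_kwh    = 0.15     # assumed electricity rate

extra_kwh_per_month = (new_card_watts - old_card_watts) / 1000 * hours_per_day * 30
print(f"Extra cost: ~${extra_kwh_per_month * usd_per_kwh:.2f} per month")
```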

Jensen is masterful. He doesn't even need to say "just buy it" over and over again; he just needs to show you an admittedly very nice-looking cooler, buy out a tech-tuber (Digital Foundry) to create carefully curated "benchmarks" that "show" the 3080 2x as fast as the 2080 and, hey, 65% faster than the 2080 Ti, and push the NDA back to one day before everything goes on sale, so posts like this get overlooked in the hysteria by people already committed after seeing the shroud and the carefully curated "benchmarks" from Digital Foundry / Nvidia. (I asked Digital Foundry in their comment section whether they are legally required to state if Nvidia paid them for their exclusive ability to release content two weeks before anyone else, and didn't receive a reply.)

Just as the only upgrade path for someone with a 1080 Ti in 2018 was a $1200-before-tax 2080 Ti, now the only upgrade path is a $1500-1700 3090. And Jim from AdoredTV is correct: the 3090 is not a Titan card, it's 100% the 80 Ti.

Factor in that NV have passed the cost of their costly gambit with TSMC onto you in the form of considerably higher electricity cost (compared to the same level of performance on TSMC 7nm; Jim from AdoredTV estimates that a ~425mm2 die on TSMC 7nm would be as fast as the 628mm2 GA-102 on Samsung 8nm, or about 50% more efficient, call it 3080 performance at ~220W). Now factor in that the actual watt-for-watt performance gap between the 2080 Ti and the 3080 is only 10-15%, basically the uplift between the 1080 Ti and the 2080. No one in their right mind was recommending upgrading from a 1080 Ti to the 2080 in 2018 for $700, but now, hey wow, look at that 3080! It's as though we have collective amnesia; we've lost all sense: "ooooh, look at that cooler!"

Anyhow, I think I've talked about this enough. That's my opinion, thanks for reading.


Sultan.of.swing said:
Guys, I would recommend against using MSFS as your basis for deciding which card to get.
The sim is highly CPU-bound and very unoptimized.
Yes that game should NOT ever be used as a GPU benchmark.
 

·
Overclocker in training
Joined
·
12,890 Posts
Hi,
Same YouTube video on just about every page, lol. Way to show your fetish for GN 🤣
 

·
Registered
Joined
·
3,651 Posts
Titans have always been early-sample xx80 Ti's; the 3090 is no different.
The 3080 Ti running a 3090 chip will come next year, as it has in every past generation.
 

·
Newb to Overclock.net
Joined
·
4,123 Posts
You sound so confident. :)
 