Try some more robust and higher-level benchmarking. I love LTT, but he can be silly and biased sometimes.
- Gamers Nexus
- Digital Foundry
> Try some more robust and higher-level benchmarking. I love LTT but he can be silly and biased sometimes.

Yeah no, the hierarchy goes like this:
- Gamers Nexus
- Digital Foundry
> I'm not going to watch anything from Digital Foundry. Hopefully NV compensated them well enough for their paid sponsorship and hyping of Ampere, because I no longer hold them as a trustworthy source of information.

Only watching the reviews that confirm your pre-existing bias is THE EXACT DEFINITION of confirmation bias.
> Only watching the reviews that confirm your pre-existing bias is THE EXACT DEFINITION of confirmation bias.

Indeed. Cherry-picking is a keystone of confirmation bias.
> Only watching the reviews that confirm your pre-existing bias is THE EXACT DEFINITION of confirmation bias.

Oh, that's an interesting attempt at a personal attack.
Well, well, well: the 3080 does not clock higher, just as I stated, and those leaked benches? Yeah, those were legit, making the 3080 only 25% faster than the 2080 Ti in rasterization.
> Oh that's an interesting attempt at a personal attack.

Scientists don't set out to prove a hypothesis; they set out to disprove it. If they (and several other scientists) fail to disprove it, it's accepted as a theory.
Oh wait, so when scientists have a hypothesis and they set out to test said hypothesis and wind up proving it, are they castigated by their colleagues for having "confirmation bias"?
> Scientists don't set out to prove a hypothesis, they set out to disprove it. If they (and several other scientists) fail to disprove it, it's accepted as a theory.

Roger that.
> Proof? NV gave them the exclusive ability to post carefully curated benchmarks and a pseudo-review showing the 3080 some 70-90% faster than the 2080.

But it is 70-90% faster than a 2080 non-Ti and non-Super. I believe the DF benches were at 4K, which seems to line up with what others are getting as well.
> Oh that's an interesting attempt at a personal attack.

It's not an attempt at anything; I'm just stating what is observable. You refuse to watch reviews from publications that might have a different perspective from Jim's, claiming they're shills, even though their track records are far cleaner than AdoredTV's.
Read your post wrong a bit. +1
It's kind of ironic that these cards are effectively what AMD has been criticized for, and rightly so, over the past few generations: power hungry, hot, limited OC headroom, and clocked aggressively out of the box to squeeze out every last bit of performance.
NVIDIA are the new AMD?
But +1 to your post, as you're absolutely spot on.
> On average, the 3080 is 31.7% faster in 4K gaming compared to the 2080 Ti according to TechPowerUp, and several games are closer to 35% than 30%. (www.techpowerup.com)

Gamers Nexus showed the 3080 was around 24-26% faster at 4K than a stock 2080 Ti.
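Part of the gap between the 24-26% and 31.7% headline figures comes down to game selection and how the per-game results are averaged. A minimal sketch of that effect, using made-up FPS numbers (not taken from either review):

```python
# Hypothetical FPS numbers, for illustration only: shows how the choice of
# averaging method shifts the headline "percent faster" figure.
from statistics import geometric_mean

fps_3080 = {"Game A": 112, "Game B": 96, "Game C": 140}
fps_2080ti = {"Game A": 90, "Game B": 80, "Game C": 102}

# Per-game speedup ratios of the 3080 over the 2080 Ti.
ratios = [fps_3080[g] / fps_2080ti[g] for g in fps_3080]

arith = sum(ratios) / len(ratios)  # simple average of per-game ratios
geo = geometric_mean(ratios)       # geometric mean, common in GPU reviews

print(f"arithmetic mean: {(arith - 1) * 100:.1f}% faster")
print(f"geometric mean:  {(geo - 1) * 100:.1f}% faster")
```

The geometric mean damps the influence of one or two outlier titles, which is why two outlets can test overlapping games and still publish different averages.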
For the G2 there will probably be a ~40% bump over the 2080 Ti watt-for-watt: although the 3080 is on average 30% faster at 4K, that disparity shrinks by about 10-15% when you run the 2080 Ti at the same power draw (see the Gamers Nexus link in the previous comment). The 3090 is the same chip with roughly 20% more SMs / CUDA cores, and rough math puts it at potentially 20% faster than the 3080.

MowTin said:
> This is basically my inner dialogue. I keep oscillating between: I should get a 3080, I should get a 3090, and I should just chill out and wait for the G2 and VR benchmarks.

vulcan1978 said:
> I found Hardware Unboxed to have the most comprehensive review, and that's after watching the GamersNexus and LTT reviews. They showed that the performance difference between the 3080 and 2080 Ti is only 20% at 1440p (30% at 4K), and they showed why this is the case (27:28 mark: at 4K, the render time per frame is heavier on FP32 shaders than at lower resolutions). They also showed that 8GB of video memory is […]. Also, EKWB has Strix 3080 / 3090 blocks and back-plates up for pre-order. Can anyone confirm whether or not the Strix will be priced at $1800?
> Consider that I am currently at 3440x1440, which is closer to 2560x1440 than to 3840x2160 (25% and 67% difference, respectively), and that the 3080 is only […] faster than the 2080 Ti when both cards are at the same power draw of 330w. That means I would be looking at maybe a 23% uplift at 3440x1440 versus an overclocked 2080 Ti at the same power draw? (I believe the 10-15% cited by Steve means there was a 10% difference at 1440p and a 15% difference at 4K running the 2080 Ti overclocked at the same power draw of 330w, but I could be mistaken.) This means the 3080 is not a viable upgrade path for those of us with a 2080 Ti, and if the estimated 20% performance uplift of the 3090 is accurate, the 3090 may only be ~35% faster at 1440p vs the 2080 Ti. For $1500.
> The only way this would make any sense economically is that I have also pre-ordered HP's Reverb G2, whose resolution would see something more like a ~45-50% increase in performance (2160x2160 per eye), but even then it's a tough pill to swallow. For anyone not at 4K with a 2080 Ti, I would advise saving your money. Basically, an overclocked 2080 Ti is roughly 10% slower than the 3080 at 1440p at the same power draw. This is Turing all over again; they might as well replace the 2080 and 2080 Ti with the 3080 and 3090 if you are a 2080 Ti owner.
> This is what a 3090 upgrade looks like it's going to cost me if I want a card over 375w with a water-block available quickly: $1800 + $230 = $2,030; add 8% sales tax and that's up to around $2,200; add shipping and it comes to roughly $2,240. So ~$2,240 for a 35% bump in frame-rate on my 3440x1440 ultrawide before the G2 arrives. Yeah, no thanks. I think I'm going to sit this one out. $900 for the 2080 Ti XC2 + waterblock was a lot of money for a ~50% increase over the 1080 Ti. Another $2,200 for another 35-50%? This is insane. And bear in mind, this is the only upgrade path if you have a 2080 Ti. How do people cheer-lead for this?
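The cost and uplift arithmetic above can be sanity-checked in a few lines. Every input is the thread's own estimate (the $1800 Strix price is unconfirmed in the thread), not a measured benchmark:

```python
# Back-of-envelope check of the upgrade math quoted above.
# All inputs are the thread's own estimates, not measured results.

card = 1800        # rumored Strix 3090 price (unconfirmed)
waterblock = 230   # EKWB block + back-plate estimate
tax_rate = 0.08    # 8% sales tax

total = (card + waterblock) * (1 + tax_rate)
print(f"pre-shipping total: ${total:,.2f}")

# Chained relative-performance estimate at 1440p versus an overclocked
# 2080 Ti at the same power draw: 3080 ~ +10%, and 3090 ~ +20% over the
# 3080 (rough SM-count scaling; real scaling is rarely this linear).
uplift = 1.10 * 1.20 - 1
print(f"estimated 3090 vs OC 2080 Ti at 1440p: {uplift * 100:.0f}%")
```

The chained estimate lands at 32%, in the same ballpark as the ~35% figure in the post; the difference is just rounding in the input percentages. The dollar total comes out near $2,192 before shipping, which is where the "roughly $2,240 shipped" figure comes from.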
I'm looking forward to MSFS 2020 in VR on the G2. But if it's CPU bottlenecked then it's not worth buying the 3090.
I would also like to see Assetto Corsa Competizione benchmarks in VR on the 3090.
Sultan.of.swing said:
> Guys, I would recommend against using MSFS as your basis for deciding which card to get. The sim is highly CPU-bound and very unoptimized.

Yes, that game should NOT ever be used as a GPU benchmark.