Honestly, the product itself appears to be great. Yes, a card drawing 50% of the power of a flagship with similar performance is also great. The issue is simply pricing. For an xx70 series card, it should be cheaper; $699 makes more sense in the context of last gen if you factor in high inflation.
People need to forget the naming, kinda.
1920 CCs | G5 | 8GB | 256-bit | 150W | $450 = 1070
2432 CCs | G5 | 8GB | 256-bit | 180W | $450 = 1070 Ti (This card is literally a 1080 but with cheaper memory)
2560 CCs | G5X | 8GB | 256-bit | 180W | $600 = 1080
3584 CCs | G5X | 11GB | 352-bit | 250W | $700 = 1080 Ti
3840 CCs | G5X | 12GB | 384-bit | 250W | $1200 = Titan Xp (This card is basically the GTX 1090)
There is a clear difference between the 1070, 1080 and 1080 Ti (1920 > 2560 > 3584 and 3840 CCs on Titan Xp)
The 1070 Ti is basically a re-release of the 1080, so we can look at it like this:
1070 > 1080 > 1080 Ti > 1090
2304 CCs | G6 | 8GB | 256-bit | 175W | $500 = 2070
2560 CCs | G6 | 8GB | 256-bit | 215W | $500 = 2070 SUPER (This card is literally a 2070 but factory overclocked)
2944 CCs | G6 | 8GB | 256-bit | 215W | $700 = 2080
3072 CCs | G6 | 8GB | 256-bit | 250W | $700 = 2080 SUPER (This card is literally a 2080 but factory overclocked)
4352 CCs | G6 | 11GB | 352-bit | 250W | $1000 = 2080 Ti
4608 CCs | G6 | 24GB | 384-bit | 280W | $2500 = Titan RTX (This card is basically the RTX 2090)
There is a clear difference again, between the 2070, 2080 and 2080 Ti (2304 > 2944 > 4352 and 4608 CCs on Titan)
You can ignore the SUPER models; they are basically the same cards with a higher out-of-the-box power limit. Then we can look at the 20 series like this:
2070 > 2080 > 2080 Ti > 2090
5888 | G6 | 8GB | 256-bit | 220W | $500 = 3070
6144 | G6X | 8GB | 256-bit | 290W | $600 = 3070 Ti (This card just has faster memory and higher overclock)
8704 | G6X | 10GB | 320-bit | 320W | $700 = 3080
8960 | G6X | 12GB | 384-bit | 350W | $800 = 3080 12GB (This is basically 3070 Ti and the 3080 is 3070)
10240 | G6X | 12GB | 384-bit | 350W | $1200 = 3080 Ti
10496 | G6X | 24GB | 384-bit | 350W | $1500 = 3090 (This is basically a Titan)
10752 | G6X | 24GB | 384-bit | 450W | $2000 = 3090 Ti (This is a 3090 but overclocked)
These are the normal jumps, 5888 > 8704 > 10240.
7680 | G6X | 12GB | 192-bit | 285W | $800 = 4070 Ti
9728 | G6X | 16GB | 256-bit | 320W | $1200 = 4080
16384 | G6X | 24GB | 384-bit | 450W | $1600 = 4090
Again the jumps are there, but with an exception for the 4090, which is absurd.
Let's try to summarize it,
GTX 1000 Series
1920 CCs for the 1070 | G5 | 8GB | 256-bit | 150W | $450
2432-2560 CCs for the 1070 Ti and 1080 | G5/G5X | 8GB | 256-bit | 180W | $450/600
3584-3840 CCs for the 1080 Ti and Titan | G5X | 11/12GB | 352/384-bit | 250W | $700/1200
OK, so we clearly have 3 separate GPUs: 1070, 1080 and 1090
1070 = 1920 CCs | 8GB | 256-bit | 150W | $450
1080 = 2432 CCs | 8GB | 256-bit | 180W | $600
1090 = 3584 CCs | 11GB | 352-bit | 250W | $700
What is noteworthy here is how little power the 70 and 80 cards consumed, especially the 70; that card can run on a potato VRM (extremely cheap to build, which is why the card itself could be sold for just $450).
RTX 2000 Series
2070 = 2304 CCs | G6 | 8GB | 256-bit | 175W | $500
2080 = 2944 CCs | G6 | 8GB | 256-bit | 215W | $700
2090 = 4352 CCs | G6 | 11GB | 352-bit | 250W | $1000
Again, we clearly have 3 separate GPUs.
Differences:
1070 to 2070 = Faster memory, 25W higher power consumption, $50 higher price
1080 to 2080 = Same memory, 35W higher power consumption, $100 higher price
1090 to 2090 = Same memory, same power consumption, $300 higher price
RTX 3000 Series
5888-6144 CCs for the 3070 and 3070 Ti | G6/G6X | 256-bit | 220/290W | $500/600
8704-8960 CCs for the 3080 and 3080 12GB | G6X | 320/384-bit | 320/350W | $700/800
10240-10752 CCs for the 3080 Ti, 3090 and 3090 Ti | G6X | 384-bit | 350/450W | $1200/1500/2000
As expected.
3070 = 5888 CCs | G6 | 256-bit | 220W | $500
3080 = 8704 CCs | G6X | 320-bit | 320W | $700
3090 = 10240 CCs | G6X | 384-bit | 350W | $1200
Differences:
2070 to 3070 = Same memory (G6 on both, per the table above), 45W higher power consumption, same price
2080 to 3080 = Same memory, 105W higher power consumption, same price
2090 to 3090 = Same memory, 100W higher power consumption, $200 higher price
Now we can see that what really changes is the power consumption.
RTX 4000 Series
So, the 4070 Ti was originally named the 4080 12GB, but it was clearly never a 4080, since it has far fewer CCs. It perfectly matches a 4070, so that is what I will call it here: a 4070.
Differences:
3070 to 4070 = Faster memory, 65W higher power consumption, $300 higher price
3080 to 4080 = Same memory, same power consumption, $500 higher price
3090 to 4090 = Same memory, 100W higher power consumption, $400 higher price
It is definitely a bit disingenuous to compare the 3090 to the 4090, as the 3090 is a "Ti" of the 3080 Ti. What I am comparing is the slowest card of that CC range; we just have to wait for a 4080 Ti with maybe about 15K CCs.
On one hand it makes sense that the cards cost more, because the power draw keeps increasing significantly, and supporting that costs money (beefier PCB/VRM).
Another summarization,
1070/Ti = 25% fewer CCs than the middle child (1080)
2070/Ti = 13% fewer CCs than the middle child (2080)
3070/Ti = 29% fewer CCs than the middle child (3080)
4070/Ti = 21% fewer CCs than the middle child (4080)
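Those deficits fall straight out of the CC counts in the spec lists above. A quick sketch in Python; which xx70 variant is paired against which xx80 is my reading of the post (original 1080 at 2560 CCs, 2070 SUPER at 2560, 3070 Ti at 6144, 4070 Ti at 7680):

```python
# Percent CC deficit of each xx70-class card vs. its generation's xx80.
# CC counts are taken from the spec lists earlier in the post.
pairs = {
    "1070 vs 1080":        (1920, 2560),
    "2070 SUPER vs 2080":  (2560, 2944),
    "3070 Ti vs 3080":     (6144, 8704),
    "4070 Ti vs 4080":     (7680, 9728),
}
for name, (small, big) in pairs.items():
    deficit = (1 - small / big) * 100
    print(f"{name}: {deficit:.0f}% fewer CCs")  # 25, 13, 29, 21
```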
1070 = 150W
2070 = 215W (+65W from 1070/Ti)
3070 = 290W (+75W from 2070/Ti)
4070 = 285W (-5W from 3070/Ti)
1070/Ti = 8GB
2070/Ti = 8GB
3070/Ti = 8GB
4070/Ti = 12GB (+4GB)
1070/Ti = $450
2070/Ti = $500 (+$50)
3070/Ti = $600 (+$100)
4070/Ti = $800 (+$200)
1070/Ti is cut down a lot from the xx80, pulls very little power and is cheap
2070/Ti is barely cut down from the xx80, pulls a lot more power but is still cheap
3070/Ti is cut down a lot from the xx80, pulls a lot more power but costs more
4070/Ti is not cut down as much from the xx80, pulls less power and has 50% more VRAM, but costs a lot more
3DMark Time Spy Graphics Score
1070/Ti = 5700
2070/Ti = 10100 (+77% higher performance than 1070/Ti)
3070/Ti = 14500 (+44% higher performance than 2070/Ti)
4070/Ti = 22500 (+55% higher performance than 3070/Ti)
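The gen-over-gen percentages above can be sanity-checked from the raw scores; a minimal sketch using the numbers as quoted:

```python
# Gen-over-gen 3DMark Time Spy Graphics gains for the xx70-class cards,
# using the scores quoted above.
scores = [("1070/Ti", 5700), ("2070/Ti", 10100),
          ("3070/Ti", 14500), ("4070/Ti", 22500)]
for (prev_name, prev), (name, cur) in zip(scores, scores[1:]):
    gain = (cur / prev - 1) * 100
    print(f"{name}: +{gain:.0f}% over {prev_name}")  # +77, +44, +55
```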
This is really important to note: for the 3070 to be 44% faster than the 2070, power consumption had to increase by 75W, but for the 4070 to be 55% faster than the 3070, power consumption could be REDUCED by 5W instead of adding another 75W. On top of that it has 4GB more VRAM. Just those two things combined explain why the card costs more. NVIDIA managed to squeeze out more performance than last gen without increasing power consumption, and actually reduced it; that's extremely impressive, and it shows: the TUF in the GN review ran 61°C average at 290W with the fans at just 1400RPM. That's a crazy amount of performance for such a low temperature and power consumption.
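One way to make the efficiency point concrete is points per watt, combining the Time Spy scores with the power figures from the summary lists (pairing the 150W figure with the 1070/Ti score is my assumption; the Ti pulled 180W):

```python
# Time Spy Graphics points per watt for the xx70-class cards,
# using the scores and power figures quoted in the post.
cards = [("1070/Ti", 5700, 150), ("2070/Ti", 10100, 215),
         ("3070/Ti", 14500, 290), ("4070 Ti", 22500, 285)]
for name, score, watts in cards:
    print(f"{name}: {score / watts:.0f} points/W")  # 38, 47, 50, 79
```

The jump from ~50 to ~79 points/W is the generational efficiency gain the paragraph above is describing.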
I would have praised NVIDIA for the 4070 Ti if the price had been $699. That would have been a $100 bump from last gen, but with 4GB more VRAM and lower power consumption while still delivering 55% higher performance; as said, impressive. If you want a cheaper card, just wait for the xx60.
xx80 summarization,
1080 = 28% fewer CCs than big brother (1090)
2080 = 32% fewer CCs than big brother (2090)
3080 = 15% fewer CCs than big brother (3090)
4080 = 40% fewer CCs than big brother (4090)
1080 = 180W
2080 = 215W (+35W from 1080)
3080 = 320W (+105W from 2080)
4080 = 320W
1080 = 8GB
2080 = 8GB
3080 = 10GB (+2GB)
4080 = 16GB (+6GB)
1080 = $600
2080 = $700 (+$100)
3080 = $700
4080 = $1200 (+$500)
1080 is cut down a lot from the xx90, pulls very little power and is cheap
2080 is cut down a lot from the xx90, pulls little power but costs more
3080 is barely cut down from the xx90, pulls a lot more power and has 25% more VRAM and costs the same
4080 is cut down significantly from the xx90, pulls the same power and has 60% more VRAM but costs a lot more
3DMark Time Spy Graphics Score
1080 = 7300
2080 = 11000 (+50% higher performance than 1080)
3080 = 17800 (+62% higher performance than 2080)
4080 = 27800 (+56% higher performance than 3080)
Again, NVIDIA managed to squeeze out 56% higher performance without increasing power consumption, and on top of that increased VRAM not by 25% like last generation, but by 60%! It makes a lot of sense why the card costs a lot more, but that still doesn't justify what it costs. It also makes sense why the 3080 cost the same as the 2080: the 3080 is clearly a "bad card", since the only way they could achieve 62% higher performance was to dramatically increase power consumption by 49% and barely increase VRAM size. The card simply ran a lot hotter and louder, and partner cards cost more.
The 3080 was clearly an inferior product to the 4080, so it makes complete sense why the 3080 didn't increase in price and the 4080 did. But who is to say how much that improvement is worth? $500 more is a lot (71%). I would probably have been completely fine with them charging $999; then it'd be $300 more for a much better product: 60% larger VRAM and 56% faster without adding any power/heat/noise. Gaming 6 hours a day for 4 years (skipping 1 gen) saves about 880kWh; at 15 cents per kWh, that's $132 saved from NVIDIA not increasing power consumption by 100W to achieve the 50-62% performance target gen over gen. So let's say the card was $999 and you saved $100+ over the time you kept it; then it'd effectively be $900, so "just" $200 more than previous gen. That sounds fantastic to me. Too bad it's not $999 though; $1199 is definitely too high.
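The energy math above checks out; the exact figures are ~876kWh and ~$131, which round to the 880kWh/$132 quoted. A quick sketch under the same assumptions (100W avoided, 6 hours of gaming per day, 4 years, $0.15/kWh):

```python
# Rough energy-cost savings from the 4080 holding power at 320W
# instead of adding ~100W, under the post's assumptions.
watts_saved = 100            # W not added gen over gen
hours = 6 * 365 * 4          # 6h/day for 4 years
kwh_saved = watts_saved / 1000 * hours
cost_saved = kwh_saved * 0.15  # $0.15 per kWh
print(f"{kwh_saved:.0f} kWh saved -> ${cost_saved:.0f}")  # 876 kWh -> $131
```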
xx90 summarization,
1090 = 250W
2090 = 250W
3090 = 350W (+100W higher power consumption)
4090 = 450W (+100W higher power consumption)
1090 = 11GB
2090 = 11GB
3090 = 12GB (+1GB)
4090 = 24GB (There is no Ti yet with less VRAM)
1090 = $700
2090 = $1000 (+$300)
3090 = $1200 (+$200)
4090 = $1600 (There is no Ti yet that shaves off a few hundred)
1090 = 9500
2090 = 13600 (+43% higher performance than the 1090)
3090 = 20100 (+48% higher performance than the 2090)
4090 = 29200 (There is no Ti yet; this card is also being CPU-bottlenecked)
Can't really comment on the 4090 since there is no cheaper alternative in the 4090 family (a cut-down 4090 that would be called 4080 Ti).
The RTX 2080 Ti can be commented on though: same power consumption and 43% higher performance is very impressive, and it cost $300 more (+43%). Same situation as now; NVIDIA didn't have to increase power consumption to get a near 50% performance leap gen over gen. The 3090 (3080 Ti) can definitely be seen as a failed card: it got a little more performance than the gen before it, but had to add 100W of power consumption to do it, which also increased the price, as it costs money to beef up the VRM to handle that TDP.
The conclusion is that people should have been far more pissed at the RTX 30 series costs, as those cards only reached their performance targets by increasing power/heat/noise, and we had to "pay" for that. This time when we pay more we at least get something for it: no additional power/heat/noise, but the same performance target increase. I'm absolutely fine with them increasing prices, just not this much. $699 for the 4070 Ti, $999 for the 4080, $1249 for the 4080 Ti and $1499 for the 4090 sounds reasonable... maybe.
Either way, no one can complain about 4070 Ti prices right now, as they cost new what a used 3090 Ti costs. In a few weeks the 3090/3090 Ti prices should have come down a fair bit, and by that point the 4070 Ti won't be as appealing anymore. But for now you'd be out of your mind to buy a used 3090/3090 Ti for the same price as a 4070 Ti: roughly the same performance, but at 200W higher power consumption (and the increased heat/noise that comes with it).