Overclock.net banner

In your opinion, is the GTX 1080 Founders a terrible value?

  • Yes

    Votes: 14 87.5%
  • No

    Votes: 2 12.5%
1 - 5 of 5 Posts

2,231 Posts
Discussion Starter · #1 ·
I know that this thread is going to attract some controversy, but do you think that the GTX 1080 is a totally poor value?

To be honest, Pascal fell below my expectations. The performance improvement is lower than what I expected from a jump from 28nm to the 16/20nm hybrid process (roughly 1.5 nodes' worth of shrinks). A big part of this is physics - we are at the point where the marginal benefit of each node shrink is going down as we approach the limits of silicon. Nonetheless, if you think about it, the gain is quite a bit below what past node shrinks delivered.

With the release of the Founders Edition, Nvidia has begun charging a premium on top of the already overpriced GTX 1080, which sells for $600 USD for custom AIB cards and $700 USD for the Founders Edition.
  1. $600 USD is way too much for a mid-ranged die, particularly when one considers that the last generation's cut-down big dies (the 780, later the 780Ti, and the 980Ti) sold for $650 USD; ditto for the $700. We know a big "Titan Pascal" is on its way, as is a cut-down version.
  2. The Founders reference design is not high-quality hardware at all and certainly not worth the price premium.
  3. Without a hardware scheduler, it will not age well in the DX12 era either.

Let me explain below.

$600 USD for a midranged die?
Since Fermi, Nvidia's strategy has been to release a roughly 300mm^2 GPU first on each new process, then move up to larger dies.
  • GTX 680 > Titan
  • 750Ti > GTX 980 > Titan X
  • GTX 1080 > big Pascal

Although it is true that cost-per-transistor scaling ended with 28nm, and 16/20nm may end up costing more per gate, I fail to see why a mid-ranged GPU like the GTX 1080 or 1070 should cost $600 USD (or worse, more than a "big die" from the previous generation, in the case of the Founders).

I'd be worried because the Titan set a very dangerous precedent back in 2013: Nvidia learned it can get away with high pricing because there is a large enough audience that will buy anyway. They charged a big premium for the GTX 680, a midranged card, at $500 USD, and later the GTX 980 at $550 USD. This card just sold out at $600 USD, $700 USD for the Founders. At this rate, we are going to see prices above $600 USD for future releases, and more still for a "Founders" edition.

We all know that the GTX 1070 will be a lot cheaper and a much better value. There is also the cut-down version of the big die (let's call it the 1080Ti until we know the real name). That will probably be around $650 USD (perhaps $750 USD now, with prices going up).

Unfortunately, it gets worse for the Founders. You are paying a price premium for a card that is worse than what an AIB can offer, and the AIBs will likely capture those same margins with their superior cards.

In the short term, this could be worsened by price gouging if there are shortages due to poor yields. It would not be the first time that has happened, nor is Nvidia the only vendor guilty of it (AMD had a huge price hike a couple of years ago due to mining, before the ASICs took over).

The Founders is not a good design at all
One of the things that struck me during the Pascal presentation was the number of times "craftsmanship" came up. Unfortunately, the PCB is not designed to handle a lot of current.

PCB Comparison
This is the GTX 1080 PCB:

This is not a PCB designed to handle the kind of current that overclocking would demand.

Anyways, independent reviews have shown that this will severely limit overclocking on the reference design:

PCGH tried testing with an Accelero Xtreme IV and found the GPU power limited:
It's power limited to 225W. Hopefully we will see independent AIB designs get better results. At this point, it remains to be seen whether Pascal scales well with voltage (like Fermi and Kepler did) or whether it is more like Maxwell (which didn't do well past about 1.25V). Depending on how restrictive Project Greenlight turns out to be, we may need the voltage unlocked via a custom BIOS.

But the point remains. This is an EVGA 1080 Classified PCB shot.

Unlike the reference, it is actually worth a price premium, in that you are paying for better quality components that could potentially overclock aggressively past the 225W limit. Not only more phases, but higher quality ones (those are DirectFET MOSFETs).

I'm sure other AIB PCBs will do better as well. The Lightning, the Hall of Fame, and Zotac's GTX 1080 PGF (to be sold as the Zotac 1080 AMP Extreme in the rest of the world) are all examples of better PCBs.

Cooling and clocks
Equally important, the custom AIB cards, which will likely use axial coolers, should be better able to sustain the boost clocks.

Note here:

Note that the GPU is not holding the boost clocks.

Hopefully a well-designed AIB card can resolve this with a good axial cooler. I suspect that an overclocked "big" Pascal will need water to sustain its overclocks. The point I want to make, though, is that this cooler cannot sustain the 2GHz+ boost we saw at the demo. To do that, we will likely need a custom AIB PCB with a very well designed axial cooler (probably a 3-slot monster like the 290X Lightning pictured below), or water plus a custom PCB. Keep that in mind if you do watercool: the 225W limit makes the reference card pointless for watercooling, so you'll need a custom AIB card, ideally one with a waterblock (unless you plan to use a universal block anyway).

This GPU is not going to age very well
The main reason is that after the power-inefficient Fermi, Nvidia abandoned the hardware scheduler in favor of a software scheduler starting with Kepler. That brought good power efficiency and die space savings in the DX11 era, but it is now a drawback in the DX12 era. Until Volta is released, Nvidia's GPUs cannot do async compute.

We do seem to see performance fall off over time due to Nvidia's heavier reliance on software-based scheduling in its drivers.

  • When the first GCN GPU, the HD 7970, came out, it was regarded as inferior to the GTX 680: hotter and slightly slower, forcing AMD to respond with a GHz Edition. That has not proven true with time, and the 7970 has made huge relative gains, particularly in console ports, since the consoles themselves run GCN-based GPUs.
  • The 290X was slightly slower than the 780Ti and used more power (although I should note they were more or less tied at 4K). Again, the 290X has made huge relative gains since 2013. If Ashes is to be believed, it may end up near the 980Ti in some DX12 games.
  • Finally, we are seeing the GCN GPUs make considerable gains over Maxwell. I expect that by 2018, we will see the Fury X overtake the 980Ti, save perhaps where the 4GB of HBM becomes an issue. Similarly, I expect that due to the async compute, the 290X/390X (Hawaii) will likely overtake the midranged 980/970 Maxwell GPUs.
Maxwell will likely follow the same trajectory that Kepler did, and so will Pascal once Volta comes out (still no hardware scheduler, although the SM-level design does resemble GCN more than previous iterations). Driver optimizations will no longer be a priority for Maxwell, and without a hardware scheduler it cannot keep up.

In many ways, this is worse than the GTX 680, because DX12 is only a couple of years from being mainstream. If you plan on keeping the GPU for more than say, 2-3 years, with more DX12 titles coming, this could be a problem.

This may be an extreme case: http://www.extremetech.com/gaming/223567-amd-clobbers-nvidia-in-updated-ashes-of-the-singularity-directx-12-benchmark

But it does look like until Nvidia releases Volta, AMD will have a pretty big advantage in DX12. Perhaps not as big as Ashes (although I suspect that if you are an RTS and 4X lover like I am, AMD will do very well), but still substantial nonetheless.

Personally, I'm hoping that the rumors that AMD has pulled Vega forward to late 2016 are true.

Concluding remarks
I'm not saying you should not buy a GTX 1080, but there are some pretty big drawbacks right now to doing so. I made this thread to draw attention to those. If you want to buy, buy as an informed consumer.

The idea that the GTX 1080 is "65C, cool as a cucumber!" (which is what JHH said at the Pascal presentation) is factually inaccurate, at least with the reference PCB and cooler, which is what was implied.

I get that it is his job to build excitement, but the hardware quality is lacking. Even in contrast to the Fury X, which was a bad overclocker and had coil whine problems, at least its power delivery was designed to handle the load, and arguably the AIO as well. The same goes for the >2 GHz tech demo: that isn't going to happen unless you have a very well cooled, voltage-unlocked, custom PCB.

In conclusion, I am forced to say that this is not a good value for most consumers. At the very least, wait for the AIB GPUs which will be cheaper and better. I think that Polaris 10/11 will be aiming for the mid-low end market (although it may do surprisingly well in DX12 benchmarks). I think that most enthusiasts should wait. Wait for custom PCBs of "big" Polaris and Vega. Wait for the benchmarks, particularly the DX12 ones.

Top kek
3,673 Posts
First, this is not a midrange GPU, it's a high-end one. Second, they are charging not $600, but $700 for the reference GPU. Third, the AIBs will sell their models for $700+.

My answer: Yes, it's a terrible pick, totally not worth it. As shown in the TPU pictures of the temps and boost clocks, this card is not quiet or cool at all, and the overhyped "2GHz clocks" are a total whiff. Yes, AIB models will be able to hold higher clocks before throttling, but they will not hit 2GHz without water. Which means a water block and so on, which means more $$$.
  • Rep+
Reactions: LAKEINTEL

313 Posts
They are charging $700 for a 314mm^2 die, which is very high historically. It marginally beats the high end from the previous generation; yes, it's a 60-100% performance gain over the card it replaces, but even that card was a significantly bigger die, with the 980 at 394mm^2. The 1070 is also quite a price hike from the 970, which launched at $329, versus a $379 MSRP for the 1070 with the FE costing more.

It's more performance on a new node, but a big part of why the gain isn't what we might have expected is the significant decrease in die size that came with the new node: 80% the size of the previous generation's, so we got far fewer extra transistors and less hardware than we might have expected at this price point. Based on die size, it ought to be entering the market quite a bit cheaper than the 970/980 did. Even granting that the price of a transistor has gone up about 12%, the chip shrank 20%, so to offer equivalent value on manufacturing cost it would need to be roughly 10% cheaper (0.80 x 1.12 ≈ 0.90), which makes the significant price hike all the more notable.
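To put numbers on that cost argument, here's a quick back-of-the-envelope check. The die sizes and the ~12% per-transistor cost increase are the figures quoted in this post, not official foundry data:

```python
# Rough sanity check of the manufacturing-cost argument above.
# All inputs are this post's figures, not official numbers.

gtx980_die_mm2 = 394.0   # GM204 area, as quoted above
gtx1080_die_mm2 = 314.0  # GP104 area

area_ratio = gtx1080_die_mm2 / gtx980_die_mm2   # ~0.80, "80% the size"
cost_increase = 1.12                            # ~12% more per transistor on 16nm

relative_cost = area_ratio * cost_increase      # ~0.89 of GM204's cost
print(f"GP104 is {area_ratio:.0%} the area of GM204")
print(f"Relative manufacturing cost: {relative_cost:.0%}")
print(f"Equivalent-value price cut: {1 - relative_cost:.0%}")
```

In other words, even with pricier transistors, the smaller die should cost roughly 10% less to make than GM204, yet the card launched $50-150 higher.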

Less hardware for more money is not exactly a good deal. But what are your options? You can go with AMD's mainstream release and get a 214mm^2 die that sits around today's GPU performance, offering nothing we haven't seen before, but at a lower price; or you can pay Nvidia's ever-increasing premium for a card that does raise the performance bar and adds a ton of potentially cool features like SMP, which I fear will never get used. If you stick with the 28nm cards it's a worse deal still: the 980Ti is more expensive and slower.

My main concern for the coming generation of cards is DX12. Async compute is potentially a problem for Nvidia, but since they refuse to talk about it, and we have no idea how much it will be used, it is really hard to gauge how much it will matter; it could be a lot or it could be nothing.

What really concerns me about DX12, however, is the way dual adapters work in it. Handing the dual-card keys to the developers was an atrocious idea: they have zero incentive to support dual cards, and they regularly produce horrifically bad ports that take very little advantage of the PC and its features. When we hand dual-adapter support to them, we effectively kill off dual cards. This, for me, is the real problem with DX12: it is going to kill off pairing two lower-performance cards. Where 2x970 was, in hindsight, far better than a single 980, the future of 2x1070 is much more of a gamble versus the 1080. The 1070 is cut down more than the 970 was, and it is more expensive. You could be down 30% if SLI does not scale, with an upside of just 40%, whereas with the 970/980 the downside was 15-20% and the upside was 60-70%.

Ever-increasing numbers of titles use graphics features that don't scale, common engines don't support multi-GPU, and VR, despite really needing it, shows no sign of adopting last year's solutions, let alone this year's. It's a bitter pill to swallow, but more and more it looks like dual cards are dead, just when we need them most.
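The dual-card gamble can be sketched as a simple expected-value calculation. The upside/downside percentages are the rough figures from this post, not benchmark data, and the scaling rates are purely hypothetical:

```python
# Expected performance of a dual-card setup vs. the single bigger card,
# using this post's rough figures (not benchmark data).

def expected_gain(upside, downside, scaling_rate):
    """Average gain vs. the single card, given the fraction of
    titles where SLI actually scales."""
    return scaling_rate * upside + (1 - scaling_rate) * downside

# 2x1070 vs 1080: +40% when SLI scales, -30% when it doesn't.
# 2x970 vs 980 (last gen): +60-70% when it scaled, -15-20% when it didn't.
for rate in (0.5, 0.7, 0.9):
    pascal = expected_gain(0.40, -0.30, rate)
    maxwell = expected_gain(0.65, -0.175, rate)
    print(f"{rate:.0%} of titles scale: 2x1070 {pascal:+.0%}, 2x970 {maxwell:+.0%}")
```

Even if 70% of titles scaled, 2x1070 would only average about +19% over a single 1080, while 2x970 averaged about +40% over a 980 under the same assumption. The gamble has clearly gotten worse.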

It's objectively a worse deal this generation, with both the 1080 and the 1070, than Nvidia's previous upper-mainstream releases.

6,479 Posts
The problem is that Nvidia successfully deceived people into thinking that a $700 card is faster than the Titan X by a large margin, which is definitely not true. As a direct result, the Titan X and 980Ti dropped in value like rocks, even though they are still excellent cards at 1440p, and in some cases 4K, with a modded BIOS and an AIB cooler.

I wouldn't have bought the 1080 and sold my Titan X, but the current perceived worth of the GM200 cards forced my hand.

Premium Member
11,045 Posts
Originally Posted by ku4eto

Firts, this is not a midrange GPU, its a high-end one. Second, they are charging the not 600$, but 700$ for the reference GPU. Third, the AIBs will sell their models for 700$+.
It is a midrange GPU; it just doesn't seem like it because the rest of the Pascal family hasn't launched yet. The same thing happened to the GTX 680: a midranged card sold at flagship prices, and then the Titan came along and made it not only worth less but quickly forgotten. This is the future for the 1080: short-lived.

It is a solid GPU, but its performance is sadly way less than we all expected.
  • Rep+
Reactions: LAKEINTEL