Originally Posted by Snabeltorsk
Because the power consumption would be higher than intended.
Originally Posted by juniordnz
So they cap the card to maintain a low power consumption? Even though the GPU they designed would benefit a lot from more power?
That seems to me like designing a Ferrari engine and then putting a limiter on it so it doesn't burn too much fuel.
Well, I'm just gonna throw this out there. About 15 people will reply to me and say that their overvolted card has had no issues for the past 3 years while mining 24/7... OK, sure, that's possible (no sarcasm).
According to Nvidia Senior PR Manager Bryan Del Rizzo, overvolting is supported "up to a limit," in order to "protect the life of the product." Del Rizzo claims Nvidia won't stop graphics card makers who want to overvolt their products wildly or want to provide users that freedom via voltage controls. However, doing so disqualifies products from receiving warranty support from Nvidia. Add-in board makers are free to provide their own warranty coverage, of course.
MSI's GeForce GTX 680 Lightning Edition card reportedly offered users too much leeway to tweak voltages and had to be scaled back to comply. Del Rizzo notes that MSI chose warranty coverage over extreme overvolting support, just as EVGA appears to have done with its Classified card.
Now, someone will say, "but what about CPUs? Intel doesn't sell a 4.5GHz i7-6700K!" But then, Intel also hasn't locked overvolting the way NVIDIA (and, interestingly, AMD!) has. You can go right ahead and ram 2.0V through your i7-6700K; ain't nobody gonna care except maybe your wallet. If Intel wanted to, they could've locked voltage support down (like they later "forced" motherboard manufacturers to put out BIOS updates disabling non-K overclocking).
However, NVIDIA and almost all of their AIB partners decided to stick with the warranty instead of overvolting support. Why?
I'm not saying this is necessarily true (that GPUs are more prone to voltage degradation than, say, CPUs)--I'd actually love to have a discussion about this. Who has something to contribute? I mean, why would both NVIDIA and AMD lock down voltages, even on their high-end cards?
Edited by ikjadoon - 8/10/16 at 7:21pm