Overclock.net › Forums › Industry News › Hardware News › [VRZ] Nvidia's Green Light Program Overclocking Limitations - Origins and Implications

[VRZ] Nvidia's Green Light Program Overclocking Limitations - Origins and Implications - Page 17

post #161 of 182
Quote:
Originally Posted by Imglidinhere View Post

If you honestly think the G92 rebrand is that bad, then you did it wrong. tongue.gif
I'm just pissed they released their next-gen core as the 8800gt which was faster than the 8800gts/gtx (which broke the gs->gt->gts->gtx scheme), which caused them to release a second, g92-based 8800gts. It was confusing and deceptive. Then the 9 series was released with the same core, people bought 9800gt cards to replace their 8800gt's and discovered it was the same card. So we had different cards with the same name, and the same card with different names. Etc. All because ATi released the HD 3870. It was a knee-jerk reaction to put a next-gen core out before the marketing department figured out they should replace the 8 with a 9.

We, as computer enthusiasts, live and breathe this stuff, and even we were getting confused/arguing about which card was actually which. Imagine how lost (and open to being abused) the average consumer was.
post #162 of 182
Quote:
Originally Posted by Imglidinhere View Post

They weren't cheap VRMs, there just weren't enough. The GTX 580 had six, but the GTX 570 only had four. It was merely one too few. tongue.gif If they'd made the GPU with five, the GTX 570 would have been fine. tongue.gif
I stand corrected redface.gif My point remains: I feel that Nvidia tried to save money in the wrong place with the 570 and 590, and it backfired. Now they've taken voltage adjustment out of Kepler cards for fear that their VRM design isn't robust enough for overvoltage. I don't believe for a second that the 28nm process suddenly made voltage adjustment unsafe; there haven't been reports of massive numbers of AMD 7xxx cards failing.
post #163 of 182
Quote:
Originally Posted by Seid Dark View Post

It's sad to see people on an overclocking site defending gimped card designs frown.gif Sure, subpar VRMs may be fine in low-end products, but it's just plain wrong when you pay $500 for an enthusiast card and it has cheap components. If Nvidia continues this I will switch to AMD and never look back.

Especially when they believe the BS reason nVidia gave despite it clearly being wrong.
Quote:
Originally Posted by hangallfanboyz View Post

Quote:
Originally Posted by Brutuz View Post

If you're defending this move, you're ignorant, stupid or a fanboy..There is no defense: it means cards won't last as long, that nVidia doesn't have as much tolerance for bad VRM batches (as the GTX 570/590 debacle proves), and it's good for no-one but nVidia. Even if you don't want to OC/OV you should be complaining..How would you feel if Corsair and Antec started using cheap, crappy components in their PSUs? It saves money, after all. Why is it "dangerous voltage"? AMD is on the exact same process and these "dangerous voltages" don't seem to harm them..Or the owners of 680s that do OV or are OVed; that's pure crap from nVidia to cover up the whole cheap board stuff. You might enjoy taking the shaft from companies, but they (most of them, anyway) already have enough money; I'd rather they spend a bit more to ensure that if I want to continue using their GPU for years to come, I can. Hell, I'm still using my 9800GTX+ as a spare card. As for implying I'm a fanboy..I'm probably one of the least biased people on this forum: I run Intel, AMD and nVidia regularly, my main rig's GPU was a GTX 470, my server has a 9800GTX+ in it (no IGP), my laptop has an HD545v and my girlfriend's PC an HD7850. I buy whatever is best for me at my price range and prefer to compare real-world applications rather than benchmarks that inflate non-existent differences.
Yes, AMD is sinking fast despite significantly reducing their debt. rolleyes.gif You clearly are one of the people who take things at face value. Sure, it's great for nVidia's financial books to use crappy VRMs, as 1) it means they pay less per card, 2) some users are forced to buy newer cards as their old ones are going to fail sooner, 3) it reduces the used market...If there are rumours of OVing going around, a few people at least will be iffy about buying used for the chance that the former owner OVed the card. But I don't give a crap about nVidia, AMD or Intel; why should I? They don't give a crap about me, nor do I pretend that they do. I want what is best for the consumer, and so should you, unless you're working for one of those companies in a position where your income would be directly affected by how much profit they're making per part.
Your post is just a bunch of hot air, time will tell who is right. Also, I couldn't care less what you have in your sig rig because it's not about that.

And yours isn't? You have no proof beyond nVidia's PR whereas at least mine can be proven by looking at part numbers.
Now, Kepler uses 228w total power consumption, assuming TechPowerUp's card-only results are correct. As you can quite clearly see here, it has a 4 phase VRM (count the R22 parts, those big grey things wink.gif), now tell me..how many good VRM systems for something that uses 200w+ of power only use 4 phases? I'm not a VRM circuitry professional, but the only similar-looking things are the eR33 parts near the top, which are probably for the memory and auxiliary equipment..which means at best it's a 4+2 phase design. If I could read the actual part numbers, I'd definitely look them up and see exactly how much current/wattage each phase is capable of, to see if it's adequate. You can also see that nVidia was definitely testing a 5th phase, given the empty spot in the middle.
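As a sanity check on the numbers above, here's a back-of-the-envelope per-phase load estimate. The 228w board power is the TechPowerUp figure quoted in the post; the 80% core-rail split and the 1.175v core voltage are my assumptions, not measured values:

```python
# Rough per-phase load estimate for a 4-phase core VRM.
board_power_w = 228.0   # TechPowerUp card-only figure, per the post
core_fraction = 0.80    # ASSUMED share of board power on the core rail
core_voltage = 1.175    # ASSUMED core voltage, not measured
phases = 4

core_current_a = board_power_w * core_fraction / core_voltage
per_phase_a = core_current_a / phases
print(f"~{per_phase_a:.0f} A per phase")  # ~39 A per phase
```

Discrete power stages of that era were commonly rated somewhere in the 30-40 A continuous range, so even on these assumed numbers a 4-phase design sits near the top of the comfortable window, which is the poster's point.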
Quote:
Originally Posted by Nowyn View Post

Quote:
Originally Posted by Brutuz View Post

If you're defending this move, you're ignorant, stupid or a fanboy..There is no defense, it means cards won't last as long, that nVidia doesn't have as much tolerance for bad VRM batches (As the GTX 570/590 debacle proves) and it's good for no-one but nVidia, even if you don't want to OC/OV you should be complaining..[...]

I haven't heard about any VRM-related issues with 6xx series cards, so by definition they are sufficiently good at worst, since there are no problems with them.

Just because the 570 had issues doesn't mean that, 2 years later, new models are susceptible to the same issues. I remember AMD cards having crap VRM cooling, with VRMs hitting up to 100 degrees under load and also dying; that doesn't mean the 7970 has crappy VRM cooling, does it?

And the voltage lock affected only high-end models with custom power circuitry that is definitely enough to handle extra juice. Also, AIB partners can still sell cards with voltage controls, but then they have to cover warranty themselves; their margins are likely not that high, so they refuse.

I tend to believe that GK104 is pushed to its limits as-is and extra voltage might cause silicon degradation. It's a known fact that GK104 doesn't clock that much better with added voltage, and overclocking past a certain clock yields lower gains per MHz. NVIDIA surely knows the innards of their chips better and what they can handle. It's a mid-range chip promoted to high end due to AMD's lackluster performance increases over the 580.

I guess we'll see whether NVIDIA really tried to save on reference designs and RMAs, or whether it's a chip issue, when the 7 series hits.

Erm, weren't Lightnings and the like with voltage control hitting 1.3GHz/1.4GHz stable on air? That's significantly higher than the 1.2GHz average for most 680s. Also, they're using the exact same process as the HD7k series, which can happily run 1.25v 24/7, yet nVidia specifies 1.175v max..Why is it so low?

Yes, having bad VRM circuitry on the 570/590 doesn't mean it's definitely going to happen on the 6x0 series...But the fact that it was clearly designed to cheapen costs shows nVidia is (obviously) concerned about board costs, as do many parts of the 6x0 series. (e.g. the 670 board size, using GK104 instead of GK100 as the flagship GPU, etc.)
Quote:
Originally Posted by Imglidinhere View Post

Quote:
Originally Posted by Dyson Poindexter View Post

Ever since the g92 rebranding, I can't stand nVidia's market practices.

If you honestly think the G92 rebrand is that bad, then you did it wrong. tongue.gif
Quote:
Originally Posted by Seid Dark View Post

Nvidia has had cheap VRM's before, both 570 and 590 were notorious for it. First Fermi generation (470 and 480) had good PCB design and no problems with overvolting. Here's reference GTX 670 PCB, does that look like a high-end card?

They weren't cheap VRMs, there just weren't enough. The GTX 580 had six, but the GTX 570 only had four. It was merely one too few. tongue.gif If they'd made the GPU with five, the GTX 570 would have been fine. tongue.gif

Exactly, nVidia cut part costs to a level which should be enough but couldn't handle certain problems such as bad batches of VRMs which is what actually bit the 570 and 590 in their rears, this is an exact
post #164 of 182
*applause*
post #165 of 182
Quote:
Originally Posted by Brutuz View Post

it has a 4 phase VRM (Count the R22 parts, those big grey things wink.gif), now tell me..how many good VRM systems for something that uses 200w+ of power only use 4 phases?

If they had the solder stencil and everything set up for 5 phases, it's apparent that the decision to move to 4 phases came very late in the design cycle. Do we have any beta hardware photos? The card was obviously designed to run with 5 phases, but it was likely gimped as a cost-saving measure.
post #166 of 182
Quote:
Originally Posted by Brutuz View Post

Especially when they believe the BS reason nVidia gave despite it clearly being wrong.
And yours isn't? You have no proof beyond nVidia's PR whereas at least mine can be proven by looking at part numbers.
Now, Kepler uses 228w total power consumption assuming Techpowerups card only results are correct, as you can quite clearly see here, it has a 4 phase VRM (Count the R22 parts, those big grey things wink.gif), now tell me..how many good VRM systems for something that uses 200w+ of power only use 4 phases? I'm not a VRM circuitry professional but the only similar looking things are the eR33 parts near the top which are probably for the memory and auxiliary equipment..which means at best it's 4+2 phase. If I could read the actual part numbers, I'd definitely look them up and see exactly how much current/wattage each phase is capable of to see if its above adequate, you can also see that nVidia was definitely testing a 5th phase with the empty spot in the middle.
Erm, weren't Lightnings and the like with Voltage control hitting 1.3Ghz/1.4Ghz stable with air? That's significantly higher than the 1.2Ghz average for most 680s. Also, they're using the exact same process as the HD7k series which can happily have 1.25v 24/7, nVidia specifies 1.175v max..Why is it so low?
Yes, having a bad VRM circuitry on the 570/590 doesn't mean it's going to definitely happen on the 6x0 series...But considering that it was clearly designed to cheapen costs shows nVidia is (Obviously) concerned about board costs, as do many parts of the 6x0 series. (eg. 670 board size, using GK104 instead of GK100 as the flagship GPU, etc)
Exactly, nVidia cut part costs to a level which should be enough but couldn't handle certain problems such as bad batches of VRMs which is what actually bit the 570 and 590 in their rears, this is an exact

wow.. you really honor your name thumb.gif
post #167 of 182
Quote:
Originally Posted by Brutuz View Post

Let's all buy cheap PSUs and cheap motherboards, they just work, right? Sure, they won't last anywhere near as long, but they will work. rolleyes.gif
Do you honestly buy expensive motherboards or PSUs because of longevity? Decent VRMs on a motherboard will help with overclocking, because Intel CPUs don't appear to be quite so susceptible to electromigration. Not to mention all the features of a decent mobo, like extra PCIe lanes or SATA controllers.

Quote:
Originally Posted by Brutuz View Post

Anyone want to read up the specs on the parts used in nVidia's VRMs? Maybe we can get a true, completely unbiased and fully proven idea of quality then, because from what I've seen it's more due to cheap VRMs, unless you're counting nVidia's word...which is pretty unbiased, right? If it's electromigration, why aren't owners of 680s that support OVing reporting heaps of deaths or degradation, as is typical of electromigration?
Because electromigration occurs over time, and because you don't have access to NV's RMA statistics.

If there was not a high RMA rate (or a prediction of a high number of future RMAs), NV would not have taken action. It's a business, and taking action that will knowingly hurt sales for zero net gain is simply illogical.

Quite simply I believe them, because they have absolutely nothing to gain by lying.
Quote:
Originally Posted by Brutuz View Post

Why is AMD (Who are using the exact same process) not reporting electromigration? The VRM design (as far as I can actually tell) is done as cheap as possible...
Reference 680: 1.215v
Reference 7970: 1.175v

Might be the reason.

But if you can find evidence contradicting the NVidia spokesperson, please, be my guest. Show it. I will ask you, though: what the hell do NVidia have to gain by lying? If it was simply cheap VRMs, they could sell a superclocked edition with decent VRMs. Or they could just say "yeah, it's cheap VRMs"; 99% of consumers wouldn't give a crap anyway because they don't know the difference between that and electromigration.
Quote:
Originally Posted by Brutuz View Post

the GTX 570 and GTX 590 having a few cards blowing up at stock when you had a bad batch of VRMs delivering less current than rated proves that nVidia really is making VRMs that can power the GPU at rated speeds but not much higher; it is gimped, assuming that is true.
What do the GTX 570 or 590 have to do with the 680? All I said is that the reference 680 isn't a flawed or "gimped" design; it's a perfectly good design that's already been pushed to its limits.

And don't get me wrong, I'm not defending the card, it's about on par with AMD's current generation. Maybe a little worse. I just disagree with the stereotypical angry internet swarm, who tend to be so quick to assume random corporate conspiracy and injustice is the cause of all issues.
post #168 of 182
Quote:
Originally Posted by Dyson Poindexter View Post

Quote:
Originally Posted by Brutuz View Post

it has a 4 phase VRM (Count the R22 parts, those big grey things wink.gif), now tell me..how many good VRM systems for something that uses 200w+ of power only use 4 phases?

If they had the solder stencil and everything set up for 5 phases, it's apparent that the decision to move to 4 phases was very late in the design cycle. Do we have any beta hardware photos? The card was obviously designed to run with 5 phases, but it was gimped likely as a cost saving measure.

I want to know this too.
Quote:
Originally Posted by NihilOC View Post

Quote:
Originally Posted by Brutuz View Post

Let's all buy cheap PSUs and cheap motherboards, they just work, right? Sure, they won't last anywhere near as long, but they will work. rolleyes.gif
Do you honestly buy expensive motherboards or PSUs because of longevity? Decent VRMs on a motherboard will help with overclocking, because Intel CPUs don't appear to be quite so susceptible to electromigration. Not to mention all the features of a decent mobo, like extra PCIe lanes or SATA controllers.

Actually, yes, I'm going to spend extra on an 80 Plus Platinum PSU because I know that I'll still be able to use it in 8 years and it ends up way cheaper. Motherboards? Not so much, but it definitely factors into my motherboard choice. (Hence why when I reassemble my main rig, I'll be running a Gigabyte 990FXA-UD3 instead of the ASRock.)
Go on, buy a Shaw 860w PSU; it's cheaper and still has the same wattage, plus I know of someone who ran one quite well with an i7 920 and HD5870. Quality be damned, right? rolleyes.gif

Electromigration has zero to do with it; life-span does. If you get two 850w PSUs, one with ultra high-quality components and one with merely adequate components, the adequate one will fail much sooner. Ever heard of headroom? The GTX 680 uses ~230w of power; if nVidia has only designed a VRM rated up to 250w, then in 4-5 years that will be really pushing the VRMs just to power the GPU at stock speeds. I prefer to keep spare GPUs rather than buying cheap low-end ones all the time.
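To put a number on the headroom argument above (both wattages are the post's hypotheticals, not datasheet values):

```python
# Margin between an assumed card draw and an assumed VRM design limit.
draw_w = 230.0    # approximate GTX 680 board power, per the post
rated_w = 250.0   # HYPOTHETICAL VRM design limit, per the post

headroom = (rated_w - draw_w) / rated_w
print(f"headroom: {headroom:.1%}")  # headroom: 8.0%
```

A common rule of thumb is to leave 20% or more margin so that aging components still carry the load comfortably; 8% leaves little room for component drift over 4-5 years, which is the failure mode the poster is describing.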
Quote:
Originally Posted by NihilOC View Post

Quote:
Originally Posted by Brutuz View Post

Anyone want to read up the specs on the parts used in nVidia's VRMs? Maybe we can get a true, completely unbiased and fully proven idea of quality then, because from what I've seen it's more due to cheap VRMs, unless you're counting nVidia's word...which is pretty unbiased, right? If it's electromigration, why aren't owners of 680s that support OVing reporting heaps of deaths or degradation, as is typical of electromigration?
Because electromigration occurs over time, and because you don't have access to NV's RMA statistics.

If there was not a high RMA rate (or a prediction of a high number of future RMAs), NV would not have taken action. It's a business, and taking action that will knowingly hurt sales for zero net gain is simply illogical.

Quite simply I believe them, because they have absolutely nothing to gain by lying.

Yes, it does, and you get degradation at the same time; the chip doesn't just die one day. If you could OV a 680 and it was affected by electromigration, then the stable 1375MHz OC might drop to 1360MHz, then down to 1325MHz...Why aren't I seeing reports of that on 680s that are OVed?
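For what it's worth, the gradual degradation both sides describe is exactly what the standard electromigration model predicts. Black's equation gives mean time to failure as MTTF = A * J^(-n) * exp(Ea/kT); a sketch with purely illustrative constants (nothing here is NVIDIA or Kepler data):

```python
import math

def mttf_ratio(j_rel, t1_k, t2_k, n=2.0, ea_ev=0.9):
    """Relative MTTF after scaling current density by j_rel and raising
    die temperature from t1_k to t2_k, per Black's equation.
    n and ea_ev are illustrative placeholders, not measured values."""
    k = 8.617e-5  # Boltzmann constant in eV/K
    return j_rel ** (-n) * math.exp((ea_ev / k) * (1.0 / t2_k - 1.0 / t1_k))

# Example: 10% more current density, die 10 C hotter (70 C -> 80 C):
print(f"{mttf_ratio(1.10, 343.0, 353.0):.2f}x baseline lifetime")  # 0.35x
```

So an overvolted part doesn't die overnight; its expected lifetime shrinks, which is consistent with clocks sagging over months rather than cards failing all at once.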
Erm, yes, they do: if nVidia came out and said "yeah, we're gimping board designs to save a few bucks" to their main market, who have a decent chance of understanding that this negatively affects OCing, it would do nothing but hurt them...nVidia has also lied plenty of times in the past; they certainly do have something to gain from this.
Quote:
Originally Posted by NihilOC View Post

Quote:
Originally Posted by Brutuz View Post

Why is AMD (Who are using the exact same process) not reporting electromigration? The VRM design (as far as I can actually tell) is done as cheap as possible...
Reference 680: 1.215v
Reference 7970: 1.175v

Might be the reason.

But if you can find evidence contradicting the NVidia spokesperson, please, be my guest. Show it. I will ask you, though: what the hell do NVidia have to gain by lying? If it was simply cheap VRMs, they could sell a superclocked edition with decent VRMs. Or they could just say "yeah, it's cheap VRMs"; 99% of consumers wouldn't give a crap anyway because they don't know the difference between that and electromigration.

Performance gains for the next generation: if Kepler could hit 1.3-1.4GHz, it'd no doubt be able to match or beat the newer products..A lot of people saw this with the GTX 460 vs the 560.
Also, nVidia's stance at first was that it was fine to make new designs that OV, but not to allow software OVing of reference designs, hence why the Lightning and Classified supported it in the first place..Why did they change this randomly? There are definitely electromigration issues (albeit way overblown by a lot of people, much like how everyone thought AMD 45nm CPUs were only safe up to 1.425v for the longest time, until so many people ran 1.55v 24/7 that it became obvious they could take it), but it's also an artificial limitation to help nVidia make the GTX 780 look more compelling.
Quote:
Originally Posted by NihilOC View Post

Quote:
Originally Posted by Brutuz View Post

the GTX 570 and GTX 590 having a few cards blowing up at stock when you had a bad batch of VRMs delivering less current than rated proves that nVidia really is making VRMs that can power the GPU at rated speeds but not much higher; it is gimped, assuming that is true.
What do the GTX 570 or 590 have to do with the 680? All I said is that the reference 680 isn't a flawed or "gimped" design; it's a perfectly good design that's already been pushed to its limits.

And don't get me wrong, I'm not defending the card, it's about on par with AMD's current generation. Maybe a little worse. I just disagree with the stereotypical angry internet swarm, who tend to be so quick to assume random corporate conspiracy and injustice is the cause of all issues.

Nothing, but they are an example of what happens when you skimp on VRM designs too much.
I usually tend to disagree with them; however, nVidia is nothing but the teenage girl of computer companies. (Remember Woodscrewgate? PhysX in 3DMark Vantage? GeForce FX failing so hard they started doing driver optimisations that reduced game quality so they could claim higher performance? Not that AMD/ATI is innocent of that.)
post #169 of 182
Quote:
Originally Posted by Brutuz View Post

I want to know this too.
Actually, yes, I'm going to spend extra on an 80 Plus Platinum PSU because I know that I'll still be able to use it in 8 years and it ends up way cheaper. Motherboards? Not so much, but it definitely factors into my motherboard choice. (Hence why when I reassemble my main rig, I'll be running a Gigabyte 990FXA-UD3 instead of the ASRock.)
Hehe, I guess some people think further ahead than me then biggrin.gif I actually got a little annoyed my last rig lived as long as it did, but then I generally only have one or two builds running at the same time tongue.gif

Quote:
Originally Posted by Brutuz View Post

Go on, buy a Shaw 860w PSU, it's cheaper and still has the same wattage, plus I know of someone who ran one quite well with a i7 920 and HD5870, quality be damned, right? rolleyes.gif
Electromigration has zero to do with it, life-span does, if you get two 850w PSUs, one with ultra high-quality components and one with merely adequate components the adequate one will fail much sooner, ever heard of redundancy? The GTX 680 uses ~230w of power, if nVidia has only designed a VRM rated up to 250w then in 4-5 years that will be really pushing the VRMs to power the GPU at stock speeds, I prefer to keep spare GPUs rather than buying cheap low-end ones all the time.
Yes, it does, and you get degradation at the same time, the chip doesn't just die one day, if you could OV a 680 and it was affected by electromigration, then the stable 1375Mhz OC might go to 1360Mhz, then down to 1325Mhz...Why aren't I seeing reports of that on 680s that are OVed?
Overvolted 680s have only been around, what, 8 months? We also haven't seen many reports of VRMs failing on them, my point is only that it looks like NV is expecting this to become an issue in the future.

It does not make sense for NVidia to claim there is an issue with their product, if there isn't an issue with their product. Whether they say it's electromigration, or blame it on manufacturers pushing crap VRMs too far, makes little difference. The backlash is always going to be the same.

(Also, on the overvolted boards, aren't the VRMs custom chips added by the manufacturers? IIRC the classified has a ridiculously high number of phases, yet NV still pulled RMA support for it.)

Quote:
Originally Posted by Brutuz View Post

Performance gains for the next generation, if Kepler could hit 1.3-1.4Ghz then it'd be no doubt very able to match or beat the newer products..A lot of people saw this with the GTX 460 vs the 560.
Also, nVidia's stance at first was that it's fine to make new designs that OV but not to allow software OVing of reference designs, hence why the Lightning and Classified supported it in the first place..Why did they change this randomly? There's definitely electromigration issues (albeit way overblown by a lot of people, much like how everyone thought that AMD 45nm CPUs were only safe up to 1.425v for the longest time until so many people ran 1.55v 24/7 that it became obvious they could) but it's also an artificial limitation to help nVidia make the GTX 780 look more compelling.
The only issue I have with this argument is that we see it every generation, every single generation people claim they're artificially holding back to prevent the cards outperforming the next gen. Again, from a sheer business perspective it doesn't make sense unless you have a massive lead on the competition. And NVidia doesn't.

That's kind of my point, why would they change it randomly? It's obviously going to damage their sales, and it was never going to go down well PR-wise, they have no motivation to disallow it unless they believe there is an issue. Again, artificially limiting the cards does not make sense unless they are miles ahead of AMD, and they aren't.

Quote:
Originally Posted by Brutuz View Post

Nothing, but they are an example of what happens when you gimp on VRM designs too much.
I usually tend to disagree with them, however nVidia is nothing but the teenage girl of computer companies. (Remember Woodscrewgate? PhysX in 3DMark Vantage? GeForce FX failing so hard they started to do driver optimisations that reduced game quality so they could have higher performance? Not that AMD/ATI is innocent of that.)

Hehe, the FX range. That takes me back, I seem to remember flashing a 5900 into a 5950u, and it still sucked in dx9 games biggrin.gif
post #170 of 182
Quote:
Originally Posted by NihilOC View Post

Quote:
Originally Posted by Brutuz View Post

I want to know this too.
Actually, yes, I'm going to spend extra on an 80 Plus Platinum PSU because I know that I'll still be able to use it in 8 years and it ends up way cheaper. Motherboards? Not so much, but it definitely factors into my motherboard choice. (Hence why when I reassemble my main rig, I'll be running a Gigabyte 990FXA-UD3 instead of the ASRock.)
Hehe, I guess some people think further ahead than me then biggrin.gif I actually got a little annoyed my last rig lived as long as it did, but then I generally only have one or two builds running at the same time tongue.gif

I reuse old parts all of the time, so longevity is important to me and the entire second-hand parts market.
Quote:
Originally Posted by NihilOC View Post

Quote:
Originally Posted by Brutuz View Post

Go on, buy a Shaw 860w PSU, it's cheaper and still has the same wattage, plus I know of someone who ran one quite well with a i7 920 and HD5870, quality be damned, right? rolleyes.gif
Electromigration has zero to do with it, life-span does, if you get two 850w PSUs, one with ultra high-quality components and one with merely adequate components the adequate one will fail much sooner, ever heard of redundancy? The GTX 680 uses ~230w of power, if nVidia has only designed a VRM rated up to 250w then in 4-5 years that will be really pushing the VRMs to power the GPU at stock speeds, I prefer to keep spare GPUs rather than buying cheap low-end ones all the time.
Yes, it does, and you get degradation at the same time, the chip doesn't just die one day, if you could OV a 680 and it was affected by electromigration, then the stable 1375Mhz OC might go to 1360Mhz, then down to 1325Mhz...Why aren't I seeing reports of that on 680s that are OVed?
Overvolted 680s have only been around, what, 8 months? We also haven't seen many reports of VRMs failing on them, my point is only that it looks like NV is expecting this to become an issue in the future.

It does not make sense for NVidia to claim there is an issue with their product, if there isn't an issue with their product. Whether they say it's electromigration, or blame it on manufacturers pushing crap VRMs too far, makes little difference. The backlash is always going to be the same.

(Also, on the overvolted boards, aren't the VRMs custom chips added by the manufacturers? IIRC the classified has a ridiculously high number of phases, yet NV still pulled RMA support for it.)

Long enough to show electromigration; if the voltages EVGA says are safe (1.3v) were actually unsafe, you'd see signs of it by now. They're trying to hide that there's a board issue at all, hence the electromigration excuse: every enthusiast knows it's practically as set in stone as a law of physics..you can't beat it, whereas crappy VRMs can be improved.

They are on the OV-capable boards; nVidia's original idea was that you had to have a custom PCB/parts, with no software OVing of reference designs..Then that changed randomly.
Quote:
Originally Posted by NihilOC View Post

Quote:
Originally Posted by Brutuz View Post

Performance gains for the next generation, if Kepler could hit 1.3-1.4Ghz then it'd be no doubt very able to match or beat the newer products..A lot of people saw this with the GTX 460 vs the 560.
Also, nVidia's stance at first was that it's fine to make new designs that OV but not to allow software OVing of reference designs, hence why the Lightning and Classified supported it in the first place..Why did they change this randomly? There's definitely electromigration issues (albeit way overblown by a lot of people, much like how everyone thought that AMD 45nm CPUs were only safe up to 1.425v for the longest time until so many people ran 1.55v 24/7 that it became obvious they could) but it's also an artificial limitation to help nVidia make the GTX 780 look more compelling.
The only issue I have with this argument is that we see it every generation, every single generation people claim they're artificially holding back to prevent the cards outperforming the next gen. Again, from a sheer business perspective it doesn't make sense unless you have a massive lead on the competition. And NVidia doesn't.

That's kind of my point, why would they change it randomly? It's obviously going to damage their sales, and it was never going to go down well PR-wise, they have no motivation to disallow it unless they believe there is an issue. Again, artificially limiting the cards does not make sense unless they are miles ahead of AMD, and they aren't.

nVidia technically does; the GTX 680 would originally have been a GTX 660. I doubt GK110 will be in the 780, but they could always have brought out GK110 for the extra performance they'd need to beat AMD's cards. And I have no idea why, I'm just stating facts: nVidia's current position is right here in this thread, and their original position is somewhere in Anandtech's review of the 680 Classy.

They're within pissing distance of each other; plus, nVidia's main market is Tesla, Tegra and Quadro these days.