Overclock.net › Forums › Industry News › Hardware News › [VRZ] Nvidia's Green Light Program Overclocking Limitations - Origins and Implications

[VRZ] Nvidia's Green Light Program Overclocking Limitations - Origins and Implications - Page 18

post #171 of 182
Quote:
Originally Posted by NihilOC View Post



Quite simply, I believe them, because they have absolutely nothing to gain by lying.

But if you can find evidence contradicting the NVidia spokesperson, please, be my guest: show it. I will ask you, though, what the hell do NVidia have to gain by lying?


:lol: omg, my new signature!!! :lol:
post #172 of 182
I honestly think that Nvidia knew from the start that if they allowed unlocked voltage control it would hurt their next generation: the 7xx series would not be attractive due to small gains in performance. If I'm not going to push past 1.2 vcore, why would I spend more money on more expensive VRMs? Money-wise it makes perfect sense. I don't think anyone here can say that the cards are under-built for their intended use. Mine has been running 1.21 vcore @ 1300MHz, folding 24/7 since they came out, and shows no issues.
post #173 of 182
Quote:
Originally Posted by Brutuz View Post

I reuse old parts all of the time, so longevity is important to me and the entire second-hand parts market.
Long enough to show electromigration. If the voltages eVGA says are safe (1.3v) were actually unsafe, you'd see signs of it by now; they'd be trying to hide that there's a problem at all, hence the electromigration talk. Every enthusiast knows it's practically as set in stone as a law of physics: you can't beat it, whereas crappy VRMs can be improved.
Ahh, if true, that actually explains a lot of the fuss. I thought electromigration took a little longer to degrade chips (e.g. ~1-2 years).

Poor build quality on reference boards, or more accurately a reluctance to overbuild, still doesn't explain NV's behaviour though. My main frustration with this entire debacle is that it simply doesn't make sense: they've knowingly taken action that will hurt sales, which is anathema unless there is a genuine cause for concern. (OK, knowing most companies, a semi-genuine cause would probably be sufficient :D)
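The electromigration point can be made concrete with Black's equation, which models an interconnect's median time-to-failure from current density and temperature. A rough sketch in Python, with purely illustrative constants (none of these numbers are measured values for any real GPU):

```python
import math

def mttf_black(j, temp_k, a=1.0, n=2.0, ea_ev=0.9):
    """Black's equation: relative median time-to-failure of an interconnect.

    j       -- current density (arbitrary units; rises with voltage)
    temp_k  -- junction temperature in kelvin
    a       -- process-dependent constant (illustrative)
    n       -- current-density exponent, typically ~1-2
    ea_ev   -- activation energy in eV (illustrative)
    """
    k_b = 8.617e-5  # Boltzmann constant, eV/K
    return a * j ** (-n) * math.exp(ea_ev / (k_b * temp_k))

# Relative lifetime: a 20% bump in current density (roughly tracking an
# overvolt) at a 10 K hotter junction, versus stock conditions.
stock = mttf_black(j=1.0, temp_k=353)       # ~80 C junction
overvolted = mttf_black(j=1.2, temp_k=363)  # ~90 C junction
print(f"lifetime ratio: {overvolted / stock:.2f}")  # well below 1
```

Even with these gentle assumptions the modelled lifetime drops to roughly a third of stock, which would explain how a chip can survive a year or two of overvolting and still die well before the second-hand market is done with it.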

Quote:
Originally Posted by Brutuz View Post

They are on the OV capable boards, nVidia's original idea was that you had to have a custom PCB/parts and no software OVing... Then that changed randomly.
I'm honestly thinking that NV's position may have more to do with a dislike of the EVBot than of overvolting itself. I mean, they limit software volting, I presume, because you cannot tell when a genuine failure has occurred and when someone has software-overvolted the card. Plus I imagine it wouldn't be too difficult to change the UID in the BIOS and get the software to work on reference cards etc.

Maybe this is because they couldn't tell whether EVGA Classified cards were failing due to user negligence, and they were concerned it might lead to people taking advantage of the returns policy. It might have made them decide to revise Green Light, or enforce it a little more strictly (since we never knew the original terms of the Green Light agreement). And they certainly do seem to have a history of disliking plug-and-play OVing.

Either way, I still don't see it as an issue, mostly just a mildly annoying curiosity. As far as I'm concerned, you can still overvolt via the BIOS or by modding the card, and I'll probably go the BIOS route when it comes to it. There used to be entire communities set up for sharing modified GPU BIOS files; I presume they still exist? The fact that NVidia refuse to make it easy for us makes little difference in the end.

Quote:
Originally Posted by Brutuz View Post

nVidia technically does; the GTX 680 would originally have been a GTX 660. I doubt GK110 will be in the 780, but they could always have brought out GK110 for the extra performance they'd need to beat AMD's cards. And I have no idea, I'm just stating facts: nVidia's current position is right here in this thread, and their original position is somewhere in Anandtech's review of the 680 Classy.
They're within pissing distance of each other, plus nVidia's main market is Tesla, Tegra and Quadro these days.
But if GK100 was viable, why did they go with GK104? And why would they need to artificially hold back the GK104 if they are genuinely that far ahead? If they really had a working GK100 chip and simply decided not to use it, or had a viable GK110, then IMHO the product line-up would look like this:

GK104 > GK100/GK110 > Maxwell

There wouldn't be any need for voltage limitations on the GK104, and the fact that they haven't expressly forbidden it, instead only removing RMA cover for cards supporting it, suggests there is some kind of genuine fear of RMAs.
post #174 of 182
Cost savings are well and good, but on a $500+ card geared towards enthusiasts you shouldn't see cut corners. I'm perfectly fine with a budget card like the 650 getting 'just adequate' power delivery, but cutting a board designed for 5-phase power down to save 5 bucks, while still charging half a grand for it, is a distasteful money grab.

I wonder if Nvidia gimped the card because they knew the voltage would be locked, or locked the voltage because they knew they gimped the card. Chicken and egg.
post #175 of 182
Quote:
Originally Posted by Dyson Poindexter View Post

I wonder if Nvidia gimped the card because they knew the voltage would be locked, or locked the voltage because they knew they gimped the card. Chicken and egg.

Funny you should mention this. I was talking to my doctor about computers the other day. He said "I have the world's fastest graphics card, an Nvidia GTX 680 classified."

I was like, "did you manage to get a model that had overvolting on it? I heard it got discontinued"

He goes, "I didn't need to overclock it... It's not the bottleneck. There's no software in the world that it can't max out."

So what I'm trying to say is that even the people who may have overclocked for utilitarian reasons in the past aren't really pressed to do it anymore. I believe Nvidia became aware of this and, along with their brand loyalty, realized they could not only cut costs by reducing RMAs but also go ahead and cut some other corners by not allowing overvolting in the mainstream at all. I mean, someone said it: hehe, it's not gonna stop the hardcore people from getting around it.

On the other hand, I feel the methods used to introduce these changes couldn't have been worse, and I am ashamed at hearing some people say things like "you can't blame them, they're running a business after all." If it were simply a matter of warranty, it could have been remedied by just no longer offering warranties on the Vmod cards. Seeing things like abrupt design alterations and rumors of cutting supply looks like evidence of extortion, and it is a sad state of affairs when people see this as excusable in any business. Many people are desensitized to hearing about it, and it's too bad that even with alternatives available, the urge to do whatever takes the least amount of thought leads us towards blind loyalty to a product.

I'm done being depressed about my fellow consumers now.
post #176 of 182
My next GPU will not be Nvidia. Their actions in limiting consumer control over products and strong-arming smaller companies are reminiscent of Apple.
post #177 of 182
Quote:
Originally Posted by NihilOC View Post

Quote:
Originally Posted by Brutuz View Post

I reuse old parts all of the time, so longevity is important to me and the entire second-hand parts market.
Long enough to show electromigration. If the voltages eVGA says are safe (1.3v) were actually unsafe, you'd see signs of it by now; they'd be trying to hide that there's a problem at all, hence the electromigration talk. Every enthusiast knows it's practically as set in stone as a law of physics: you can't beat it, whereas crappy VRMs can be improved.
Ahh, if true, that actually explains a lot of the fuss. I thought electromigration took a little longer to degrade chips (e.g. ~1-2 years).

Poor build quality on reference boards, or more accurately a reluctance to overbuild, still doesn't explain NV's behaviour though. My main frustration with this entire debacle is that it simply doesn't make sense: they've knowingly taken action that will hurt sales, which is anathema unless there is a genuine cause for concern. (OK, knowing most companies, a semi-genuine cause would probably be sufficient :D)

We simply don't know enough to fully support either argument, to be honest; nVidia is keeping a lot secret. It's also better to acknowledge a small problem to hide a big one than to just hide it and let the community work it out eventually, I guess.
Quote:
Originally Posted by NihilOC View Post

Quote:
Originally Posted by Brutuz View Post

They are on the OV capable boards, nVidia's original idea was that you had to have a custom PCB/parts and no software OVing... Then that changed randomly.
I'm honestly thinking that NV's position may have more to do with a dislike of the EVBot than of overvolting itself. I mean, they limit software volting, I presume, because you cannot tell when a genuine failure has occurred and when someone has software-overvolted the card. Plus I imagine it wouldn't be too difficult to change the UID in the BIOS and get the software to work on reference cards etc.

Maybe this is because they couldn't tell whether EVGA Classified cards were failing due to user negligence, and they were concerned it might lead to people taking advantage of the returns policy. It might have made them decide to revise Green Light, or enforce it a little more strictly (since we never knew the original terms of the Green Light agreement). And they certainly do seem to have a history of disliking plug-and-play OVing.

Either way, I still don't see it as an issue, mostly just a mildly annoying curiosity. As far as I'm concerned, you can still overvolt via the BIOS or by modding the card, and I'll probably go the BIOS route when it comes to it. There used to be entire communities set up for sharing modified GPU BIOS files; I presume they still exist? The fact that NVidia refuse to make it easy for us makes little difference in the end.

See, they could have a fuse that blows above x amount of volts if they really wanted to allow OV and keep sales, which is why I don't get it. It wouldn't even need to be on the GPU itself; it could be a separate part of the card that all OEMs are required to implement, much like how phones tend to know when you've flashed custom firmware even if you've since flashed the stock firmware back.
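The fuse idea above amounts to a one-way tamper flag: the first request past a voltage ceiling latches a bit that no later reflash or return-to-stock clears, much like the custom-firmware flags on phones. A toy sketch; the class name and the 1212mV ceiling are hypothetical, not any vendor's actual design:

```python
class OvervoltFuse:
    """Toy model of a one-way overvolt tamper flag (hypothetical design)."""

    CEILING_MV = 1212  # hypothetical vendor-sanctioned voltage ceiling

    def __init__(self):
        self._blown = False  # would live in one-time-programmable storage

    def request_voltage(self, millivolts):
        """Grant any voltage request, but latch the fuse past the ceiling."""
        if millivolts > self.CEILING_MV:
            self._blown = True  # latches permanently; no reset path exists
        return millivolts

    @property
    def warranty_intact(self):
        return not self._blown

fuse = OvervoltFuse()
fuse.request_voltage(1175)    # within spec: fuse untouched
fuse.request_voltage(1300)    # overvolt: fuse latches
fuse.request_voltage(1150)    # returning to stock doesn't clear it
print(fuse.warranty_intact)   # -> False
```

The point of the one-way latch is exactly the RMA-triage use case: the card doesn't stop you overvolting, it just makes the fact undeniable afterwards.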
Quote:
Originally Posted by NihilOC View Post

Quote:
Originally Posted by Brutuz View Post

nVidia technically does; the GTX 680 would originally have been a GTX 660. I doubt GK110 will be in the 780, but they could always have brought out GK110 for the extra performance they'd need to beat AMD's cards. And I have no idea, I'm just stating facts: nVidia's current position is right here in this thread, and their original position is somewhere in Anandtech's review of the 680 Classy.
They're within pissing distance of each other, plus nVidia's main market is Tesla, Tegra and Quadro these days.
But if GK100 was viable, why did they go with GK104? And why would they need to artificially hold back the GK104 if they are genuinely that far ahead? If they really had a working GK100 chip and simply decided not to use it, or had a viable GK110, then IMHO the product line-up would look like this:

GK104 > GK100/GK110 > Maxwell

There wouldn't be any need for voltage limitations on the GK104, and the fact that they haven't expressly forbidden it, instead only removing RMA cover for cards supporting it, suggests there is some kind of genuine fear of RMAs.

GK100 wasn't viable at first, but GK110 is now (hence why there's a Tesla with one inside).
It also looks better when they have a merely adequate GPU for now and then release an awesome GPU later. Think of how the HD4870 was so good not only because it, well, just owned, but because ATI's previous offerings weren't so good.
post #178 of 182
Everyone who complains about this better boycott the 700 series if Nvidia continues with Green Light.

I don't want to see any of you goofballs ***** about it and then go balls deep into the 700 series :D

Personally, I think Nvidia's decision to lock voltages is bad for enthusiasts, but it doesn't affect me since I don't over-volt, so I might actually buy one :D
post #179 of 182
Quote:
Originally Posted by i7monkey View Post

Everyone who complains about this better boycott the 700 series if Nvidia continues with Green Light.

I don't want to see any of you goofballs ***** about it and then go balls deep into the 700 series :D

Personally, I think Nvidia's decision to lock voltages is bad for enthusiasts, but it doesn't affect me since I don't over-volt, so I might actually buy one :D

I'm just going to count OCed results as per usual; a GTX 680 would likely be faster than an HD 7970 GHz Edition if it could OV and hit 1.35GHz. I'm likely going to buy around the $250 mark, and whichever card is fastest in fully OCed scenarios is the one I'll get.
post #180 of 182
Quote:
Originally Posted by Brutuz View Post

We simply don't know enough to fully support either argument, to be honest; nVidia is keeping a lot secret. It's also better to acknowledge a small problem to hide a big one than to just hide it and let the community work it out eventually, I guess.

True, it is a niggling curiosity though. If they are keeping something secret, what is it, that they'd knowingly let all this internet flaming go on? You would have thought they'd just release RMA stats, or some of the lab results from their stress tests, and say "****, internet".

Quote:
Originally Posted by Brutuz View Post

See, they could have a fuse that blows above x amount of volts if they really wanted to allow OV and keep sales, which is why I don't get it. It wouldn't even need to be on the GPU itself; it could be a separate part of the card that all OEMs are required to implement, much like how phones tend to know when you've flashed custom firmware even if you've since flashed the stock firmware back.

I've always wondered about that. Maybe their concern is that such a move would generate false positives? If it ever turned out that genuine card faults, or cruddy VRMs, could cause the fuses to blow, then the internet ****storm would be immense.

Locking the firmware would probably be a bit of a douche move too, if they ever needed to release an emergency updated version like they have in the past. Unless they had a chip that stored a hash of the BIOS files flashed or something.
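That hash-storing chip is straightforward to sketch: keep digests of vendor-sanctioned BIOS images, let any image flash, but permanently log unknown digests for RMA triage. The image names and digest set below are invented purely for illustration:

```python
import hashlib

# Digests of vendor-released BIOS images (hypothetical placeholder values).
SANCTIONED_DIGESTS = {
    hashlib.sha256(b"stock-bios-v1.0").hexdigest(),
    hashlib.sha256(b"stock-bios-v1.1-emergency-fix").hexdigest(),
}

def flash_bios(image: bytes, flagged_log: list) -> None:
    """Flash any image, but record unsanctioned digests for later triage."""
    digest = hashlib.sha256(image).hexdigest()
    if digest not in SANCTIONED_DIGESTS:
        flagged_log.append(digest)  # persists even if stock is reflashed later
    # ... actual flashing would happen here ...

log = []
flash_bios(b"stock-bios-v1.0", log)    # sanctioned: nothing logged
flash_bios(b"modded-1.35v-bios", log)  # custom image: digest recorded
flash_bios(b"stock-bios-v1.0", log)    # reflashing stock doesn't erase the log
print(len(log))  # -> 1
```

This avoids the emergency-update problem above: new official images just get added to the sanctioned set, while a flash-back-to-stock before an RMA still leaves the modded digest on record.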

Either way, if they could get this kind of thing working it would be pretty awesome. I love the idea of plug-and-play overvolting; being able to unlock it (even if it involves re-flashing initially) would probably make me buy NV until the end of time. Well, I'd buy them until AMD copy it anyway :D
Quote:
Originally Posted by Brutuz View Post

GK100 wasn't viable at first, but GK110 is now (hence why there's a Tesla with one inside).
It also looks better when they have a merely adequate GPU for now and then release an awesome GPU later. Think of how the HD4870 was so good not only because it, well, just owned, but because ATI's previous offerings weren't so good.

Tbh I don't think we're going to see an "awesome" GPU again until Maxwell, and I would question whether GK110 is viable as a consumer product, bearing in mind that they have to stay inside certain TDP limits and the sheer size of the die is ridiculous. I don't know what the yield is like, but I'm assuming it would be more cost-effective to put even very badly binned parts into low-end workstation cards.