Overclock.net › Forums › Graphics Cards › NVIDIA › NVIDIA GTX 590 Owners Club

NVIDIA GTX 590 Owners Club - Page 419

post #4181 of 5154
Quote:
Originally Posted by Shinobi Jedi View Post

I'm in the same boat.
I'd still like to know if upping the voltages while keeping stock clocks is worth it in gaming, and/or improves performance. Because if not, I'll just leave things be.

I reflashed the cards, reinstalled the drivers and PrecisionX, and everything is good now.
post #4182 of 5154
Masked, if my 590 were on water... what voltage would you recommend? Something that could push some good MHz but not blow it up, preferably.
My System (14 items)
CPU: i5 6600k | Motherboard: Gigabyte Z170X-UD3 | Graphics: GTX 1080 8 GB | RAM: Corsair Vengeance
Hard Drive: Crucial M4 128GB SSD | OS: Windows 7 64-bit | Monitor: Crossover 27Q (2560x1440) | Keyboard: Corsair K70 LUX (Cherry Blue)
Power: Corsair TX850M | Case: Corsair SPEC-ALPHA | Mouse: Razer Naga
post #4183 of 5154
Quote:
Originally Posted by ProfeZZor X View Post

This weekend I hooked up my wiring after flushing out all of my water cooling hardware, and the last thing I connected was my 590. Needless to say, after opening the packaging I wasn't too happy with the length of the cables EVGA supplied, nor was I happy with the yellow wires sticking out of the two 8-pin connectors.
Does anyone know if there's a longer (matching-pin) alternative 8-pin cable or extended power cables for the 590 out there? I tried the extended 8-pin at Fry's, but the pins didn't match.

I know that Lutr0 could make you some if you wanted...

I might be able to but, I'd need the connectors...I do have the sleeving, though...Hrm.
Quote:
Originally Posted by Xraze View Post

Masked, if my 590 were on water... what voltage would you recommend? Something that could push some good MHz but not blow it up, preferably.

I'm still kind of hot/bothered over the fact that someone tried to butcher Ohm's law in regards to "water cooling"...Really?

Temperature creates no resistance or acceleration on a graphics card because it doesn't thermally reach a point, in either direction, where it would have //ANY// effect on the voltage transfer...

Obviously, Joule's first law applies in relation to the thermal conductivity producing heat but, again, this has zero effect on the load the VRMs can handle...It doesn't mean that, if watercooled, they can handle more...That's absurd.

I'd have a good debate with you about acceleration/resistance on N2 but, in no way, shape, or form is that card STOCK at 1.06 -- No...Kingpin had to use an EVBot to legitimately BOOST the voltage and bypass the VRMs altogether.

Since we've had this discussion at least 20x, I'll repeat what I can say...If there's something I shouldn't have said, PM me and I'll edit it.

When this card was released, within a day, before it even hit testers' labs, Nvidia discovered the VRM load was frying cards -- Actually, this was discovered in Feb but only thoroughly tested on sample release...I was among the first to get samples (thank you UPS) and immediately following, 3 emails showed up in my box...All 3 stated the maximum SAFE voltage...It's 1.05 ~ That's not 1.06v or 1.052 or 1.051, that's 1.050v.

Sweclockers had their cards at 1.2v ~ We've all seen the VRMs POP like a bag of popcorn ~ They just can't handle the load transfer at //ANYTHING// over 1.05v...

Like I said, I've blown up 3 of them...Sadly...And it was because the 1st core was overvolting (something I mentioned in a past post as well)...

While your primary core may be at 1.02v, there is a +/- 0.03 fluctuation, especially under load ~ Again, it is IMPOSSIBLE via Ohm's law for a +/- 10c change in temperature to affect the voltage transfer in this situation...So NO, watercooling does NOT prevent this from happening NOR does it raise your voltage roof.

When overclocking the card, that +/- 0.03 does change...Why? You're now demanding more load, which demands more power...Again, at 30c - 50c there is absolutely NO CHANGE to the amount of transfer the VRMs can handle.

If you're going to overclock this card, I would not SAFELY go over 0.95v -- Why?

It allows for max transfer and VRM load and, overall, ensures you're not going to blow your card up.

I've blown 1 card at 1.00v because the load at 800mhz shot up over capacity and "snap crackle pop"...The other blew at 0.99v ~ and they actually blew within a week of each other. ~ When I later tested the NEW cards under load, 1.00v became 1.05v on the 2nd core...The load was just unreal...

So realistically, as I said, your load on the hardware can fluctuate and I wouldn't go over 0.95 ~ Again, it's your card...Your investment...Ultimately your decision...But, don't be ignorant and think that water cooling "dissipates voltage load" because that's just about the dumbest thing I've ever heard on this forum.
Edited by Masked - 4/16/12 at 7:01am
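The headroom argument in the post above can be sketched numerically. The 1.050 V ceiling and the +/- 0.03 V under-load swing are the figures claimed in the post; the function names and the rounding are mine, and this is an illustration of the arithmetic, not a measurement tool.

```python
# Headroom sketch for the voltage discussion above (illustrative only).
# The 1.050 V ceiling and the +/- 0.03 V under-load swing come from the post.

CEILING_V = 1.050   # claimed maximum safe VRM voltage
SWING_V = 0.03      # claimed worst-case core-to-core fluctuation under load

def worst_case(set_voltage: float) -> float:
    """Highest voltage a core might actually see under load."""
    return round(set_voltage + SWING_V, 3)

def is_within_ceiling(set_voltage: float) -> bool:
    return worst_case(set_voltage) <= CEILING_V

print(is_within_ceiling(0.95))   # True  -- 0.98 V worst case, real margin left
print(is_within_ceiling(1.02))   # True  -- but sitting exactly at the 1.05 V edge
print(is_within_ceiling(1.03))   # False -- 1.06 V worst case

# Joule's first law at a fixed load (P = V^2 / R): heat scales with the square
# of voltage, so the 1.2 V the Sweclockers cards ran dissipates roughly
# (1.2 / 1.05)^2, about 1.31x, the heat of 1.05 V.
print(round((1.2 / CEILING_V) ** 2, 2))   # 1.31
```

This is why the recommended 0.95 V leaves margin for the fluctuation while a nominal 1.02 V does not, matching the "1.00v became 1.05v on the 2nd core" observation.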
post #4184 of 5154
Quote:
Originally Posted by Masked View Post

I just got off the phone w/my buddy at Nvidia and considering an entire driver-recreation would be necessary for this to happen and a custom SLI-Link...I'm calling 100% shens...Total BS.
I want pictures...You're a new member, I've tried this at Alienware and failed...Prove to me it's possible.
Not through DirectX because even a noob can jack their scores -- I want picture perfect proof that the SLI is enabled VIA Bios/Driver and proof of the SLI working in co-op.
Keep in mind, for this to work -- You'd need an entirely custom DX as well since tri-sli 590's isn't supported at the core of the programming.

This should show something: four screenshots and four runs total, with Vantage and 3DMark 11. 3-way and 4-way runs in both, but I had to clock down to 650/1300/1800 for the quad run in 3DMark 11 today; it ran the other day at 700, so I will have to investigate why not today. So the 3DMark 11 run is slightly less than 16k in quad. (Screenshots attached.)

3-way SLI was done by dedicating 2-a in the control panel to do physics

Vantage, both runs
vantage runs.png 340k .png file


3DMark 11, both runs (quad at 650/1300/1800, 3-way at 700/1400/1800)
3dmark11both runs.png 341k .png file


screenshot nvidia driver 3-way sli.png 504k .png file
screenshot quad sli.png 1536k .png file
Edited by mywifeispist - 4/16/12 at 7:45am
post #4185 of 5154
Quote:
Originally Posted by mywifeispist View Post

This should show something: two screenshots and four runs total, with Vantage and 3DMark 11. 3-way and 4-way runs in both, but I had to clock down to 650/1300/1800 for the quad run in 3DMark 11 today; it ran the other day at 700, so I will have to investigate why not today. So the 3DMark 11 run is slightly less than 16k in quad. (Screenshots attached.)
3-way SLI was done by dedicating 2-a in the control panel to do physics
Vantage, both runs
http://3dmark.com/compare/3dmv/4027517/3dmv/4027499
3DMark 11, both runs (quad at 650/1300/1800, 3-way at 700/1400/1800)
http://3dmark.com/compare/3dm11/3207711/3dm11/3207598
screenshot nvidia driver 3-way sli.png 504k .png file
screenshot quad sli.png 1536k .png file

That's NOT tri-SLI.

Tri-SLI is 3 cards in a row -- Thus, Tri-SLI.

On a dual-GPU card the SLI is understood.

So you have SLI 590's ~ Period.

Re-read what you've said before and now understand why everyone in here is/was confused...

And FYI, just because 1 core is dedicated to PhysX, it still operates under the assumption that SLI is enabled; it just also handles computing tasks...So you're always active with 4 cores, period ~ Regardless of what you enable.

The only way to CHANGE that would be at the BIOS level with a custom control panel...So, you have QUAD SLI with 1 core dedicated to PhysX. The Nvidia control panel works by card...Not by core, even though it may appear to do so.

I also really don't care about screenshots ~ I've edited so many things, doctoring a DX result is literally child's play.
post #4186 of 5154
Don't know what's so confusing here; not trying to argue at all, really. All I'm saying is all the benchmarks see 3 GPUs, and when I don't dedicate one for physics it enables all 4 for quad, and the scores are different..... sounds simple to me.

All I said in the beginning was I run in 3-way mode with the single 1080p plasma, and when I ran 3x 27-inch monitors I enabled all 4 for 5760x1080 res........ Using all 4 GPUs with one 1080p screen bottlenecks, so I simply use one for physics and it says 3-way right in the driver; the bottleneck is gone and I get super smooth gameplay with high frames. Don't know what is so confusing here.
post #4187 of 5154
Quote:
Originally Posted by Masked View Post

That's NOT tri-SLI.
Tri-SLI is 3 cards in a row -- Thus, Tri-SLI.
On a dual-GPU card the SLI is understood.
So you have SLI 590's ~ Period.
Re-read what you've said before and now understand why everyone in here is/was confused...
And FYI, just because 1 core is dedicated to PhysX, it still operates under the assumption that SLI is enabled; it just also handles computing tasks...So you're always active with 4 cores, period ~ Regardless of what you enable.
The only way to CHANGE that would be at the BIOS level with a custom control panel...So, you have QUAD SLI with 1 core dedicated to PhysX. The Nvidia control panel works by card...Not by core, even though it may appear to do so.
I also really don't care about screenshots ~ I've edited so many things, doctoring a DX result is literally child's play.

Why would I edit it...... it doesn't make sense to do so.
post #4188 of 5154
Quote:
Originally Posted by mywifeispist View Post

Don't know what's so confusing here; not trying to argue at all, really. All I'm saying is all the benchmarks see 3 GPUs, and when I don't dedicate one for physics it enables all 4 for quad, and the scores are different..... sounds simple to me.
All I said in the beginning was I run in 3-way mode with the single 1080p plasma, and when I ran 3x 27-inch monitors I enabled all 4 for 5760x1080 res........ Using all 4 GPUs with one 1080p screen bottlenecks, so I simply use one for physics and it says 3-way right in the driver; the bottleneck is gone and I get super smooth gameplay with high frames. Don't know what is so confusing here.
Quote:
Originally Posted by mywifeispist View Post

Why would I edit it...... it doesn't make sense to do so.

GPU stands for Graphics Processing Unit.

You don't have 4 GPUs, you have 2.

When you say you have tri-SLI GPUs, it means, in THIS WORLD, that you have 3 physical GTX 590's in tri-SLI, which you in fact do not.

You have 2 GPUs in SLI.

...I'm not about to go through a year of bugs/issues with you...It's not worth my time.

The simple fact is that if you have SLI enabled, all 4 cores are in SLI congruence, period...Even if 1 is dedicated to PhysX ~ Your scores are different because the CPU load is off and it enables graphical processing from the CPU core + Since the 590 is actually moderately optimized, it boosts your score...You're still in SLI on all 4 cores.

Again, it's confusing because you're using the wrong terminology...And you're coming to a thread where everyone has been using the correct terminology for over a year...Confusing? Absolutely.

At work, I had the 590's in SLI on 3x 40-inch screens and they performed wonderfully...At home, I eventually gave up with SLI 590's and went with 1x590 and 1x580 ~ Was more than fine.
post #4189 of 5154
Quote:
Originally Posted by Masked View Post

GPU stands for Graphics Processing Unit.
You don't have 4 GPUs, you have 2.
When you say you have tri-SLI GPUs, it means, in THIS WORLD, that you have 3 physical GTX 590's in tri-SLI, which you in fact do not.
You have 2 GPUs in SLI.
...I'm not about to go through a year of bugs/issues with you...It's not worth my time.
The simple fact is that if you have SLI enabled, all 4 cores are in SLI congruence, period...Even if 1 is dedicated to PhysX ~ Your scores are different because the CPU load is off and it enables graphical processing from the CPU core + Since the 590 is actually moderately optimized, it boosts your score...You're still in SLI on all 4 cores.
Again, it's confusing because you're using the wrong terminology...And you're coming to a thread where everyone has been using the correct terminology for over a year...Confusing? Absolutely.
At work, I had the 590's in SLI on 3x 40-inch screens and they performed wonderfully...At home, I eventually gave up with SLI 590's and went with 1x590 and 1x580 ~ Was more than fine.

Actually, I'm not..... just reading what the driver says, period........ I guess Nvidia is wrong with their terminology or wording then; I called it 3-way like they did, am I right? So how am I using wrong terminology?
I'm not upset by any means, and I'm not the one throwing insults here either; just responded and posted what I see, that's all.
screenshot nvidia driver 3-way sli.png 504k .png file

I think you think I'm arguing with you, and that's not the case here....... This is what Nvidia puts in their control panel, that's all....3-way SLI.
And yes, I understand what a GPU is, and I understand what you mean by 590's in SLI; yes, it is two physical cards, each card being dual-GPU.
Edited by mywifeispist - 4/16/12 at 9:33am
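The disagreement in the last few posts is mostly about what is being counted. A trivial sketch of the arithmetic, assuming the configuration described in the thread (two GTX 590s in SLI, one die given to PhysX in the control panel); the variable names are mine:

```python
# Counting sketch for the terminology dispute above.
# A GTX 590 is one physical card carrying two GPU dies.

CARDS = 2            # two GTX 590s in SLI, per the thread
DIES_PER_CARD = 2    # the 590 is a dual-GPU card

total_dies = CARDS * DIES_PER_CARD   # what the benchmarks and driver count
physx_dedicated = 1                  # one die assigned to PhysX in the control panel
rendering_dies = total_dies - physx_dedicated

print(total_dies)      # 4 -- "quad SLI" when all four render
print(rendering_dies)  # 3 -- why the driver reports "3-way SLI"
```

So "quad SLI" and the driver's "3-way SLI" count dies, while "SLI 590s" counts cards; both describe the same two-card setup.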
post #4190 of 5154
Quote:
Originally Posted by Masked View Post

GPU stands for Graphics Processing Unit.
You don't have 4 GPUs, you have 2.
When you say you have tri-SLI GPUs, it means, in THIS WORLD, that you have 3 physical GTX 590's in tri-SLI, which you in fact do not.
You have 2 GPUs in SLI.
...I'm not about to go through a year of bugs/issues with you...It's not worth my time.
The simple fact is that if you have SLI enabled, all 4 cores are in SLI congruence, period...Even if 1 is dedicated to PhysX ~ Your scores are different because the CPU load is off and it enables graphical processing from the CPU core + Since the 590 is actually moderately optimized, it boosts your score...You're still in SLI on all 4 cores.
Again, it's confusing because you're using the wrong terminology...And you're coming to a thread where everyone has been using the correct terminology for over a year...Confusing? Absolutely.
At work, I had the 590's in SLI on 3x 40-inch screens and they performed wonderfully...At home, I eventually gave up with SLI 590's and went with 1x590 and 1x580 ~ Was more than fine.

1 590 + 1 580? Do you mean the 580 for physics?