
GIGABYTE GTX 9xx H2O/AIR BIOS Tweaking ┌(ô益ô)┐ - Page 355

post #3541 of 7277
Quote:
Originally Posted by Mr-Dark

Thanks +rep.

I see it meets what I need, so a pair of those is on the way thumb.gif

Any info about the memory on those? I read that all 980 Ti Xtremes have Samsung memory, same for the non-Ti?

By the way, I'm changing my PSU. The current EVGA SuperNOVA 850 B2 won't be enough for a pair of 980s @ 1.5GHz 1.275V, a 5820K @ 4.6GHz 1.32V, 8 sticks of memory, dual Samsung 950 Pros, 4 SSDs and around 10 Corsair fans biggrin.gif

I'm looking at the EVGA SuperNOVA 1300W G2 Gold PSU, will that be enough or is 1600W better?

I'm using a Corsair 850W on my second lot of SLI 980 Tis and it's held up fine. I never see it pulling more than 750W from the wall, even with an OC'd CPU and the cards maxed out on GPU and RAM with a game running full whack.

In a game with 80% GPU usage on average and 4-5GB of GPU RAM in use, I usually see around 550 watts being pulled.
The Beast Master (16 items)
CPU: 5930K | Motherboard: Gigabyte X99 Gaming P SLI | Graphics: Gigabyte GTX 980 Ti Xtreme Gaming | RAM: 4x4GB Kingston Fury
Hard Drive: 250GB Samsung EVO | Optical Drive: LG SATA | Cooling: Corsair H110i GTX | OS: Win10 64-bit
Monitor: Sony 4K 8505 49" | Power: Corsair TX850 V2 | Case: HAF X | Audio: Creative X-Fi Titanium, DTS and DDL, z6a 5.1 hea...
post #3542 of 7277
Quote:
Originally Posted by Laithan

I'm not going to recommend that you do, especially with that ASIC% (which isn't a bad one). I posted those 1.312V BIOSes because people kept asking for them (other threads were using 1.312V), and they MAY come in handy for those with a really low ASIC% to squeeze a little more O/C headroom. I've said from the very beginning that I saw NO DIFFERENCE between 1.281V and 1.312V on GM204. Temps basically just went up and, for me, I felt I had less stability. I see other threads have started backing down to 1.281V as well, so I really cannot recommend it. I leave it there for those that dig for it. If you wish to travel to the dark side, be warned! skull.gif


Something, something dark side...
^ Mandatory homework for this thread... Ahemm
OK, you're on the right track now. Raise the GPU CORE +10MHz at a time until you TDR (crash), then back it down at least 10-20MHz for stability, YMMV. Let us know how it works tongue.gif
Good luck!
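For example (illustrative numbers only): if 1490MHz is stable in your test game but 1500MHz throws a TDR, back down to around 1470-1480MHz and run a longer session before calling it stable. Different games and benches crash at different clocks, so validate with more than one.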
Replacing the factory thermal compound and heat pads is IMO always a good idea. Just make sure you do it RIGHT. Do NOT use compound that is electrically conductive (like Arctic Silver 5 is), and make sure you get the right thickness for the thermal pads, otherwise they can prevent the cooler from sitting properly across the entire GPU.

With (2) GPUs in SLI, your case will most likely need to be left open at all times. Clean up wires, dust, etc.; you just doubled your heat management issues with a 2nd GPU tongue.gif

When you have (2) GPUs in SLI, even if they are using the "SLOT 1 & 3" combination where they are further apart, the TOP GPU is ALWAYS going to be considerably hotter under load. Heat from the backplate of the bottom GPU will be sucked right up into the TOP GPU. I have seen a 5C-10C difference at times, although it was usually closer to 5C. This is perfectly "normal" but definitely not desired. Keep in mind that these Windforce cooler designs eject all the HOT air right back into the case and then INTAKE that same hot air. It is up to us to remove the hot air and bring in cool air, and we need to be "forceful" about removing the heat at a high enough rate (higher than it is being created).

When you MOD a GPU you should expect to have to cool it a little better. A custom fan curve is pretty much MANDATORY, and SUPPLEMENTAL cooling is also HIGHLY recommended. If I may re-post ChemicalWorld's PICS, they show an excellent job by someone who is tackling the GPU heat issue properly and beautifully. When you are dealing with GM200, HEAT will be your #1 challenge, which is why water blocks are so desired for them.

Note the 80mm fan on TOP of the GPU backplate, with tie-wraps as little "feet" to allow air to escape? Makes a HUGE difference. His temps between the GPUs should be close to identical now under load.



As far as your voltages with the STOCK BIOS go, this is completely normal and, as you pointed out, has to do with your ASIC% value. When using a MOD BIOS, voltages are still automatically determined from your ASIC% in the lower CLKs, but as you get up into the higher CLKs the voltage range becomes more and more narrow, until CLK74 where the voltage is specified and no longer a variable. Voltage fluctuation under full load causes throttling. Not all GPUs are created equal; basically you are going to want to run BOTH GPUs at the same exact BOOST speeds under load, and I wouldn't worry about customizing the voltage for each GPU unless there is a DRASTIC difference in ASIC% between them. On AIR you will likely run into one of the GPUs' maximum overclock and/or a heat restriction before you run into a voltage restriction.
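For illustration only (hypothetical values): in the MOD BIOS voltage table a lower boost entry might still span something like 1.212V-1.243V and float with ASIC%, while CLK74 has its minimum and maximum set to the same value (say 1.281V), so the top boost bin always runs at that one fixed voltage.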

Thanks for the info, that's a nice system you have there.

Luckily the hotter card runs almost the same as the cooler card in SLI, so it's not actually too bad.

I've got some great airflow and some DEMCiflex filters rocking my case, so it's all good bro.

I've had quite a lot of SLI cards over the years; it's been great fun trying to get the temps down.
post #3543 of 7277
Thread Starter 
Quote:
Originally Posted by OcSlave

I'm using a Corsair 850W on my second lot of SLI 980 Tis and it's held up fine. I never see it pulling more than 750W from the wall, even with an OC'd CPU and the cards maxed out on GPU and RAM with a game running full whack.

In a game with 80% GPU usage on average and 4-5GB of GPU RAM in use, I usually see around 550 watts being pulled.

Try running the FFXIV free benchmark @ 4K MAXIMUM preset and see what you get wink.gif

My system is a power hog, so this is SLI in a worst-case scenario, but it is possible.




Quote:
Originally Posted by OcSlave

Thanks for the info, that's a nice system you have there.

Luckily the hotter card runs almost the same as the cooler card in SLI, so it's not actually too bad.

I've got some great airflow and some DEMCiflex filters rocking my case, so it's all good bro.

I've had quite a lot of SLI cards over the years; it's been great fun trying to get the temps down.

That's @ChemicalWorld's PC, but I am sure he would appreciate it smile.gif

This is my PO$ wink.gif

Leak testing
post #3544 of 7277
Thread Starter 
post #3545 of 7277
Quote:
Originally Posted by Laithan

Try running the FFXIV free benchmark @ 4K MAXIMUM preset and see what you get wink.gif

My system is a power hog, so this is SLI in a worst-case scenario, but it is possible.

Wow, that bench looks really pretty at 4K. The max I saw my meter go was 729W, and that was only for a brief moment; otherwise it hung around 650W being pulled from the wall.
post #3546 of 7277
I downloaded this bench to try it, and at 1455 MHz (default) I got a 5822 score (very high) on 4K Maximum. Is that ok? Temps were 73-74°C max.
post #3547 of 7277
Thread Starter 
Quote:
Originally Posted by OcSlave

Wow, that bench looks really pretty at 4K. The max I saw my meter go was 729W, and that was only for a brief moment; otherwise it hung around 650W being pulled from the wall.

That's pretty impressive, your system is very power efficient (I'm jelly). At least Intel did something in the past 8 years tongue.gif (dig).

Mine is ugly... see below, 1252W max (I just ran this test) tongue.gif I noticed there was a new version of Furmark (1.17.0.0), so I gave it a whirl to see how things were with RC2. I just pulled more power than ever before tongue.gif.




Some details (aka blab) for the audience:
I have a system idle draw of 411W, so to keep it simple let's just assume my GPUs are using zero power at idle (they are of course using some). If I take the maximum power draw of 1252W and subtract the entire 411W of idle power draw, that gives me 841W of draw that is pretty much all GPU, and really an under-estimate since I'm counting no power at all from the idle state. It does give me an idea of what both of my GPUs are pulling, since Furmark is a GPU power virus. 841W for BOTH GPUs = ~420W per GPU, which may still be slightly under-estimated (if we included idle power draw).

Since Furmark doesn't stress all the CPU cores that much (it varies a lot), there is additional power consumption to include in the total power supply calculation for the entire system. Take the worst-case power consumption of BOTH the CPU and the GPU fully loaded at the same time and add them together; that is your "TRUE" maximum overall potential power consumption. Take that value and add 20%. That's your recommended power supply size IMO.

So with that in mind, I was able to pull 1252W and all of my CPU cores were not even under full load!

I want others to see that it is possible to pull a lot of power so they don't under-estimate. The true answer to what power supply you need depends on your system, the components within it and the overclock you have on your stuff. I think we should try not to exceed 80% of the power supply's maximum wattage, just for efficiency (unless you've got a Gold/Platinum 80+ rating, where that may not be as important). Conservative estimates are always best when dealing with power IMHO.
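If you want to plug in your own numbers, here's a rough sketch of that math in Python. The 411W idle and 1252W Furmark figures are mine from above; CPU_EXTRA_W is just a placeholder for whatever your CPU would add under a real combined load, so treat the result as a ballpark, not a measurement.

```python
# Rough PSU sizing sketch using the wall-meter figures quoted above.
# CPU_EXTRA_W is a placeholder -- Furmark doesn't fully load the CPU,
# so plug in your own worst-case CPU number.

IDLE_WALL_W    = 411    # whole-system idle draw at the wall (W)
FURMARK_WALL_W = 1252   # peak wall draw with both GPUs loaded by Furmark (W)
NUM_GPUS       = 2
CPU_EXTRA_W    = 150    # hypothetical extra draw with the CPU also fully loaded (W)

gpu_load_w = FURMARK_WALL_W - IDLE_WALL_W   # ~841 W attributed to the GPUs
per_gpu_w  = gpu_load_w / NUM_GPUS          # ~420 W per card

true_max_w  = FURMARK_WALL_W + CPU_EXTRA_W  # CPU + GPU worst case together
recommended = true_max_w * 1.20             # add 20% headroom (~80-83% PSU load)

print(f"Per-GPU draw estimate: {per_gpu_w:.0f} W")
print(f"True worst-case draw:  {true_max_w:.0f} W")
print(f"Recommended PSU size:  {recommended:.0f} W")
```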

So there is overall system power draw, and there is just GPU power draw (which is what the MOD BIOS PSU recommendations refer to). The 980 Ti GPU BIOS MODs here are rated at 450W each, which is roughly 30W "more than needed" according to the Furmark estimates above, although those estimates also didn't include idle power draw. The 450W rating for the GPU itself using the MOD BIOS is indeed slightly conservative, but not by a lot once you factor in all the details plus the goal of not exceeding 80% of your PSU's total wattage. I have some info in the OP about this as well.

PS. The newer versions of Furmark now support SLI, which is why my older Furmark results (single GPU) weren't pulling as much power.

skull.gif

Quote:
Originally Posted by NikolayNeykov

I downloaded this bench to try it, and at 1455 MHz (default) I got a 5822 score (very high) on 4K Maximum. Is that ok? Temps were 73-74°C max.


Very good, much better than my score wink.gif
post #3548 of 7277
Laithan, can you take a look at this GTX 680 Lightning BIOS?
The power table shows 600 watts in all the fields in there: TDP, rails, PCI slot, power limit, everything.
I am using this BIOS on my 680 Lightning (in my other PC) and I have no problem with it; it works fine there (it's definitely an official BIOS from MSI, not tampered with or modded, and it's fully unlocked). Link to the topic, also from here:
http://www.overclock.net/t/1280007/official-msi-gtx-680-lightning-owners-club/0_100

Is it safe to assume that even putting 600 watts in all fields would not cause problems?

Maxwell II BIOS Tweaker can open the 680 Kepler BIOS just fine for this purpose.

680L_unlocked.zip 56k .zip file
Edited by mahanddeem - 2/13/16 at 6:39pm
post #3549 of 7277
Thread Starter 
Quote:
Originally Posted by mahanddeem

Laithan, can you take a look at this GTX 680 Lightning BIOS?
The power table shows 600 watts in all the fields in there: TDP, rails, PCI slot, power limit, everything.
I am using this BIOS on my 680 Lightning and I have no problem with it; it works fine there.
Is it safe to assume that even putting 600 watts in all fields would not cause problems?

Maxwell II BIOS Tweaker can open the 680 Kepler BIOS just fine for this purpose.

680L_unlocked.zip 56k .zip file

Thanks, I have KBT (Kepler BIOS Tweaker) also. I think all those values are too aggressive, potentially not even safe. You can't just set everything to 600W smile.gif

Default and max power values should never be identical either. Set the default slightly lower than the max if you want to force maximum power all the time.
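For example (hypothetical numbers): set the power limit's Max to 450W and its Def to something like 425W. The card then defaults to essentially its full power budget without the Def and Max fields being identical, rather than stamping 600W into every field of the table.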
post #3550 of 7277
MSI made a mistake in this BIOS? biggrin.gif

And by "never be identical", do you mean the TDP table or the power limit table?