
GIGABYTE GTX 9xx H2O/AIR BIOS Tweaking ┌(ô益ô)┐ - Page 347

post #3461 of 7337
Thread Starter 
Quote:
Originally Posted by NikolayNeykov View Post

My first video card was a GeForce 2 and it was so good :D

It required massive cooling :)


Quote:
Originally Posted by Edkiefer View Post

Yup, I still have an S3 ViRGE 4MB primary card that I ran with two Voodoo2s back in the day.
They used to talk about 8-32MB cards; now we're at 2-16 gig.

OMG, you brought back memories! The S3 ViRGE was so bad too, after such success with the Vision 64 (Stealth) cards. I'll never forget my Diamond Stealth 64 VLB :) For those of you who weren't around then, VLB stands for "VESA Local Bus" and came BEFORE PCI, which was before AGP, which was before PCI-e lol. It was a faster interface: it combined the 16-bit ISA slot with an additional connector for higher bandwidth (take a look at that extra connector sometime, look familiar? :)).




Quote:
Originally Posted by drop24 View Post

I have a G1 that came with the F10 BIOS. I use a DisplayPort cable, though. Is there any way for me to get the DD BIOS for you? Let me know and I'll upload it.

:D

Can you use a DVI cable at all? That would be the easiest way. The reason is that although the DD BIOS does support DisplayPort, I think that when you have ONLY a DisplayPort cable connected, the GPU will always default to the DP BIOS.



You can try using that DisplayPort output supported by DD shown above (plug it in while the PC is fully OFF). When you turn it on, it will either (1) give you a black screen at boot, meaning it doesn't like to use THAT output for a single-monitor connection, (2) choose the DP BIOS because that port is shared by both, or (3) get lucky and activate the DD BIOS. The only way I know of to GUARANTEE the card will choose the DD BIOS is to use a DVI connector on the DVI-D port.

Does your monitor offer a DVI input as well? Do you have a DVI -> DP adapter, or maybe a DVI -> HDMI adapter if your monitor supports HDMI?

Thank you for trying!
Edited by Laithan - 2/7/16 at 1:23pm
post #3462 of 7337
Quick question...

With regard to MSI Afterburner, which setting is best to use:

Prioritize temp limit or power limit?

Cheers
post #3463 of 7337
Thread Starter 
Quote:
Originally Posted by combat fighter View Post

Quick question...

With regard to MSI Afterburner, which setting is best to use:

Prioritize temp limit or power limit?

Cheers

Here you go :)
On AIR: temp limit
On H2O: power limit

https://forum-en.msi.com/index.php?topic=161235.0

Quoted text:
"Temp Limit

With the new GeForce GTX7xx / Titan series vgas came an enhanced version of the boost introduced with GTX6xx vgas. Boost 2.0 adds a Temp Limit to the previous Power Limit. So now the available boost is not only determined by the Power Limit but also by the set Temp limit. To change the Temp limit you need a supporting version of Afterburner (3.00 Beta 10 and above). Click on the down arrow next to the Power limit slider and Temp Limit becomes available.

--->

By default Temp limit is set to a target temp of 79°C what means anything below will allow max boost. Increasing Temp limit will allow boosting at higher temps. In standard configuration Power and Temp limit are linked what means increasing Power Limit will also increase Temp limit. If you want to change that and set individually uncheck "Link". Clicking on "Prioritize" you can decide if Temp or TDP Limit should be the primary factor to determine boost. "

post #3464 of 7337
Hello guys, questions please:
1. When artifacts appear while overclocking the GPU, is it a problem with the core or the VRAM?
2. Does more voltage cure artifacts?
3. Does increasing voltage help the VRAM overclock better (from 7 to 8 Gbps)? Or does it only help the core OC?
4. Why does GTA V artifact more than other games? Scientific answer please; "...pushes hardware too much..." isn't a scientific answer.

Thanks
post #3465 of 7337
Thread Starter 
Quote:
Originally Posted by mahanddeem View Post

Hello guys, questions please:
1. When artifacts appear while overclocking the GPU, is it a problem with the core or the VRAM?
2. Does more voltage cure artifacts?
3. Does increasing voltage help the VRAM overclock better (from 7 to 8 Gbps)? Or does it only help the core OC?
4. Why does GTA V artifact more than other games? Scientific answer please; "...pushes hardware too much..." isn't a scientific answer.

Thanks

Stock or MOD BIOS?

(1) Could be either, since the artifacts weren't described. Simple test: repeat with the same GPU overclock but with the memory @ stock. Sometimes artifacts can be caused by poor contact between the GPU and cooler (the factory doesn't always get everything perfect) and/or poor contact with the VRMs and memory chips. Sometimes removing the cooler and replacing the thermal compound and heat-transfer pads with upgraded ones can help. Other times it is just a bad GPU and needs to be replaced (like one of mine that I had to return).

(2) There is a relationship, but it's indirect rather than direct, and no single answer applies universally to all GPUs. Increased voltage can/will increase heat on both the GPU and VRMs, which can create a condition where artifacts are more prone to occur. On the other hand, increasing the voltage could possibly remove artifacts if they were caused by insufficient voltage, but I wouldn't say this is common; it's probably a very rare occurrence. Artifacts are, 98% of the time, the result of an overclock the GPU cannot handle. It is then up to us as consumers to determine whether that maximum stable overclock falls into the "defective GPU" category (obviously, if you cannot even overclock to what a reference GPU does on average, why did you pay extra for the premium PCB?). If you have artifacts, you are pretty much limited to backing off to the overclock where they no longer occur and settling for that, attempting to correct a factory thermal application/contact issue, or replacing the GPU. The GPU I had replaced was producing artifacts very easily and was clearly defective (I was getting them at less than 1430 MHz). The replacement doesn't produce ANY at all; it just TDRs when pushed too hard.
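If it helps to make that process systematic, the "back off until they stop" search is just a bisection over the core offset. A rough sketch (is_stable() is hypothetical shorthand for running your artifact-prone workload at that offset and watching for corruption/TDRs; the 13 MHz step matches Maxwell's boost bin granularity):

```python
# Rough sketch of "back the clock off until artifacts stop" as a bisection
# over the core offset in MHz. is_stable() is a hypothetical stand-in for a
# manual test run at that offset -- there is no automated artifact detector.
def find_max_stable(is_stable, lo=0, hi=300, step=13):
    """Largest core offset in [lo, hi] MHz that passes, assuming lo passes."""
    best = lo                            # offset 0 == stock, assumed stable
    while hi - lo >= step:
        mid = (lo + hi) // 2
        if is_stable(mid):
            best, lo = mid, mid + step   # passed: search higher
        else:
            hi = mid - step              # artifacted: back off
    return best
```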

(3) It has no effect at all on overclocking the RAM. Memory voltage is controlled independently, and only a few GPUs even offer memory voltage adjustment (e.g. the MSI Lightning). From what I understand, adjusting the memory voltage is pointless anyway. MOST people can run 8 GHz (effective) memory with no issue. Some are pushing to 8.5 GHz and 9 GHz+, all on stock memory voltage. I don't think you would gain anything other than the bragging rights of being able to adjust memory voltage :P
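For reference on those numbers: GDDR5 transfers data four times per clock, so the 7 GHz / 8 GHz figures are effective rates, not the actual memory clock. A quick sanity check (note that monitoring tools differ in whether they show the actual clock or a doubled rate):

```python
# GDDR5 is quad-pumped: effective rate = actual memory clock x 4.
actual_mhz = 1753            # GTX 980 stock memory clock (actual)
print(actual_mhz * 4)        # 7012 -> the stock "7 GHz effective"
print(2000 * 4)              # 8000 -> the "8 GHz effective" overclock
```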

(4) Because it scientifically pushes your GPU harder. It's next gen... :P Shaders are more complex, textures are higher-res and larger, more complex graphics are programmed, higher detail, more polygons, and other features like PhysX, GameWorks tech, ambient occlusion, PCSS shadows... the list goes on. Older titles just don't have the technical capabilities in the game engine that next-gen titles do. Look at the difference between The Witcher 2 and The Witcher 3...

PS. If you want to find out what your stable GPU overclock is, run the free FFXIV benchmark with MAXIMUM settings @ 4K resolution (or DSR 4x); you'll most likely have to lower your overclock a little to pass that test. It's next-gen, not some Firestrike test :)
post #3466 of 7337
Heya Laithan, as a suggestion: that picture you post to help determine whether you are on the DD or DP BIOS isn't very helpful. All it shows is the exact same connectors, some dark in one picture, some dark in the other, and a bunch blue in both, but there's no legend or explanation of what the colors mean. Does it mean "if your cable is plugged into a dark connector, you have this BIOS"? It's not as self-explanatory as you may think. It doesn't matter to me, I know I have DP because I only use DisplayPort, but if I weren't and was trying to figure it out, that picture wouldn't help me much.
    
CPU: Intel 5930K | Motherboard: RAMPAGE V EXTREME | Graphics: NVIDIA GeForce GTX 980 Ti (x2)
RAM: G.Skill 32GB DDR4-2666MHz 15L | Hard Drive: Intel 750 PCIe SSD 1.2TB | Cooling: Thermaltake Water 3.0 Ultimate | OS: Windows 10 64-Bit Professional
Monitor: ROG Swift | Keyboard: Saitek Eclipse II | Power: EVGA SuperNOVA 1200 P2 | Case: Thermaltake V71
Mouse: Razer A5090 Master of Destiny
    
post #3467 of 7337
Thread Starter 
noitailimuh cilbup
post #3468 of 7337
Quote:
Originally Posted by Laithan View Post

Stock or MOD BIOS?

Thanks Laithan for your quality reply :)
I was asking about a modded BIOS (or software OC). My 980 G1 is one of the best cards I've ever purchased, and I've had many NVIDIA cards over the last 12 years. Rock solid at stock, and it can push 1530/8000 with the stock voltage of 1.212 V (temps never exceed 65°C with the OC), EXCEPT in GTA V, where I get purple artifacts. ASIC is 75.6%.
My MSI 680 Lightning has VRAM and AUX voltage adjustment; you're right.

For my 980 G1, I want to mod my BIOS, and my question is: what values do I need to put in the power table to have the maximum power possible, so I don't have to use the MSI AB power slider (i.e. keep it at 100%)? Or do you advise against doing that?
As I understand it, for the stock 980 G1 BIOS, 336 watts (MSI AB power at 122%) is the max? I feel the card can pass that wattage running at 1530 MHz+.
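Sanity-checking that figure, assuming the 100% slider position corresponds to the board's default power target (just arithmetic, not actual power-table entries):

```python
# If 122% on the slider equals 336 W, the 100% target works out to ~275 W.
max_watts = 336
max_slider = 1.22
print(round(max_watts / max_slider))   # 275
```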
I am using the F3 DD BIOS
Thanks
post #3469 of 7337
Sorry, I don't have a DVI cable or a DVI input on my monitor.
post #3470 of 7337
I have flashed the RC1 BIOS for the 980 Ti G1 and I get a PerfCap in FurMark... :) PerfCap reason: "Pwr". Is that normal? Just wanted to make sure. ;)
Gaming Rig (12 items)
CPU: Intel Core i7 6700K | Motherboard: Asus Maximus VIII HERO | Graphics: Asus GTX 1080 ROG Strix OC | RAM: 4x8GB G.Skill Ripjaws V 3200MHz
Cooling: Corsair H110i GTX | OS: Windows 10 Pro 64 | Monitor: DELL U2711 | Keyboard: Logitech G710+
Power: EVGA SuperNOVA 1200W P2 | Case: HAF X | Mouse: Logitech G502 | Audio: FiiO E10