
GIGABYTE GTX 9xx H2O/AIR BIOS Tweaking ┌(ô益ô)┐ - Page 570

post #5691 of 7438
Quote:
Originally Posted by jackspadeza View Post

Sup bro! There's a video from JayzTwoCents on YouTube where he broke the fan blade and basically just jerry-rigged a new fan by cutting the wires and attaching a new fan directly to the card. However, just make sure your fan can reach 4000rpm and has good static pressure (SP), as these are beefy fans on the rad. I wouldn't recommend running one from your motherboard's connector, as fan speeds should fluctuate with the card's temps!


I would also like to know about the extra six pin. However I did some testing and it didn't help at all. Clock speeds were still stable at the previous clock you were able to get. And the voltage was stable with just the two 8 pins anyway

I think the extra six pin is for Liquid Nitrogen Cooling (LN2) when they are pushing the voltage much further.

Thanks, I already use Corsair SP120s for push/pull. I set them at 100% and they're quieter and perform better than the stock fan at 50%+. My idle temps with the adaptive power setting in the Nvidia Control Panel are 24-27C, and at full load they never exceed 61C even after 2-4 hours of gaming with my OC. The stock fan just lies at the bottom of my case and makes vibrations and noticeable noise. I saw the JayzTwoCents video and just want to know if someone has done it like him. Online there are only a couple of people who did it, with no additional info. So... just cut off the wire and it will not affect the pump?
post #5692 of 7438
If it were me, I wouldn't do it. The stock fan works perfectly on its own, and I'm pretty sure those Corsair fans don't reach anywhere near that RPM.

Take the stock fan off while leaving it connected via the wire, put the Corsair fan on connected to your MB, and check your temps in that config. If it's fine, then I suppose you're good!
Edited by jackspadeza - 7/6/16 at 8:32am
post #5693 of 7438
Quote:
Originally Posted by jackspadeza View Post

Take the stock fan off while leaving it connected via the wire, put the Corsair fan on connected to your MB, and check your temps in that config. If it's fine, then I suppose you're good!

Already did it. Temps from my post #5691 with swapped fans.
post #5694 of 7438
Oh sorry, my bad. Maybe if I opened my eyes... Cutting the wire will not affect the pump. Just make sure you insulate it and you're set.
post #5695 of 7438
Quote:
Originally Posted by Kbird View Post

Quote:
Originally Posted by NikolayNeykov View Post

Moved back to the 361.75 driver for my 980 Ti, as I saw on the Nvidia forums that it's quite good for that card. I think these new drivers make my card crash even when not overclocked; I will report what is going on later.

PS: There is one new thing that popped up in my taskbar icons on the right saying "Safely remove hardware and eject media". The Nvidia 980 Ti is listed there along with printers and devices. Why does this icon appear?



I also have my GTX 970 show up in the USB safe-eject area with certain NVidia drivers, though not with the latest 368.39; not sure why.


361.75 was apparently one of the first drivers with Thunderbolt support included; many have the eject issue....

https://www.reddit.com/r/nvidia/comments/42xqm0/geforce_game_ready_driver_36175_whql/
KB2
(17 items)
CPU: Intel Core i7 5820K
Motherboard: X99-A II
Graphics: Gigabyte G1 - GTX 970
RAM: Corsair Vengeance 32GB @ 3000MHz
Hard Drive: Samsung 850 EVO 250GB
Hard Drive: WD Black 640GB x3 in RAID 0 array
Optical Drive: DVD RW
Cooling: Corsair H80iGT water cooler
Cooling: (3) x 200mm case fans
OS: Win10
Monitor: (2) BenQ BL3200 32" + (2) NEC 23"
Keyboard: M$ Natural
Power: Corsair TX
Case: HAF 922
Mouse: Logitech RB-22 Trackball
Mouse Pad: Mionix SARGAS Desktop
Audio: Onboard
post #5696 of 7438
There is a new driver, 368.69. I installed it because there are no 1080 or 1070 GPUs mentioned all over the description like the previous one (I don't want to see any Pascal stuff). For now it's stable with my OC after half an hour of Witcher 3; I will test it for longer after a while.
Edited by NikolayNeykov - 7/6/16 at 11:11am
post #5697 of 7438
NVIDIA to Unveil GeForce GTX TITAN P at Gamescom

NVIDIA is preparing to launch its flagship graphics card based on the "Pascal" architecture, the so-called GeForce GTX TITAN P, at the 2016 Gamescom, held in Cologne, Germany, between 17-21 August. The card is expected to be based on the GP100 silicon, and could likely come in two variants - 16 GB and 12 GB. The two differ by memory bus width besides memory size. The 16 GB variant could feature four HBM2 stacks over a 4096-bit memory bus; while the 12 GB variant could feature three HBM2 stacks, and a 3072-bit bus. This approach by NVIDIA is identical to the way it carved out Tesla P100-based PCIe accelerators, based on this ASIC. The cards' TDP could be rated between 300-375W, drawing power from two 8-pin PCIe power connectors.

The GP100-based GTX TITAN P isn't the only high-end graphics card targeted at gamers and PC enthusiasts; NVIDIA is also working on the GP102 silicon, positioned between the GP104 and the GP100. This chip could lack the FP64 CUDA cores found on the GP100 silicon, and feature up to 3,840 CUDA cores of the same kind found on the GP104. The GP102 is also expected to feature a simpler 384-bit GDDR5X memory interface. NVIDIA could base the GTX 1080 Ti on this chip.

https://www.techpowerup.com/223895/nvidia-to-unveil-geforce-gtx-titan-p-at-gamescom

thumb.gifthumb.gifthumb.gif
post #5698 of 7438
Thread Starter 
Quote from Bluesnews:

"VIVE & GTX 10 Series Display Port Issue

A post on Tom's Hardware warns of a compatibility issue between the HTC Vive and the Display Ports on NVIDIA's new 10 series video cards which prevents the VR headset from being able to use the optional DP interface. This is an ongoing issue, as there's a thread on the GeForce Forums and a thread on reddit on the topic, and each was started over a month ago. In spite of this, there has not yet been an official comment on the topic from NVIDIA, who have touted VR capabilities as one of the strengths of the 10 series. Thanks Shacknews via nin. "


lachen.gif

I mean really... this is the "Ultimate VR GPU" ?!?! You mean NOBODY TESTED AN HTC VIVE?!?!

4 words: More testing, Less marketing..


lachen.gif

Egg on face!!

Quote:
Originally Posted by Hunched View Post

I've finished my extensive benchmarking of my Gigabyte 1070 WindForce OC and I'm pretty disappointed, at best it's "lower middle-class" if that. I came from nearly a top 1% GTX 970 which doesn't help.
Here's my journey.

1. Setup
I started off after a DDU uninstall and complete clean install of just the newest driver and PhysX without changing anything in NVCP.
I ran a Firestrike benchmark without touching a thing right after.
Stock Firestrike Graphics Score = 18731

2. Core Overclocking
I installed MSI AB and maxed the power and temp limits (111%/92c) and increased the core clock by +10mhz each successful run.
I stopped at +110mhz where it usually crashed before a run could finish, and settled at +100mhz as there was no artifacting or any visual weirdness, just program crashes at +110mhz.
With only temp and power limit maxed and a core of +100mhz I got a Firestrike Graphics Score of 19348.
With the exact same settings but the fan at 100% the whole run I got a graphics score of 19498.
19498 GPU-Z results (screenshot in spoiler)

3. Memory Overclocking
With an established, weak +100mhz core increase that stays under 2000mhz 99% of the time under load... I moved on to memory.
Increasing in increments of 100mhz, it wasn't until +700mhz that visual artifacting began. It persisted at +650mhz, so I settled on +600mhz.
With this my Firestrike Graphics Score ended up at 20600.
20600 GPU-Z results (screenshot in spoiler)

4. The Frustration
I'm disappointed that my core is rarely ever over 2000mhz while others are getting 2100mhz+ with ease, and graphics scores of 21500+ while I'm stuck around 20500.
It isn't even 100% stable at 2000mhz either, it crashed once after hours of running. It looks like I'll probably have to lower things even further if I have issues while gaming, memory included.

What makes no sense is how the core clock causes crashes, there are no signs of instability visually or from GPU-Z like you would expect.
When the memory was reaching its OC limits, visual errors appeared before 3DMark or anything else crashed, you would expect this of the core as well.
When the core crashes, the driver never crashes, nothing spikes or drops prior to it happening in GPU-Z, there are no artifacts or visual issues. It's like something pulls the plug.

More frustration comes from the fact that increasing core voltage does nothing to help. It gives 0 overclocking headroom and in fact seems to add instability to your overclock. headscratch.gif
The same clocks that cause 3DMark to hard crash cause Battlefield 4 to completely lock up and freeze after a short while. Again, no artifacts or any visual errors of any kind.
No display driver crashes, no visual instability, no GPU-Z drops or spikes prior to crashing, it's as if it just doesn't get enough power for a split second, though the voltage does not drop.
The way the core becomes unstable and crashes is sudden and weird, I feel there's something wrong with the BIOS possibly causing this limit.
There should be other signs of instability somewhere, this is abnormal and abrupt.

Here are GPU-Z results of a crash with the voltage maxed out. Besides the maxed core voltage, the settings are identical to Part 2 Core Overclocking; the core is still only +100mhz, but the increased voltage gave a higher max core. The max core was not what GPU-Z recorded right before the crash; it was at just 2012mhz.
Increasing core voltage from 1.0620 to 1.0930 makes it crash more, and doesn't allow for any more mhz on the core clock. thumbsdownsmileyanim.gif
If I increase my core voltage, I have to lower my core clock to maintain stability. doh.gif (GPU-Z screenshot in spoiler)

5. Conclusion
So I'm not too happy. My GTX 970 had a Firestrike Graphics Score of 14318 for what it's worth with this same system. Top tier 970 to bottom tier 1070.
I suppose at the end of the day, 14318 to 20600 is a 43.87% increase, and my stock 1070 score of 18731 to 20600 is a 9.97% increase.
This is assuming I don't have to lower my core clock to +90mhz or less, which I may have to, which is pathetic; hopefully it will hold up in games since it usually does in Firestrike.
Mem might have to go down to +575/550mhz if I ever see anything weird again but that's still pretty good, better luck than my core OC quite obviously.

If I could have achieved a score of 21750 like some people on 3DMark, that would be a pretty juicy 16% boost over my stock score.
I can't check my ASIC with GPU-Z to see just how bad it is since it's not yet supported, but I'm guessing 65% or less.
Lastly, my fans have been weirdly loud occasionally during RPM changes, they hit some real turbulence or something, vibrations. Hopefully that happens less.

Maybe I will be able to hit 2100mhz core when custom BIOS's are available, which will make it so increasing your core voltage ACTUALLY HELPS.
That could be useful, having a voltage increase that increases OC headroom and stability instead of ruining it tongue.gif


Thank you for the quality, comprehensive and detailed post! thumb.gif
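
For anyone who wants to repeat that kind of step-and-test loop without babysitting it, here's a rough sketch. It assumes a Linux box with Coolbits enabled so nvidia-settings can apply the offset (on Windows you'd just bump the slider in Afterburner by hand), the perf level index of 3 is card-dependent, and the stress-test command is a placeholder for whatever looping load you trust:

```python
# Rough sketch of the "+10MHz per run, back off on the first crash" approach.
# Assumes Linux with Coolbits enabled so nvidia-settings can set the offset;
# GPUGraphicsClockOffset is the stock attribute for this, but the perf level
# index (3 here) can differ per card. STRESS_CMD is a placeholder.
import subprocess

STRESS_CMD = ["./gpu_stress_loop.sh"]   # placeholder: any looping GPU load
STEP_MHZ = 10
MAX_OFFSET_MHZ = 300
PERF_LEVEL = 3

def set_core_offset(mhz):
    attr = f"[gpu:0]/GPUGraphicsClockOffset[{PERF_LEVEL}]={mhz}"
    subprocess.run(["nvidia-settings", "-a", attr], check=True)

def survives(seconds=300):
    """True if the load runs the whole window; a crash makes it exit early."""
    try:
        subprocess.run(STRESS_CMD, timeout=seconds)
        return False                    # exited before the timeout = crashed
    except subprocess.TimeoutExpired:
        return True                     # still running when time ran out

best = 0
for offset in range(STEP_MHZ, MAX_OFFSET_MHZ + 1, STEP_MHZ):
    set_core_offset(offset)
    print(f"testing +{offset} MHz...")
    if not survives():
        print(f"crashed at +{offset} MHz, backing off")
        break
    best = offset

set_core_offset(best)
print(f"settled at +{best} MHz")

# Same arithmetic as the score comparison in the post:
old_score, new_score = 14318, 20600
print(f"gain over the 970: {(new_score - old_score) / old_score * 100:.2f}%")
```

Obviously a 3DMark run can't be scripted that cleanly on Windows, but the pass/fail logic is the same whether you move the slider by hand or not.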

I can't say I disagree with any of your points...except that ASIC% makes any difference wink.gif My theory, and I welcome anyone to prove it one way or the other, is that the days when ASIC% meant anything are over. We saw what little difference ASIC% made with Maxwell, and I think Pascal is going to reduce the variables even further, PRIMARILY because these chips are able to run ~2000Mhz with 1 volt.... Think about that... ASIC% is voltage leakage... We've also seen how custom PCBs and better cooling don't really improve overclocks. I think Pascal is PURE silicon lottery this time with very little indirect relation to ASIC%...

How to verify: we just need multiple 1080 owners to post their highest O/C and the peak voltage that was witnessed. I can almost guarantee we will see a pattern where the SAME voltage sits behind the whole spread of overclocks, from the 19xx results all the way through the 22xx results.
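
If people do post numbers, the comparison itself is trivial to run. Here's a toy sketch; the samples below are made-up placeholders, not real submissions. If the theory holds, you'd see a wide clock spread, almost no voltage spread, and a weak correlation:

```python
# Toy sketch of the "collect and compare" idea above. The samples are made-up
# placeholders, NOT real submissions. If the theory holds, the max clock
# varies a lot while peak voltage barely moves, and the correlation is weak.
from statistics import mean, pstdev

# (peak voltage under load in V, max stable core clock in MHz) -- hypothetical
samples = [
    (1.062, 1987), (1.062, 2050), (1.050, 2114),
    (1.062, 2076), (1.043, 2139), (1.062, 2012),
]

volts = [v for v, _ in samples]
clocks = [c for _, c in samples]

def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (pstdev(xs) * pstdev(ys))

print(f"voltage spread : {max(volts) - min(volts):.3f} V")
print(f"clock spread   : {max(clocks) - min(clocks)} MHz")
print(f"voltage vs clock correlation: {pearson(volts, clocks):+.2f}")
```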



Quote:
Originally Posted by haofan7 View Post

Hey, yes, I've also tried it on my other PC with an ASRock mobo (mine was an ASUS); it's not working either, same B2 error. I've read some posts about a driver bricking GPUs, but it seems to be fixable with a BIOS flash; I'm not really sure. However, I've tried all the stock and custom BIOSes here and it still doesn't boot, so I'm really out of ideas. So I'll have to look at RMAing this boy I guess, just unsure if they would allow it.

I was reading about this issue last night a little bit on my tablet and you're not alone...this has been an issue for YEARS also and you may just be lucky enough to have 2 boards that both need a BIOS update. From what I read, I don't think your GPU is the root cause either.. WAIT before you RMA. I would hate to see you do that for nothing...

The most common suggestions I have read were to (1) update the motherboard BIOS (I think this has to do with UEFI compatibility), (2) disable UEFI and use legacy boot, or (3) increase a voltage somewhere (sorry, I don't remember which).

http://lmgtfy.com/?q=B2+MOTHERBOARD+ERROR



Quote:
Originally Posted by nerko992 View Post

jackspadeza, I sent you my stock BIOS.

Hey guys.
I have a couple of questions about the 980 Ti WaterForce.
1) I hate the stock fan and want to replace it, but the fan wire has the same plug as the pump wire. Can I just cut off the fan wire, insulate it with electrical tape or heat shrink, and run new fans from the motherboard? Will it affect pump performance? I tried asking Gigabyte support, but they didn't provide any info and just said "it's your card, do what you want".
2) For the custom BIOS, do I need to use the PCIe power connectors 8+8+6, or only 8+8?

Thanks in advance.


1) I don't know for sure if that would affect the pump, to be honest; I'd be lying if I said I did... so you'd have to roll the dice. At least you could always re-splice the wires back together if needed. I know that the GPU BIOS controls that fan, so of course no more fan readings or custom fan curves, but installing other fans off the motherboard headers can surely only improve cooling... Why not use a PUSH/PULL setup: keep the stock fan and just add another one on the other side? It would most likely cool better than any single fan.

2) Only 8+8.. Don't enable the LN2 switch, we are only modding the DS BIOS here. The additional power isn't useful/needed on AIR/H2O..


Quote:
Originally Posted by jackspadeza View Post

Sup bro! There's a video from JayzTwoCents on YouTube where he broke the fan blade and basically just jerry-rigged a new fan by cutting the wires and attaching a new fan directly to the card. However, just make sure your fan can reach 4000rpm and has good static pressure (SP), as these are beefy fans on the rad. I wouldn't recommend running one from your motherboard's connector, as fan speeds should fluctuate with the card's temps!


I would also like to know about the extra six pin. However I did some testing and it didn't help at all. Clock speeds were still stable at the previous clock you were able to get. And the voltage was stable with just the two 8 pins anyway

I think the extra six pin is for Liquid Nitrogen Cooling (LN2) when they are pushing the voltage much further.

^ well there ya go smile.gif


Quote:
Originally Posted by nerko992 View Post

Thanks, I already use Corsair SP120s for push/pull. I set them at 100% and they're quieter and perform better than the stock fan at 50%+. My idle temps with the adaptive power setting in the Nvidia Control Panel are 24-27C, and at full load they never exceed 61C even after 2-4 hours of gaming with my OC. The stock fan just lies at the bottom of my case and makes vibrations and noticeable noise. I saw the JayzTwoCents video and just want to know if someone has done it like him. Online there are only a couple of people who did it, with no additional info. So... just cut off the wire and it will not affect the pump?

See what I get for not reading posts in reverse order tongue.gif


Quote:
Originally Posted by combat fighter View Post

NVIDIA to Unveil GeForce GTX TITAN P at Gamescom

NVIDIA is preparing to launch its flagship graphics card based on the "Pascal" architecture, the so-called GeForce GTX TITAN P, at the 2016 Gamescom, held in Cologne, Germany, between 17-21 August. The card is expected to be based on the GP100 silicon, and could likely come in two variants - 16 GB and 12 GB. The two differ by memory bus width besides memory size. The 16 GB variant could feature four HBM2 stacks over a 4096-bit memory bus; while the 12 GB variant could feature three HBM2 stacks, and a 3072-bit bus. This approach by NVIDIA is identical to the way it carved out Tesla P100-based PCIe accelerators, based on this ASIC. The cards' TDP could be rated between 300-375W, drawing power from two 8-pin PCIe power connectors.

The GP100-based GTX TITAN P isn't the only high-end graphics card targeted at gamers and PC enthusiasts; NVIDIA is also working on the GP102 silicon, positioned between the GP104 and the GP100. This chip could lack the FP64 CUDA cores found on the GP100 silicon, and feature up to 3,840 CUDA cores of the same kind found on the GP104. The GP102 is also expected to feature a simpler 384-bit GDDR5X memory interface. NVIDIA could base the GTX 1080 Ti on this chip.

https://www.techpowerup.com/223895/nvidia-to-unveil-geforce-gtx-titan-p-at-gamescom

thumb.gifthumb.gifthumb.gif


biggrin.gif nice post! smile.gif


I'm still waiting to validate my prediction; however, since NVIDIA wants to market an "in between" GP102 (more stinking marketing!! ugh), it looks like I'll be wrong for that reason alone.
http://www.overclock.net/t/1544574/gigabyte-gtx-9xx-10xx-g1-gaming-h2o-air-bios-tweaking/4950#post_25170869


Speaking of marketing... with all these artificial price hikes, and especially with GP102, who wants to bet the Titan P will cost $1500? I'm saying it now... (I hope I am wrong).
post #5699 of 7438
Thread Starter 
Quote:
Originally Posted by Hunched View Post

I've finished my extensive benchmarking of my Gigabyte 1070 WindForce OC [...] It looks like I'll probably have to lower things even further if I have issues while gaming, memory included.

(A) What makes no sense is how the core clock causes crashes, there are no signs of instability visually or from GPU-Z like you would expect.
When the memory was reaching its OC limits, visual errors appeared before 3DMark or anything else crashed, you would expect this of the core as well.
When the core crashes, the driver never crashes, nothing spikes or drops prior to it happening in GPU-Z, there are no artifacts or visual issues. It's like something pulls the plug.

(B) More frustration comes from the fact that increasing core voltage does nothing to help. It gives 0 overclocking headroom and in fact seems to add instability to your overclock. headscratch.gif
The same clocks that cause 3DMark to hard crash cause Battlefield 4 to completely lock up and freeze after a short while. Again, no artifacts or any visual errors of any kind.
No display driver crashes, no visual instability, no GPU-Z drops or spikes prior to crashing, it's as if it just doesn't get enough power for a split second, though the voltage does not drop.
The way the core becomes unstable and crashes is sudden and weird, I feel there's something wrong with the BIOS possibly causing this limit.
There should be other signs of instability somewhere, this is abnormal and abrupt.

(C) Maybe I will be able to hit 2100mhz core when custom BIOS's are available, which will make it so increasing your core voltage ACTUALLY HELPS.
That could be useful, having a voltage increase that increases OC headroom and stability instead of ruining it tongue.gif


Some feedback on your frustration.

(A) It is normal for some GPUs to TDR without ANY visual artifacts beforehand. I cannot explain why, but I've personally had 4 Maxwell cards now (980's and 980Ti's), and the behavior when I crashed from an overclock that was too high varied from GPU to GPU. As a matter of fact, when I was pushing super high overclocks on my 980Ti's (this occurred at 1600Mhz+), I noticed that I would see artifacts but could remove them by LOWERING the MEMORY OVERCLOCK... I know it's a bit strange, but I think it has to do with power draw; overclocked GPU memory is a power hog. It's like I reached some kind of overall max limit, and if I wanted to push my GPU clocks higher I could only do so by lowering the memory overclock. I was able to reproduce this and it scaled. Currently my 980Ti's are under water, and since it's a full-cover block I am directly cooling the memory and VRMs, so thermal differences can produce different behavior even on the same GPU. The fact that we KNOW GP104 is power limited on all GPUs with a single 8-pin PCI-e power connector could be a sign of the same pattern I experienced. IE: lower your memory overclock, even underclock it as a test... I'm curious whether stability improves, indicating another power restriction.
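
If anyone wants hard numbers while running that test, a quick logger like the sketch below will capture power draw and clocks once a second (single GPU assumed; the query field names come from nvidia-smi --help-query-gpu and may vary slightly between driver versions). Watch whether power.draw sits pinned at the board limit and whether the core clock climbs when the memory offset comes back down:

```python
# Quick logger for the "is it a power limit?" test above. Polls nvidia-smi
# once a second and appends to a CSV; stop it with Ctrl+C. Single GPU assumed.
import csv
import subprocess
import time

FIELDS = "timestamp,power.draw,clocks.gr,clocks.mem,temperature.gpu"

def sample():
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader,nounits"],
        text=True,
    )
    return [field.strip() for field in out.strip().split(",")]

with open("oc_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(FIELDS.split(","))
    try:
        while True:
            writer.writerow(sample())
            f.flush()
            time.sleep(1)
    except KeyboardInterrupt:
        pass
```

Drop the memory offset partway through a run and the before/after comparison is obvious in the resulting CSV.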

(B) If the voltage is THAT SENSITIVE we've got problems. I think everyone is seeing this... it may still be early.. and maybe we can pump 1.25v into Pascal and see 2500Mhz+ but we just can't really MOD yet to prove it (infancy stages) frown.gif

sozo.gif


(C) If we can gain access to the Pascal BIOS through an editor like we had with Kepler and Maxwell, we WILL be able to modify/improve behavior and parameters. If the same tweaks apply, we should be able to remove perfcaps, increase power limits and remove throttling (among other things), even if we cannot increase overclocking headroom. We actually started out with a 980Ti BIOS that we could not fully tweak; it was considered a "LOCKED" BIOS, and 2 extremely valuable sliders for modding the voltage were missing.

Perhaps we just need to be patient but I'm actually thinking we might have our overclocking headroom hands tied this time around. Again, I hope I am wrong.
post #5700 of 7438
Thanks for the detailed replies Laithan! thumb.gif I don't think I really have anything to add, I agree with all that you've said. laughingsmiley.gif
Gonna just keep hoping for that Pascal BIOS editor.

Previously I mentioned my fans making annoying sounds, and they still do; every time they exit 0rpm mode there's a grind-like sound.
So I'm going to try to turn this into a positive and get an RMA out of it, and another shot at the lottery; if my 1070 overclocked like a beast I'd probably just live with it.
I heavily value silence, but performance a bit more.

Gigabyte's QA seems to have dropped with the 1000 series; it's likely I'll get a replacement with the same fan issue, or worse, I'll get coil whine, which I currently don't have.
NewEgg strictly offers exact-replacement returns only, no refunds. I'd really like to RMA-upgrade to a 1070 with better fans and pay the difference... MSI has the best quality fans, correct?

I have a silence optimized case filled with Noctua fans and a PSU that is fanless 99% of the time because of its efficiency.
I'll be damned if I'm stuck with trash tier fans and a trash tier overclock, I want both to be good but I'll settle for at least one being fine.
Silence and performance are all that matter to me, and my Gigabyte 1070 has screwed me on both.

I can't find much so I'd like to know...
Owners of the Gigabyte G1 or WindForce OC, do your fans make an unusual sound when they start to spin up out of 0rpm/0db mode?
It should just go from silence to the sound of fans spinning; mine hit turbulence or something and grind/buzz/whatever word you want to use for a weird noise that shouldn't be there.
It's not a smooth transition is what I'm trying to say. I just want to know if there's even a chance I get a replacement free of this issue...
Maybe just PM me your answer so I'm not hijacking this topic or anything, I'd love you for it tongue.gif
Thanks! thumb.gif

Update: Here's my video of the issue

Edited by Hunched - 7/6/16 at 10:41pm