
  • Special thanks →Naennon & WerePug

    {Threads that inspired me} (Joe Dirt & Zoson) (Click to show)
    JOEDIRT's NVFLASH thread: [Official] NVFlash with certificate checks bypassed for...
    and ZOSON's MOD thread: Extract and Flash GTX 970 and 980 Firmware -...

    First, I want to emphasize that modifying your card's BIOS, flashing a BIOS, or overclocking foolishly can not only void your warranty but can destroy your card! This thread is intended for experienced/technical people who wish to push their hardware to the maximum STABLE performance level. BE ADVISED, FLASH AT YOUR OWN RISK!

    ᕙ༼◕ل͜◕༽ᕗ - O/C GIGABYTE GTX9xx G1 - ᕙ༼◕ل͜◕༽ᕗ

    <START HERE> (Click to show)This "OP" (Original Post) is updated frequently in an attempt to provide YOU with a clear understanding of how to *safely extract additional performance from your GPU(s). *Safely is NEVER guaranteed, but neither is anything that carries risk in the hands of the UNINFORMED. There's always more to learn and information to share, so we're building a bit of a repository. Read on and, more importantly, jump in and ask questions! Don't be shy!
    Q.> Will this thread/guide help with ONLY Gigabyte cards?
    A.> A large percentage of the information, tweaks, strategies, and techniques will equally, or at least partially, apply to ALL Maxwell-based cards, so feel free to play in this sandbox if you'd like! We're happy to help anyone regardless of who manufactured your GPU. All testing is done with Gigabyte hardware, which will remain the primary focus.


    The GIGABYTE GTX 9xx cards are CUSTOM PCBs not reference.
    » 'Ultra durable PCB'
    » 8-Phase VRMs / solid caps
    » (2) 8-PIN PCI-E power (2+6 for 970)
    » Gauntlet sorting?!? Better bins?! (Jury's still out on this one)
    GPU Gauntlet Sorting (Click to show)GIGABYTE says → "Super Overclocked - SOC. GPUs that are cherry picked to handle higher frequencies. GPU Gauntlet sorting, to ensure the SOC video cards can handle the more extreme overclocks, and relative power switching needs."

    and more recently changed this to just say

    "With GPU Gauntlet™ Sorting, the Gigabyte SOC graphics card guarantees the higher overclocking capability in terms of excellent power switching." whatever that means..

    ...Ok ya whatever

    Gigabyte GTX 970 G1 Gaming
    My Backside (Click to show)

    Gigabyte GTX 980 G1 Gaming
    My Backside (Click to show)

    Gigabyte GTX 980 Ti G1 Gaming
    My Backside (Click to show)
    Unboxing Pics (Click to show)
    EK Water Block unboxing (Click to show)

    My Backside (Click to show)

    Gigabyte GTX 980 XTREME
    My Backside (Click to show)

    Gigabyte GTX 970 XTREME
    My Backside (Click to show)

    ßÎΘ$ Iñƒσ
    ̿̿ ̿̿ ̿̿ ̿̿̿'̿'\̵͇̿̿\з=༼͜༽=ε/̵͇̿̿/'̿'̿ ̿ ̿̿ ̿̿ ̿̿

    ░▒▓WHAT IS MUMOD?? ▓▒░

    A corny acronym
    It stands for Maximum Unleashed MOD (modification):
    "Max" from both Maxwell & Maximum, "Unleashed" for unleashing the full power and capabilities of your GPU.

    Now it makes more sense right?

    Max Unleashed BIOS
    No of course I don't have a trademark lmao!

    Please continue below to learn more about MUMOD for your GPU!


    UNIFIED BIOS FOR 970/980: The MUMOD 970 and 980 BIOS' were designed with water cooling in mind. However, since GM204 is a relatively small die, Maxwell is more power efficient, and Gigabyte included an outstanding cooler, even those on AIR were able to use the modified BIOS. There is no AIR/H2O color code for the 970/980.

    SEPARATE AIR/H2O FOR 980Ti: When GM200 first launched it was clear that "Big Maxwell" was not as versatile as its younger brother. GM200 is a very LARGE die, so its thermal output will challenge even the best coolers made. Voltage is always directly related to thermal output, just on a larger scale here. MUMOD 980Ti BIOS development showed that we had to split into two tracks: a separate BIOS for 980Tis on AIR and on H2O.

    Looking at the actual BIOS files themselves, there are 2 naming conventions used.

    → Gigabyte GTX 970/980 = 980F3DP-MAX-UNLEASHED-REL1.1.rom
    → Gigabyte GTX 980Ti = H2OMUMODV1.0980TiG1F4DP.rom

    BLACK(980Ti only) = AIR or H2O BIOS
    RED = WHICH GPU THE BIOS IS FOR (there are now many variants: G1, Xtreme, WaterForce, etc.)
    GREEN = THE GIGABYTE BIOS VERSION (There is a specific BIOS for each PCB revision and memory change)
    BLUE = YOUR GPU HAS A DUAL BIOS! Do you know which one you are using? You NEED to.. BEFORE flashing..


    If you are planning to use a modified BIOS then you should NOT rely on the manufacturer's power supply recommendations any longer. With increased power draw comes increased power supply requirements.

    = 350W GPU / 600W Total PSU minimum STRONGLY recommended
    = 700W GPU / 1000W Total PSU minimum STRONGLY recommended
    = 375W GPU / 650W Total PSU minimum STRONGLY recommended
    = 750W GPU / 1000W Total PSU minimum STRONGLY recommended
    = 475W GPU / 700W Total PSU minimum STRONGLY recommended
    = 950W GPU / 1200W Total PSU minimum STRONGLY recommended

    You will NOT draw maximum power under MOST normal use conditions; however, you also never want to run your PSU at maximum capacity. A common recommendation is for your "full load" to use approximately 80% of your PSU's capacity for optimal efficiency. DO NOT use a modified BIOS if you cannot satisfy the minimum power requirements‼‼
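The 80% guideline above is easy to sketch numerically. A minimal example follows; the ~300W "rest of system" figure is an illustrative assumption (it is NOT from this guide), so size yours from your actual components:

```python
# A minimal sketch of the 80% PSU guideline above. The 300W
# "rest of system" figure is an illustrative assumption, NOT a measured
# value -- size yours from your actual components.
def recommended_psu_watts(gpu_draw_w, rest_of_system_w=300, load_target=0.80):
    """PSU capacity that keeps full load at roughly load_target of rating."""
    full_load_w = gpu_draw_w + rest_of_system_w
    return full_load_w / load_target

# A 350W GPU plus ~300W of system draw points at an ~812W unit,
# comfortably above the 600W floor recommended above:
print(recommended_psu_watts(350))  # 812.5
```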


    NOTE: An OVERCLOCKING / MONITORING application is mandatory (to fully unleash the MUMOD performance and benefits). MSI Afterburner v4.2.0 is highly recommended for use with MUMOD BIOS.

    Get the most out of the hardware you paid for!​

    What are the benefits/advantages when using the MUMOD BIOS?

    NO PERFCAPS (◣_◢) (requires voltage/power% sliders maxxed for this feature)
    ☼ Throttling VIRTUALLY ELIMINATED ( º¿⌐ ) (Power viruses excluded; requires voltage/power% sliders maxxed for this feature)
    ☼ Power % slider increased to 150%
    ☼ Voltage slider increased to +100mV
    Improved BIOS fan profile: more aggressive fan settings to compensate for the BIOS modifications (a custom fan curve is still recommended). The STOCK fan profile's incorrect percentage math is corrected
    ☼ Default GPU CORE speed increased / Default GPU BOOST speed increased (Benefits to BOTH 2D and 3D rendering/processing)
    Power capacity and TDP increased (the increase is evenly distributed across the PCI-E power sources and stays within maximum rated wire specifications)
    Voltage increased over STOCK BIOS values
    Voltage slider range increased to +100mV/+106mV (depending if using MSI AB or PX respectively. Same max voltage is applied with both. Use +100 mV for overclocking)
    Power % slider range increased to +150% (Must be set to maximum 150% when overclocking)
    Fully functional Voltage and Power% sliders with finer steps than the STOCK BIOS (more set points)
    Power Management supported (Both Adaptive and Prefer Max Performance power modes)
    BOOST clocks are now accurately reported in GPU-z (You will know exactly what BOOST speed to expect when overclocking)
    BOOST clocks are now very consistent. What you set your overclock to, and what you see in GPU-z, is what you will get.
    Handles simultaneous high GPU/high MEMORY overclocking (STOCK BIOS will sometimes (not everyone) not have sufficient power to allow this)
    Corrects low voltage bug when using adaptive and high overclocking (STOCK BIOS can fail to provide enough voltage @ low power modes + high base overclock)
    Improved voltage table to provide a smoother voltage transition (Applies to both adaptive power mode as well as maximum performance mode.)
    Considerably improved boost response times (Transitioning from CORE to BOOST speeds. Measured from GPU CORE clock to GPU BOOST clock in GPU-z)
    Thoroughly tested & proven for months through BETA, RC and FINAL stages

    ¯\(°_o)/¯ ¯\(°_o)/¯

    Detailed breakdown of the features and improvements over a STOCK BIOS

    »» PERFCAPS have been eliminated when using ANY of the posted MUMOD BIOS'

    This means that you will NO LONGER see them logged in GPU-z and your GPU will NO LONGER be deprived of additional power/voltage when a condition arises where it is needed. This will increase your thermal output (to be expected), so keeping an eye on temps is your #1 (and really only) concern. Thank Gigabyte for providing superb AIR coolers, and if you've made the move to a water block, your results will be even more impressive.

    ABSOLUTE MAXIMUM GPU TEMP IS RATED AT 91C FOR MOST MAXWELL GPUs, HOWEVER I WOULD NOT RECOMMEND ALLOWING YOUR TEMPS TO EXCEED 85C. You may need to open your case, clean up wires, blow out dust, remove fan obstructions, install additional cooling, or add supplemental GPU fans directly around your GPU. Race cars don't use stock cooling; there's a reason for that, so don't be afraid to UPGRADE YOURS! Many have found ways to keep their 980Ti in the 70s on AIR.

    I still want more PERFCAP detail!! (Click to show)
    Some background on PERFCAPS: Perfcaps, also known as PERFORMANCE CAPS indicate when a particular GPU's performance is being restricted in one way or another. This is much like a "Governor" or "Limiter". There are several reasons you can be limited by perfcaps and all of them can/will be present with a STOCK BIOS depending on the specific workload and parameters used to test. All STOCK BIOS' restrict power and voltages and therefore will be prone to perfcaps.

    The best way to monitor your GPU for perfcaps is to download GPU-z. I'm not going to link it; there are already links and it is easy to find. It's pretty much THE BEST GPU MONITORING app you can get! If you don't have it, download it immediately and learn how to use it (it is very easy).

    UTIL perfcaps occur when the GPU is idle. There is no workload (or not enough to even make it blink), so the GPU reports UTIL perfcaps. THESE are NOT a bad thing. They are perfectly NORMAL; it's simply the GPU being basically IDLE. Once you put the GPU under load, all STOCK BIOS' will instantly show one of many perfcaps (or multiple at the same time). On the MUMOD BIOS' there are none.

    GPU-z LOG FILE perfcap numeric definitions

    * Power. Indicating perf is limited by total power limit.

    * Thermal. Indicating perf is limited by temperature limit.

    * Reliability. Indicating perf is limited by reliability voltage.

    * Operating. Indicating perf is limited by max operating voltage.

    * Utilization. Indicating perf is limited by GPU utilization.
    NV_GPU_PERF_POLICY_ID_SW_UTILIZATION = 16 (This is the only "GOOD" perfcap. It means the GPU is limited by utilization; it's not being asked to do anything.)

    * SLI GPUBoost Synchronization.

    There are visual indicators of perfcaps in GPU-z as well as within its built-in logging feature. When viewing the LOG FILES, perfcaps are represented by numerical values. These values can be ADDED TOGETHER if more than one perfcap reason is occurring at the same time. This is also why the log file is the most accurate way to check for perfcaps and then read why they were caused.
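Since the logged values are additive, you can decode them as a bitmask. A small sketch follows; note that only Utilization = 16 is confirmed above, while the other bit values are assumptions based on how NVAPI's perf-policy mask is commonly documented:

```python
# Decode a GPU-z log "PerfCap Reason" value into its component flags.
# Only Utilization = 16 is confirmed in the text above; the other bit
# values are assumptions based on how NVAPI's perf-policy mask is
# commonly documented.
PERFCAP_BITS = {
    1: "Power",
    2: "Thermal",
    4: "Reliability voltage",
    8: "Max operating voltage",
    16: "Utilization",            # the only "good" one: the GPU is idle
    32: "SLI GPUBoost sync",
}

def decode_perfcap(value):
    """Reasons can be ADDED TOGETHER in the log, so treat it as a bitmask."""
    return [name for bit, name in PERFCAP_BITS.items() if value & bit]

# A logged value of 5 would mean Power (1) plus Reliability voltage (4):
print(decode_perfcap(5))  # ['Power', 'Reliability voltage']
```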
    MUMOD BIOS in action, NO PERFCAPS! (Click to show)

    TO THROTTLE, OR NOT TO THROTTLE.. that is YOUR question (Click to show)
    »» THROTTLING

    ► A STOCK BIOS has basically 3 conditions under which throttling can occur. When this happens, the clock speed begins to be reduced (throttled). The conditions and behavior vary from GPU to GPU (with a STOCK BIOS some lucky people don't throttle much, others a lot), but even if the specific behavior isn't identical, it will always be "similar".

    These 3 throttling conditions are:

    (1) Thermal throttling - Some STOCK BIOS' will begin to throttle performance as early as 65C

    (2) Power throttling - When conditions demand high power and the GPU is unable to provide the power required (I say GPU because I am assuming the power supply is NOT the reason, although it could be)

    (3) Voltage throttling - When the GPU is not being provided with a RELIABLE voltage signal, or not enough voltage (usually when voltage is being restricted in the BIOS, and/or PSU issues)

    There are also conditions where a power drop can cause a voltage throttle, so the two aren't always independent.

    It is also possible to have multiple throttle conditions occurring at the same time.

    »» *THROTTLING has been virtually eliminated when using ANY of the posted MUMOD BIOS'

    * I say "virtually" eliminated because there are some rare conditions where throttling CAN still occur. These tweaks are not free; the cost is heat, so it's important to monitor your temps. Throttling has been "tweaked" but not completely disabled. We've changed the throttling behavior to put us back in charge. There are still safety mechanisms built into the GPUs that still function; we've just manipulated when/how they can trigger.

    There are applications known as "power viruses" that are designed to just draw more and more power and eventually exceed just about any BIOS power limit. These workloads are TORTURE for your GPUs, they are COMPLETELY UNREALISTIC, and they are NOT EVER RECOMMENDED! They are terrible at verifying stability, they are only good for generating heat and an insane amount of power draw, and you can break PSUs and/or overheat GPUs/VRMs with these Furmark/Kombustor apps (You've been warned!)

    How to read the fan settings:

    = 1st fan profile. This is currently configured for FANLESS mode. Once the GPU reaches 55C the fans will turn on @ 1,470RPM.

    RED = 2nd fan profile. This is currently configured for 77% fan speed @ 3,234RPM. This means from 55C to 75C, the rpm will gradually increase from 1,470RPM to 3,234RPM.

    BLUE = 3rd fan profile. This is currently configured for 100% fan speed @ 4,200RPM. This means from 75C to 85C, the rpm will gradually increase from 3,234RPM to 4,200RPM.

    ORANGE = Overall MIN/MAX rpm values and associated percentage for those overall values.

    The STOCK fan profiles sometimes have reduced overall RPMs, minimum fan limits higher than the fans need, poor performance, and mathematically incorrect fan-speed percentages. These have all been corrected in the MUMOD BIOS' (with the biggest improvements in the 980Ti AIR versions).
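The three fan profiles above can be sketched as a simple ramp. The RPM and temperature set points are from the text; the linear interpolation between them is an assumption about how the firmware ramps:

```python
# Linear-ramp model of the three fan profiles described above:
# fanless below 55C, then two interpolated segments. The linear
# interpolation between the listed set points is an assumption about
# how the firmware ramps; the RPM/temperature figures are from the text.
MAX_RPM = 4200  # 100% fan speed

def fan_rpm(temp_c):
    if temp_c < 55:
        return 0  # fanless mode until 55C
    if temp_c <= 75:
        # 1,470 RPM -> 3,234 RPM across 55-75C
        return round(1470 + (3234 - 1470) * (temp_c - 55) / 20)
    if temp_c <= 85:
        # 3,234 RPM -> 4,200 RPM across 75-85C
        return round(3234 + (4200 - 3234) * (temp_c - 75) / 10)
    return MAX_RPM

# 3,234 RPM is exactly 77% of the 4,200 RPM maximum and 1,470 is 35%:
print(fan_rpm(55), fan_rpm(75), fan_rpm(85))  # 1470 3234 4200
```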

    VOLTAGE! POWER! (Click to show)
    »» VOLTAGES and POWER%

    ► VOLTAGE SLIDER AND POWER % SLIDERS MUST BE MAXXED FOR OVERCLOCKING MODE (This is mandatory otherwise you will starve the GPU of Voltage and/or Power)

    Although overclocking is best done with both sliders maxxed out, some people like to "fine tune". You CAN still fine tune with the MUMOD BIOS', but you must understand that:
    (1) When either the VOLTAGE or the POWER % is NOT maxxed out, you WILL get perfcaps. This is simply because you would be restricting the voltage and/or the power.
    (2) You WILL likely impact your maximum overclock (expect it to be reduced)
    Detailed Voltage configurations available (Click to show)The flexibility mentioned above may not have been clear: it means you can still adjust your GPU to your needs, whether that's lowering voltage to reduce heat output, limiting power due to a weak power supply in SLI mode, saving money on your electric bill during AFK use, or whatever other reason you might have.

    The VOLTAGE limits are impacted by your DEFAULT/MAX POWER allowed.
    This is why I am showing 2 scales, one with Power % slider @ default of 100% and the other @ 150%.

    This is a GTX980 G1 MUMOD example:
    Power limit at default of 100%
    +00mV = 1.212v
    +20mV = 1.212v
    +40mV = 1.212v
    +60mV = 1.212v
    +80mV = 1.212v
    +100mV = 1.212v
    With the power % being limited to 100%, this means the total power is limited to the DEFAULT power value, which is 256W. This is actually LESS than the STOCK OEM value, which is how we are able to limit the voltage to 1.212v. This is going to be a low power mode. Combined with adaptive power management, your heat output and power use will both be rock bottom.

    Power limit at MAX of 150%
    +00mV = 1.237v
    +20mV = 1.262v
    +40mV = 1.275v
    +60mV = 1.281v (software reads 1.275)
    +80mV = 1.281v (software reads 1.275)
    +100mV = 1.281v (software reads 1.275)
    As you can see, by allowing more power, +0mV now yields a higher voltage of 1.237v. This is also the only way you are going to be able to obtain max voltage. Max voltage starts to appear around +60mV, but this may fluctuate, so it is best to just max out at +100mV when max voltage is desired.
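The two scales above can be captured in a quick lookup. This only mirrors the author's measured GTX 980 G1 values to show how the power limit gates the voltage offset; real readings will vary per card:

```python
# The author's measured GTX 980 G1 MUMOD voltage scales from above as a
# simple lookup. Real readings vary per card; this only mirrors the two
# tables to show how the power limit gates the voltage offset.
VOLTS_AT_100_PCT = {0: 1.212, 20: 1.212, 40: 1.212, 60: 1.212, 80: 1.212, 100: 1.212}
VOLTS_AT_150_PCT = {0: 1.237, 20: 1.262, 40: 1.275, 60: 1.281, 80: 1.281, 100: 1.281}

def effective_voltage(offset_mv, power_pct):
    """At a 100% power limit the voltage stays pinned at 1.212v."""
    table = VOLTS_AT_150_PCT if power_pct >= 150 else VOLTS_AT_100_PCT
    return table[offset_mv]

# Max voltage only appears with the power slider maxxed:
print(effective_voltage(100, 100), effective_voltage(100, 150))  # 1.212 1.281
```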

    ►The voltages and the voltage tables have been completely customized/modified

    CLK BIOS segments and their purposes in relation to the voltage table & power management (Click to show)
    CLK 0 through CLK 25
    This range, having the lowest voltages of the table, is of course for when the GPU is idle or under minimal load. The default when you install NVIDIA's driver is to configure your GPU for "adaptive" power management. If you set "prefer maximum performance" power management, these tables will never even be used. The reason I set all the minimums to 825mV in this range is that there were some unique instances where really low voltages were causing issues (known issue). This also reduces the voltage range to the core GPU speed during a spike (adaptive power management will spike to the DEFAULT GPU CORE MHz first, and then your GPU will boost above the DEFAULT CORE speed as normal). I increased the max voltage for these clocks very little.

    CLK 26 through CLK 54
    This is where the voltage curve really begins. As above, if "prefer maximum performance" is selected, these tables won't be used either. This is basically the section of the table specific to adaptive power management. Voltage is slightly increased to help stability when boosting/returning to idle.

    CLK 55
    This is the idle CLK when "prefer maximum performance" is selected. Voltage is 1.075v, clock speed is 1291Mhz. Voltage doesn't play a big role here yet because your GPU is simply not being pushed enough (as soon as it does, it will boost higher).

    CLK 56 through CLK 74
    When placed under load, this is where BOOST 2.0 occurs. All these CLKs are BOOST clocks. The GPU will boost to the highest CLK it can, based on a few different factors. If heat is low and good solid/stable power is available, your GPU should remain at CLK 74, the maximum boost speed. In the event you are starved for power, or heat builds, or there's even too much voltage, it will work backwards, dropping back to whatever CLK it is now "satisfied" with. Modifying your BIOS can help manage this behavior.
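The bin-walking behavior just described can be sketched as a toy model. The 2W-per-bin cost and the hard 85C fallback below are illustrative assumptions, not NVIDIA's real arbitration logic; only the CLK 56-74 bin numbering comes from the text:

```python
# A toy model of the BOOST 2.0 behavior described above: the GPU aims
# for the top boost bin (CLK 74) and falls back when heat or power bite.
# The 2W-per-bin cost and the hard 85C fallback are illustrative
# assumptions, not NVIDIA's real arbitration logic.
def settle_boost_bin(temp_c, power_headroom_w, watts_per_bin=2):
    """Highest boost CLK (56..74) the card can hold with this headroom."""
    if temp_c >= 85:
        return 56  # thermal limit: drop out of boost entirely
    affordable_bins = power_headroom_w // watts_per_bin
    return 56 + min(18, affordable_bins)  # CLK 74 is the ceiling

# Cool card with plenty of power holds the maximum boost bin:
print(settle_boost_bin(70, 100))  # 74
# Starved for power, it settles on a lower CLK it is "satisfied" with:
print(settle_boost_bin(70, 10))   # 61
```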


    ► A screenshot is self-explanatory here: you simply set your overclock in MSI AB and then look in GPU-z to see what it will be. As long as BOTH the voltage and power% sliders are maxxed out, this is the BOOST speed you WILL get. It's very quick and accurate (STOCK BIOS' are NOT accurate).


    (Under Construction)​

    HOW TO FLASH A MODIFIED BIOS! (Click to show)

    with the MUMOD BIOS


    ( ͡° ͜ʖ ͡°) ( ͡° ͜ʖ ͡°)

    EVEN IF you have flashed a GPU BIOS in the past, if you are not SPECIFICALLY familiar with flashing a GIGABYTE BIOS, you must pay close attention as it's more complicated than most GPUs (if you have a G1).

    ALL Gigabyte GTX "G1" cards have a DUAL BIOS with NO physical switch to select between them. There will be more detail on this later in the guide, but the BIOS' are 'selected' and 'activated' through an auto-sensing process based on which display outputs are connected WHEN THE PC IS POWERED ON. The two BIOS' are named with two letters: DD and DP. The naming is strange, and the only difference between them is which ports are active (ALL ports are never active at the same time; some are always disabled).
    Which display outputs each BIOS can control: (Click to show)
    This image does NOT tell you if you are using DD or DP! You MUST extract your BIOS and read it with MBT! Instructions below.

    ☞ We don't know what DD or DP stands for. We do know that DP does NOT stand for "Display port" in reference to a Gigabyte BIOS (Just in case you thought it did or might). Just pretend they are named BIOS1 and BIOS2 to help avoid this potential confusion.

    ☞ You should be using NVFLASH to flash a custom BIOS. Do NOT attempt to use Gigabyte's own flashing tool. HERE is where you can download NVFLASH. They both essentially do the same thing, but you need the modified NVFLASH to flash a custom BIOS.

    Attachment: nvflsh645.218.zip (951k .zip file)

    ☞ You do NOT need to disable SLI to flash. It's perfectly fine to leave it enabled.

    ☞ You CAN flash more than one GPU in your system back to back without rebooting in between.

    ☞ You MUST (to be safer): (1) Remove the setting to auto-load your GPU O/C app on bootup (2) Delete any existing overclocking profiles (3) "Reset" your overclocks for both memory and GPU back to stock (or whatever BIOS you have now) (4) CLOSE any GPU monitoring/overclocking apps and GeForce Experience BEFORE YOU FLASH

    ☞ Flashing a GPU cannot be done while the GPU's driver is in use, and there are several ways to handle this. You do NOT need to use Windows "Safe Mode" (that's not necessary). Some people say you need to uninstall your GPU driver; NO, you do NOT (that method works, it's just far more work than necessary). All that is needed is to DISABLE your GPU in DEVICE MANAGER (newer versions of NVFLASH automate this). Once the GPU is disabled, the driver is automatically unloaded and you're safe to flash.
    Make sure you know which version of NVFLASH you need/want! (Click to show)
    To me it is a no-brainer to use one of the newer versions (I recommend NVFLASH v5.218 by JoeDirt)

    Pay attention to this!!
    Older versions of NVFLASH (like version 5.206) are not compatible with a 980Ti.
    Older versions of NVFLASH (like version 5.206) required you to MANUALLY disable your GPU in device manager.

    Newer versions of NVFLASH (like version 5.218) are compatible with ALL Maxwell GPUs.
    Newer versions of NVFLASH (like version 5.218) AUTOMATICALLY disable your GPU for you in device manager (you NO LONGER have to MANUALLY perform this step)


    This version is compatible with all Maxwell GPUs and automatically disables the GPU for you (this is a really nice feature they added!)

    ( ͡° ͜ʖ ͡°) ( ͡° ͜ʖ ͡°)


    ► Install GPU-z first

    ► After it is installed open it up and click the button to SAVE BIOS. This is how you are going to make a BACKUP of your original BIOS

    ► Now that you have a backup copy of your BIOS you can now open up Maxwell BIOS Tweaker v1.36 that you downloaded earlier. Just extract MBT to the same folder that you saved your BIOS in. Open MBT and select your BIOS file.

    GREEN = DD or DP BIOS (Will either show D_D or D_P)


    Now you know whether you are using the DD or DP BIOS! You only need to flash BOTH the DD and DP BIOS with a custom MOD if you change monitor configurations often and would be USING BOTH at times. It's perfectly fine to flash JUST the one you will be using and leave the other stock.

    NOTE: Although this is NOT LIKELY and probably extremely RARE, when flashing just make sure you DO NOT CHANGE YOUR MONITOR CONFIGURATION in between reading your BIOS and determining if it is DD or DP and flashing as it could accidentally activate "the other" BIOS. Flashing a DD BIOS to a DP slot or flashing a DP BIOS to a DD slot will partially brick the GPU. It can be fixed but you just want to avoid this by being SURE which BIOS you are using before flashing and NOT making any monitor changes until the flashing is completed.

    ‼☺ Check point ☺‼

    »» You should have a pretty good comfort level at this point in knowing which BIOS you currently have

    »» You should know if you are using the DD or DP BIOS and understand the differences between them

    »» You have made a backup copy of your original STOCK rom

    It's now time to figure out which MUMOD BIOS is right for YOUR GPU

    Short version = just match your current version up with the MUMOD BIOS posted in the OP. If you have an F1, F2, F3 or F4 BIOS then you can flash the MUMOD F4 BIOS. If you have F10 then you can flash the F10 MUMOD BIOS.

    If YOUR BIOS is NOT LISTED in the OP then you just need to open 2 COPIES of Maxwell BIOS Tweaker v1.36 side by side. One with the POSTED MUMOD BIOS and the other with YOUR EXTRACTED COPY of YOUR BIOS. Then simply copy ALL values from ALL tabs, 100% identically. Triple check to be sure!!

    Use TAB and ARROW KEYS to navigate. It doesn't take that long

    More details about Gigabyte BIOS versions! (Click to show)Gigabyte has released a very large list of Maxwell GPU SKUs, including the exact same product with slight changes. This means there are MULTIPLE BIOS' for the same product. The differences between these cards are minute; in MOST CASES it is simply a change of MEMORY BRAND for the video RAM (HYNIX to SAMSUNG, for example).

    Gigabyte has a unique way of keeping track of all the variances within the same product line. This is where the Gigabyte version numbers come into play. F3, F4, F10, F51, F80?!?!? What do all these mean?

    It's actually quite simple. The very first generation of a particular GPU product (NOT necessarily released to the public) is always BIOS version F0. The letter "F" never changes. When the numeric value is a SINGLE DIGIT, it always represents the first PCB revision.

    Gigabyte would begin with BIOS version F0. If a new BIOS was needed it would then be F1. If another BIOS was needed it would be F2, F3, F4 and so on. If a BIOS is not posted on Gigabyte's support site that means they have NOT UPDATED the BIOS since it was released to the public. They only post updates NOT original versions.

    If you take that same model GPU and see it has an F10 BIOS, there are now two numeric digits, starting with a 1. This means it is the 2nd-generation PCB of that same model GPU. Usually they JUST change the memory used, so the hardware differences are minute, yet the internal memory timings, voltages, etc. (and/or whatever else changed) were specific enough to require an entirely new BIOS, meaning the "F0" series wasn't technically compatible.

    F10 would be the first version, then F11, F12, and so on. Same with F40.. F41.. F42..

    So now that you have this background and you've previously identified which BIOS version you currently have you now just need to match up your version with the MUMOD version posted

    Not to confuse you even more but.... (Click to show)You SHOULD NOT flash an F4 version to an F10 GPU - You SHOULD NOT flash an F10 version to an F4 GPU
    You CAN however flash an F4 version to an F3 GPU - You CAN however flash an F11 version to an F10 GPU
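The matching rule above boils down to "same PCB generation only". A minimal sketch of that rule follows; the parsing is my own assumption from the convention described, so when in doubt still ask in the thread:

```python
# Sketch of the version-matching rule above: BIOS' are interchangeable
# only within the same PCB generation (F0-F9, F10-F19, F40-F49, ...).
# The parsing below is a minimal assumption from the convention described.
def pcb_generation(version):
    n = int(version.lstrip("Ff"))
    return 0 if n < 10 else n // 10  # single digit = first-generation PCB

def can_flash(your_bios, mod_bios):
    """True only when both versions belong to the same PCB generation."""
    return pcb_generation(your_bios) == pcb_generation(mod_bios)

print(can_flash("F3", "F4"))   # True  -- both first-gen PCB
print(can_flash("F10", "F4"))  # False -- different PCB generations
```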

    Hopefully by this point you will understand the versions well enough to know which one is right for YOUR GPU. If in doubt, just ask in the thread; we're happy to help.

    It's now time to FLASH your GPU

    Unfortunately, after all that information, you might be disappointed to find out that flashing is super easy and fast. Having said that, what matters more is knowing what you are doing and being CONFIDENT that you know EXACTLY what to do and what NOT to do. If you don't, STOP NOW and ask for help please!

    If you are flashing your GPU for the first time you may need to REMOVE THE WRITE PROTECTION from the GPU to allow the BIOS to be written to.

    NVFLASH --protectoff

    The following commands are ONLY for those with SLI. All others ignore this section.
    SLI (Click to show)Since you have SLI you are going to want to flash both of your GPUs (assuming they are identical), and for this process you might need to know the NVFLASH commands that pertain to SLI.

    NVFLASH --list
    This will show a list of the GPUs in your system. GPU0 is the first one, GPU1 is the second, and so on up to GPU3 for a quad-SLI system.

    NVFLASH -i0
    This will specify GPU0

    NVFLASH -i1
    This will specify GPU1, etc.

    For SLI systems simply flash the first GPU and then change the command line and flash the next one. You don't need to disable SLI and you don't need to reboot in-between each flash but you DO need to reboot after you are finished flashing.

    (STEP 1)
    Make a folder and put the MUMOD BIOS and NVFLASH.EXE and NVFLSH64.SYS all in the same folder. I will assume your folder is called C:\ROM

    (STEP 2)
    Open a Command Prompt as Administrator (NVFLASH needs elevated rights to access the adapter).

    (STEP 3)
    Change into your folder and run the flash command, including the SLI index (if needed) and the exact name of the MUMOD BIOS you are flashing. I usually rename the MUMOD BIOS file to "flash.rom" in Windows ahead of time just because it's easier to type when flashing:

    NVFLASH flash.rom

    It will come up and ask you if you want to flash, press Y if there are no errors or warnings and REBOOT when it finishes.

    Now after you reboot back into windows you can restore the write protection on the BIOS.

    NVFLASH --protecton

    That's it, you've flashed the MUMOD BIOS and you're ready to see what your GPU can do

    You'll need an overclocking app. I recommend MSI Afterburner 4.2.0

    Remember to keep an eye on temps. Do not exceed 85C!
    Use a custom fan curve
    Ensure your case is well ventilated
    Install/upgrade supplemental fans
    Keep case open if needed


    Enjoy the free performance just keep an eye on temps!

    SUPPLEMENTAL GPU COOLING FOR CHEAP #3601 (Click to show)
    HOW TO RECOVER FROM A BAD FLASH (Click to show)[Official] NVFlash with certificate checks bypassed for...
    HOW TO READ BIOS FAN SETTINGS #2810 (Click to show)
    HOW TO INSTALL AN EK WATER BLOCK (980Ti G1) #2665 (Click to show)
    PCI-E POWER INFO POST #111 (Click to show)
    VOLTAGE SLIDER #91 (Click to show)
    POWER % & POWER TDP % #188 (Click to show)

    Helpful TIP: use a combination of ADAPTIVE and MAX PERFORMANCE modes (if the drivers aren't buggy)
    VOLTAGE TABLE SECTIONS #382 (Click to show)
    UNDERSTANDING HOW TO USE THIS BIOS #444 (Click to show)
    HOW I TEST (Click to show)I thought I would share how I do my testing, what utilities I use and what areas/values I am looking at.

    This explanation is going to be very short and not very detailed because there is too much to cover. I just wanted to focus on a couple of areas that may help others fine-tune things, understand how and where to find the information, and what to look for. I will be covering just the basics. If you have any other tips to share, please list yours too!


    ░▒▓│SOFTWARE LIST │▓▒░

    Overclocking can be done safely and can produce significant performance improvements when done properly. Everyone knows that 3D performance actually has very LITTLE to do with the CPU these days. (Click to show)Battlefield Hardline Benchmarked: Graphics & CPU Performance

    Overclocking your video card is by far the largest performance increase that you can make to your system for GAMING, and it also benefits CUDA apps such as Folding@home.

    Here are the basics of what you will need (I'm not going to post links; you can find them easily):

    Θ A GPU overclocking utility (I use MSI Afterburner, they all work basically the same so pick your poison)
    Θ MSI Kombuster (this integrates into MSI Afterburner which is convenient. Furmark is NOT effective, more on this later)
    Θ GPU-Z, basically an essential tool for monitoring your GPU.
    Θ NVIDIA Inspector, does a lot of the same things as GPU-Z but they each have unique features. Has overclocking options also.
    Θ FFXIV Free benchmark @4k MAXIMUM
    Θ 3DMark for the Firestrike test (the latest release, which they couldn't manage to identify by any name other than just "3DMark")
    Θ Unigine Heaven or Valley benchmarks are popular for stability testing. EDIT: You CAN test with 4K by using a custom resolution
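Side note: all of the tools above read the same sensors. If you want a lightweight way to log temps/clocks/power from the command line while a benchmark runs, nvidia-smi (it ships with the NVIDIA driver) can do it. A minimal sketch, assuming a reasonably recent driver; the field names are from its documented `--query-gpu` list, and availability can vary by driver version:

```python
# Minimal sensor-logging sketch using nvidia-smi's CSV output.
# Field names (temperature.gpu, clocks.gr, power.draw) are from the
# documented --query-gpu list; check `nvidia-smi --help-query-gpu`.
import subprocess

QUERY = "temperature.gpu,clocks.gr,power.draw"

def parse_csv_line(line):
    """Parse one 'value, value, value' row; tolerate unit suffixes like 'MHz'."""
    parts = [p.strip().split()[0] for p in line.split(",")]
    temp_c, core_mhz, power_w = (float(p) for p in parts)
    return {"temp_c": temp_c, "core_mhz": core_mhz, "power_w": power_w}

def sample_gpu():
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        text=True)
    return parse_csv_line(out.strip())
```

Call sample_gpu() in a loop while your benchmark runs and dump the dicts wherever you like.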


    ░▒▓│MSI Afterburner │▓▒░

    SEE ME (Click to show)
    I use MSI AB; you can use whatever program you like best. I wanted to point out that the overclocking and voltage settings are NOT enabled by default, so if you are trying out MSI AB, be sure to go into the settings right away and enable them as I have shown in the screenshot above. I also show that there is a button beside the GPU power % limit (with this skin) that reveals additional settings. If you are running on AIR, make sure you prioritize the temperature limit and increase it to avoid potential unwanted throttling.

    I also recommend learning how to configure Afterburner's monitoring features, but FIRST go in and shut all of the default monitors off. Monitoring too many things at once can cause crashes, and we don't want any of that while overclocking. Only enable the things you need, like GPU temp, voltage, boost clocks and framerate.

    Make small changes, one at a time, and use a method to track the settings and the results. Trying to keep it all in your head will just lead to repeated tests and/or mistakes.
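Here's a minimal sketch of what "a method to track the settings" could look like on disk instead of in your head. The file name and field names are just my own picks, not from any tool:

```python
# Simple run log: one CSV row per test so nothing relies on memory.
# File name and field names are my own invention, not from any tool.
import csv, datetime, os

FIELDS = ["when", "core_offset_mhz", "mem_offset_mhz",
          "voltage_mv", "test", "result", "max_temp_c"]

def log_run(path, **row):
    """Append one test run to a CSV log, writing a header if the file is new."""
    write_header = not os.path.exists(path)
    row["when"] = datetime.datetime.now().isoformat(timespec="seconds")
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if write_header:
            writer.writeheader()
        writer.writerow(row)

# Example usage:
# log_run("oc_log.csv", core_offset_mhz=100, mem_offset_mhz=250,
#         voltage_mv=1275, test="Firestrike 4K",
#         result="pass, minor artifacts", max_temp_c=64)
```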

    Be careful about using the "Start with windows" option if you are playing with different base clock values in your BIOS tweaking. "+100MHz" could be a very different clock value depending on the BIOS settings!
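A trivial worked example of why the same "+100" offset lands on different absolute clocks (the clock numbers below are illustrative only, not taken from any particular BIOS):

```python
# Rough model: the software offset stacks on top of whatever boost
# clock the BIOS ships with, so "+100" is only meaningful relative
# to that BIOS. Numbers are illustrative, not from a real BIOS.
def effective_clock(bios_boost_mhz, offset_mhz):
    return bios_boost_mhz + offset_mhz

stock_bios = effective_clock(1329, 100)   # +100 on a stock-ish boost table
modded_bios = effective_clock(1455, 100)  # +100 on a BIOS already raised
delta = modded_bios - stock_bios          # same slider position, 126 MHz apart
```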

    ░▒▓│MSI Kombuster │▓▒░
    SEE ME (Click to show)
    I don't use Kombuster as a performance OR stability test; I use it because it can trigger boost clocks in a windowed mode. When I am testing a new BIOS I want to see what the default boost speed is going to be, and the fastest way to do this is to just fire up Kombuster. Since it is not full screen, I can open it, make adjustments with Afterburner, and monitor the changes in Kombuster. This is a good way to validate BIOS changes quickly and to catch clues that something is very wrong (such as really LOW boost clocks, or really HIGH ones above what you would expect) without firing up an app like Firestrike. I can also easily monitor temps and TDP values. Kombuster is compared to Furmark a lot, but Furmark doesn't trigger boost clocks properly.

    ░▒▓│GPU-Z │▓▒░
    SEE ME (Click to show)
    GPU-Z tells you just about everything you want to know, but the #1 reason I use it is to monitor for perfcaps and to create log files.
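If you do turn on GPU-Z logging, a small script can pull the highlights out of a long log. A hedged sketch: the column headers below are assumptions on my part, so check the first line of your own log, since they differ between GPU-Z versions:

```python
# Hedged sketch: scan a GPU-Z sensor log (CSV text) for the hottest
# sample and any PerfCap reasons. Column headers are ASSUMPTIONS;
# verify them against the header row of your own log file.
import csv, io

def summarize(log_text, temp_col="GPU Temperature [°C]",
              perfcap_col="PerfCap Reason []"):
    rows = list(csv.DictReader(io.StringIO(log_text),
                               skipinitialspace=True))
    max_temp = max(float(r[temp_col]) for r in rows)
    caps = {r[perfcap_col].strip() for r in rows
            if r[perfcap_col].strip() not in ("", "Idle", "-")}
    return max_temp, caps
```

Anything left in the caps set (power, voltage, thermal limits) is a reason your boost clock was being held back during the run.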

    ░▒▓│NVIDIA Inspector │▓▒░
    SEE ME (Click to show)
    Very similar to GPU-Z and allows overclocking and further tweaking. As Oni pointed out earlier in the thread, you can view the power state.

    ░▒▓│3DMark Firestrike/UNIGINE │▓▒░
    SEE ME (Click to show)
    This is a very demanding test on your GPU and it doesn't take too long to run. I feel that to FULLY validate your overclock, you need to test @ 4K resolution. Firestrike ULTRA will let you do this even if you don't have a 4K monitor (it downscales), but unfortunately you need a paid version to get that test. I start by testing the overclock and/or new BIOS with Firestrike @ 4K once I've verified my voltages, TDP, boost clocks and temps are OK in Kombuster first. Unigine is limited to 1440P by default, but you can create a custom resolution and manually specify 4K.

    "Artifacts" are irregularities that come in the form of screen flashes, texture flickering, or red and green lines. You MUST sit there and STARE at the entire test and be very watchful for them. Merely passing these tests does not indicate stability.

    Once you've reached a comfort level with your overclock and you can pass this test without artifacts, fire up your favorite game and test for GAME STABILITY. Just because Firestrike runs fine doesn't validate your overclock just yet; you need to push it over time, tested and true. Don't forget that changes in ambient temps affect your overclock! SUMMERS ARE THE ULTIMATE TEST.
    CASE AIRFLOW AND DESIGN #1525 (Click to show)GIGABYTE GTX 9xx H2O/AIR BIOS Tweaking ?(ô?ô)?

    The biggest tip I can share is that the last thing you want to do is blindly overclock, walk away and not monitor temps and stability. Please be aware of what you need to look for and what programs you need to use and enjoy the "free performance" more safely!

    → THE BIOS ←


    (◣_◢) NO PERFCAPS!

    Warning!! READ THIS!

    There is a D_P and a D_D BIOS for G1 cards. If you are looking to flash them directly, you MUST make sure you
    (1) Have already confirmed that your card is compatible with the posted BIOS.
    (2) Know if you are currently using the D_D or D_P BIOS and flash the correct one (Gigabyte cards have a dual BIOS).

    For ALL OTHER REVISIONS, extract YOUR BIOS (using GPU-Z) and EDIT YOUR BIOS using this posted BIOS as a REFERENCE. Literally open up both BIOSes side by side with Maxwell BIOS Tweaker v1.36 and COPY EVERY SINGLE SETTING OVER. Double-check them all 3 times! I have as many versions posted for you as I can, but I cannot maintain every single one.
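Alongside the setting-by-setting copy in Maxwell BIOS Tweaker, a raw byte diff can be a quick sanity check that your edited dump actually differs from stock (and that you grabbed two ROMs of the same size). A sketch, with nothing BIOS-aware about it:

```python
# Sanity-check sketch: byte-diff two BIOS dumps. This will NOT tell
# you what a setting means (that's what Maxwell BIOS Tweaker is for),
# but it confirms whether two ROMs differ at all, and roughly where.
def diff_roms(rom_a, rom_b):
    """Return (offset, byte_a, byte_b) for every differing byte."""
    if len(rom_a) != len(rom_b):
        raise ValueError("ROM sizes differ: wrong file or bad dump?")
    return [(i, a, b) for i, (a, b) in enumerate(zip(rom_a, rom_b))
            if a != b]

# Example usage (hypothetical file names):
# with open("stock.rom", "rb") as f: stock = f.read()
# with open("modded.rom", "rb") as f: modded = f.read()
# print(len(diff_roms(stock, modded)), "bytes differ")
```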

    If you have any questions about this process just PM me. Don't make assumptions!
    GV-N970G1 GAMING-4GD (rev. 1.0/1.1) Overview | Graphics Card - GIGABYTE Global

    GV-N980G1 GAMING-4GD (rev. 1.0/1.1) Overview | Graphics Card - GIGABYTE Global

    GV-N98TG1 GAMING-6GD Overview | Graphics Card - GIGABYTE Global

    980Ti Extreme / Waterforce



  • Gigabyte GTX 980 Ti XTREME AIR BIOS

    MUMOD FINAL Release 1.1
    → AIR_MUMODV1.1_980TiXTREME_F10_DS.zip (146k .zip, /attachments/40941) FLASH AT YOUR OWN RISK
    → AIR_MUMODV1.1_980TiXTREME_F1_DS.zip (147k .zip, /attachments/41493) FLASH AT YOUR OWN RISK
    → AIR_MUMODV1.1_980TiXTREME_C6-G_DS.zip (146k .zip, /attachments/43874) FLASH AT YOUR OWN RISK
    ̿̿ ̿̿ ̿̿ ̿̿̿'̿'\̵͇̿̿\з=༼͜༽=ε/̵͇̿̿/'̿'̿ ̿ ̿̿ ̿̿ ̿̿

    ░▒▓│ (◣◢) Gigabyte GTX 980 Ti XTREME H2O BIOS (◣◢) │▓▒░

    MUMOD FINAL Release 1.1
    → H2O_MUMODV1.1_980TiXTREME_ALL.zip (440k .zip, /attachments/43873) H2O ONLY, FLASH AT YOUR OWN RISK
    ̿̿ ̿̿ ̿̿ ̿̿̿'̿'\̵͇̿̿\з=༼͜༽=ε/̵͇̿̿/'̿'̿ ̿ ̿̿ ̿̿ ̿̿

    Gigabyte GTX 980 Ti WATERFORCE XTREME Gaming BIOS

    MUMOD FINAL Release 1.0
    → H2O_MUMODV1.0_980TiWATERFORCE_F10_DS.zip (146k .zip, /attachments/40007) FLASH AT YOUR OWN RISK
    ̿̿ ̿̿ ̿̿ ̿̿̿'̿'\̵͇̿̿\з=༼͜༽=ε/̵͇̿̿/'̿'̿ ̿ ̿̿ ̿̿ ̿̿

    Gigabyte GTX 980 Ti G1 Gaming BIOS

    ░▒▓│ (◣◢) GTX 980Ti G1 AIR VERSION (◣◢) │▓▒░

    MUMOD FINAL Release 1.0
    → AIR_MUMODV1.0_980TiG1_ALL.zip (597k .zip, /attachments/40004) FLASH AT YOUR OWN RISK
    Both DD and DP versions included
    ̿̿ ̿̿ ̿̿ ̿̿̿'̿'\̵͇̿̿\з=༼͜༽=ε/̵͇̿̿/'̿'̿ ̿ ̿̿ ̿̿ ̿̿

    H2O BIOS

    ░▒▓│ (◣◢) GTX 980Ti G1 H2O VERSION (◣◢) │▓▒░

    MUMOD FINAL Release 1.0
    → H2O_MUMODV1.0_980TiG1_ALL.zip (597k .zip, /attachments/40005) FLASH AT YOUR OWN RISK
    Both DD and DP versions included
    ̿̿ ̿̿ ̿̿ ̿̿̿'̿'\̵͇̿̿\з=༼͜༽=ε/̵͇̿̿/'̿'̿ ̿ ̿̿ ̿̿ ̿̿
    Gigabyte GTX 980 WATERFORCE

    → 980F1WFDP-MAX-UNLEASHED-REL1.1GV-N980WAOC-4GD.zip (147k .zip, /attachments/44282) GV-N980WAOC-4GD, FLASH AT YOUR OWN RISK ←
    ̿̿ ̿̿ ̿̿ ̿̿̿'̿'\̵͇̿̿\з=༼͜༽=ε/̵͇̿̿/'̿'̿ ̿ ̿̿ ̿̿ ̿̿

    Gigabyte GTX 980 Xtreme

    → MUMODV1.0_980XTREME_F1.zip (147k .zip, /attachments/45295) FLASH AT YOUR OWN RISK ←
    ̿̿ ̿̿ ̿̿ ̿̿̿'̿'\̵͇̿̿\з=༼͜༽=ε/̵͇̿̿/'̿'̿ ̿ ̿̿ ̿̿ ̿̿

    Gigabyte GTX 980 G1 Gaming BIOS

    → 980F3-MAX-UNLEASHED-REL1.1.zip (273k .zip, /attachments/32732) FLASH AT YOUR OWN RISK
    → 980F11-MAX-UNLEASHED-REL1.1.zip (273k .zip, /attachments/45689) FLASH AT YOUR OWN RISK
    → 980F42-MAX-UNLEASHED-REL1.1.zip (273k .zip, /attachments/43259) FLASH AT YOUR OWN RISK
    → 980F51-MAX-UNLEASHED-REL1.1.zip (273k .zip, /attachments/42906) FLASH AT YOUR OWN RISK
    → 980F60-MAX-UNLEASHED-REL1.1.zip (273k .zip, /attachments/43181) F60 BIOS, FLASH AT YOUR OWN RISK ←
    ̿̿ ̿̿ ̿̿ ̿̿̿'̿'\̵͇̿̿\з=༼͜༽=ε/̵͇̿̿/'̿'̿ ̿ ̿̿ ̿̿ ̿̿

    Gigabyte GTX 970 Xtreme

    → MUMODV1.0_970XTREME_F1.zip (147k .zip, /attachments/44416) FLASH AT YOUR OWN RISK ←
    ̿̿ ̿̿ ̿̿ ̿̿̿'̿'\̵͇̿̿\з=༼͜༽=ε/̵͇̿̿/'̿'̿ ̿ ̿̿ ̿̿ ̿̿

    Gigabyte GTX 970 G1 Gaming BIOS

    → 970F3-MAX-UNLEASHED-REL1.1.zip (273k .zip, /attachments/32731) FLASH AT YOUR OWN RISK
    → 970F42-MAX-UNLEASHED-REL1.1.zip (273k .zip, /attachments/40793) F4x only
    → 970F13-MAX-UNLEASHED-REL1.1.zip (273k .zip, /attachments/41681) FLASH AT YOUR OWN RISK
    → 970F51-MAX-UNLEASHED-REL1.1.zip (273k .zip, /attachments/41832) FLASH AT YOUR OWN RISK
    → 970F60-MAX-UNLEASHED-REL1.1.zip (273k .zip, /attachments/42422) FLASH AT YOUR OWN RISK
    → 970F80-MAX-UNLEASHED-REL1.1.zip (273k .zip, /attachments/45660) FLASH AT YOUR OWN RISK ←
    ̿̿ ̿̿ ̿̿ ̿̿̿'̿'\̵͇̿̿\з=༼͜༽=ε/̵͇̿̿/'̿'̿ ̿ ̿̿ ̿̿ ̿̿

    Thanks to the community for the testing and input! Team effort!

The "Spoilers" in this post are gone which is why it is presented as a huge run-on. If you have any questions just ask, we are STILL here to help! :cool:


4,135 Posts
Discussion Starter · #5 ·
Not sure what the last one really does either.

Through the Maxwell BIOS editor we are able to increase the setting to "1.312v". Some people have tested this and report the actual voltage is lower, while others say it does in fact raise the voltage to ~1.3v. I am not sure who is right, but software monitoring programs will only ever show 1.275v max.

Overclocking Maxwell does seem to be mainly limited by voltage, and it is not as straightforward to overclock, which gives us a challenge.


Matcha Soda
1,218 Posts
Do you think running 1.3125V for a daily scenario is feasible? I understand for the 970s G1 has a separate dedicated cooler for the VRM with a fan on it for maximum cooling, but still I wonder if these boys can handle 1.3125V.

I'm also concerned that a GTX970 jacked up with volts and mhz reaches near R9 290 TDP yet the R9 290 comes with a 10 phase VRM.

4,135 Posts
Discussion Starter · #7 ·
Originally Posted by amd955be5670 View Post

Do you think running 1.3125V for a daily scenario is feasible? I understand for the 970s G1 has a separate dedicated cooler for the VRM with a fan on it for maximum cooling, but still I wonder if these boys can handle 1.3125V.

I'm also concerned that a GTX970 jacked up with volts and mhz reaches near R9 290 TDP yet the R9 290 comes with a 10 phase VRM.
The G1 has a stock 600W-rated cooler, but I am on a full-cover water block, which makes a huge diff in temps.

The question might just be: what is the lifespan of a GPU under normal operation? 10 years? 20? 30? 50?
Even if it's only 10 years (which I doubt would be that low), and THEORETICALLY overvolting cut the lifespan in HALF to 5 years,
I sure hope I don't still have GTX 980s in my system in 5 years.

All kidding aside, you would need to hardware volt-mod before you could really do any damage to these GPUs.

Premium Member
5,381 Posts
I was only able to reach 1463.5MHz by raising the voltage to 1268.8mv in my BIOS with MBT 1.36, even though MSI AB still says 1.200v. It kept crashing until I changed the voltage in the BIOS.

Matcha Soda
1,218 Posts
I want to congratulate you, but no matter how high the core frequency goes, you cannot change the one spec that matters: Elpida memory.

At this point, if manufacturers charged $10 extra for cards with Samsung memory, I would happily starve for a week to collect more.

This is where the OC UK GTX970 is kinda gold: it's the costliest GTX970, yet it's built from higher quality components with a 100% Samsung memory guarantee. So far MSI, GBT, etc. can eff off. Creating a premium OC'd version of a card, charging more than retail and then slapping in Hynix... At least the top-tier OC versions of cards should have Samsung memory.

Premium Member
5,381 Posts
Agreed. I was able to get a +1516 OC on the Elpida RAM though, but any more than that = green and red Xmas lights on my TV and a hard reset, lol. Started at 5400, so 6916 is nothing to complain about. But yeah, I miss Samsung VRAM.

4,135 Posts
Discussion Starter · #11 ·
To ensure we don't stray too much (no complaints, just sayin' ) I would like everyone's help to encourage some 'expert Maxwell overclockers', especially ones with G1 hardware (this custom PCB), to take a look at the OP and hopefully provide us some further advice for additional performance tweaking.

◕◕ The idea here is to start with version beta 1.0 of the BIOS and together test and tweak to improve the performance even further, if possible.

◕◕ If we are not able to push the performance further, it also serves to save others lots of trial and error.

In detail: digging into voltage, boost clocks, power, TDP, etc. in both the BIOS and overclocking software. What works for some may not work for others, but we can dive in and prove through community testing what works and what doesn't, and help others understand what these settings do. I hope the end result will be a wealth of information and a refined/tweaked BIOS that represents our efforts, made available for all to tinker with as they wish.

Since I'm new here, I might need some help getting this to the right audience.


Thank you

4,135 Posts
Discussion Starter · #12 ·
I'm still working in the background on Beta v2. I've been able to make some progress through reverse engineering.

v2 so far, 1583 boost with no throttling. Look at those temps! AB is showing the results from a full 3DMark 4k run in SLI.


Matcha Soda
1,218 Posts
EVGA has special editions of their cards, such as the Classified, where you can have memory over-voltage and some other special features. MSI has their Lightning line-up. Gigabyte has none. At least for their G1 cards they should enable memory over-voltage. Is there no rep from Gigabyte at OCN?

I don't think posting a request for G1 memory over-voltage to their official forums would help. Since EVGA and ZOTAC already do it, I'm sure NVIDIA isn't denying this feature.

4,135 Posts
Discussion Starter · #14 ·
Originally Posted by amd955be5670 View Post

EVGA has special editions of their cards, such as the Classified, where you can have memory over-voltage and some other special features. MSI has their Lightning line-up. Gigabyte has none. At least for their G1 cards they should enable memory over-voltage. Is there no rep from Gigabyte at OCN?

I don't think posting a request for G1 memory over-voltage to their official forums would help. Since EVGA and ZOTAC already do it, I'm sure NVIDIA isn't denying this feature.
As we compare all of the cards in the round-up, there are perhaps some 'better' choices for extreme overclocking. I hadn't even thought about memory over-voltage on these cards until you mentioned it, but you are probably right: it doesn't seem to be mentioned anywhere for G1 cards, so it's perhaps not an option for us Gigabyte owners. With the approx. 1GHz memory overclock we already get at stock voltage, I wonder how much we'd gain with more memory voltage, but it's an interesting point nonetheless. We may have to just forget about mucking with memory voltage.

Last night I was messing around with voltage on my G1 cards and I can understand why people say you cannot monitor the voltage accurately with software. I don't have a multi-meter but I was able to determine that there were in fact voltage changes taking place even though software displayed a steady 1.275v.

It was simple really. I adjusted the voltage slider with Kombuster running until it reported 1.275v. That 'spot' on the slider, where it first hits the max "reported" voltage of 1.275v, was my baseline. I was able to complete a 3DMARK run, but with some artifacts. Without changing anything other than voltage, I increased the slider by about 10mv and re-ran the test, each time with fewer artifacts, until it ran clean. When I maxed out the voltage slider and re-ran the same test, it crashed, suggesting I was sending more voltage than the GPU could handle.
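For what it's worth, the procedure above is just a linear search for the lowest clean voltage offset. A sketch with a stand-in stability test; the step and range values are illustrative:

```python
# The slider procedure written out as a search: bump the offset ~10 mV
# per run until a test passes cleanly. `run_test` is a stand-in; in
# reality it's you watching a full Firestrike run for artifacts.
def find_clean_offset(run_test, start_mv=0, step_mv=10, max_mv=87):
    """Return the lowest tested offset (mV) that passes, else None."""
    mv = start_mv
    while mv <= max_mv:
        if run_test(mv):      # True means no artifacts and no crash
            return mv
        mv += step_mv
    return None

# A pretend card that needs at least +30 mV over the baseline:
clean_at = find_clean_offset(lambda mv: mv >= 30)
```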

SO... IMO I have been able to identify supporting evidence that the G1 cards with the custom PCB are allowing MORE THAN 1.275v to the GPU.

The question is how much more voltage does the Gigabyte custom PCB actually provide? I read elsewhere someone actually measured it at 1.31v (when software reports 1.275v) but cannot confirm personally (but no reason not to take their word I guess).

Perhaps we have the first variable, the VOLTAGE RANGE? This PCB appears capable of between ~1v and 1.31v, with a maximum voltage of 1.31v.
That IS an advantage for Gigabyte over some of the other cards out there (albeit maybe not the BEST, but one of them).

I don't really have experience tweaking the voltage tables to accommodate this voltage, but I will share what I learn as I go.

Anyone else testing?

Matcha Soda
1,218 Posts
I have so far tried 1.275V, and I'm pretty sure it goes up to that based on the TDP % numbers. I also tried 1.287V in the BIOS, but it wasn't stable at the frequency I wanted, so I just assume my card is capped at 1.275V. It might be true as well, because who knows what Gigabyte did in rev 1.1. Probably used an even cheaper VRM.

Premium Member
5,381 Posts
@ amd955be5670 ,
I ended up returning the 750Ti to Best Buy today because it kept shutting the computer down. When I got there, they had 1 left. I was lucky, because after I installed the card, I saw it had Samsung VRAM.
Also, when I opened the new package, I noticed the card had a Quality Control sticker on it, signed by a human. I looked at old pics of the card I returned: no QC sticker. Lucky for me, I guess. No wonder it was shutting down the PC!

Matcha Soda
1,218 Posts
We have hijacked and derailed this thread

But still, by QC sticker, do you mean the 3~4 small round stickers with OK on them? One of my 560Tis, which didn't have them, went to 99C within 2 mins of Kombuster, and to 85C within 5 minutes of Crysis. The other 560Ti, with the 4 stickers, didn't have this issue. The G1 Gaming I have doesn't have these stickers (I think); they might be under the backplate. But so far the temps are actually "bad": I thought this could heat up my room, but nope.. defective GPU! My 560Tis would warm up to 74C, but this doesn't even break 60C. Forcing 1.275v got the GPU up to 65C in BF4. Damn, I should have got an R9 290.

Premium Member
5,381 Posts
Nope, a big rectangular sticker on the back of the card, about 1x2 inches, with letters and X's in the boxes and a signature at the bottom. I dumped the BIOS and modded it right away, lol: EVGA750tiSCACX_1.2v_1400Mhz.rom, lol. The old card was very loud at first. This morning loud rattling started, then the fans stopped altogether
and it shut the PC down. Once I took it out, I could power the computer back on again. Anyway, the new card idles at 23-27C and hits 50-57C under full load. 100W TDP, 75W PCI-E, 75W power limit. I also apologize, Laithan, for hijacking and railroading the thread.

Matcha Soda
1,218 Posts
So I have some results for you. Currently my best stable OC is 1.275V @ 1557MHz. I tried 1.3125V @ 1569.5MHz, and yep, it failed. So my conclusion is that the VRM in Rev 1.1 / F50 is unable to supply more than 1.275V to the core. It's set to 1.275V in the BIOS, so whether the real-world value is actually more or less than that, I don't know. It could also be that my limit is simply 1557MHz on this GPU, which doesn't make sense, because adding 0.0375V to the core should at least net 13MHz more out of the OC.

I wonder if any other Rev 1.1 user with F50 has applied 1.3125V in the bios and is getting more than 1575mhz out of their GPU.

This is a G1 GTX 970