While doing overclocking and BIOS modding tests on my R9 290, I started to wonder how accurate the software voltage reading actually is, and what exactly it measures. I googled for "Vcore measurements with DMM", "VRM voltage multimeter", "GPU voltage measurement" etc., and came up with nothing. I found only a couple of tests that had been done on a CPU, and more threads with speculation and instructions, but no factual results on GPUs, or VRM Vout and Vcore compared side by side. I figured it was time to take out my own multimeter and see the stuff for myself. Doing so was a great learning experience, and I thought it would be nice to post the results online, somewhere fellow souls could google this.

I hope this can give some sense of how VRMs behave and is interesting for overclockers. With the caveats of me not being any kind of expert and the limited accuracy of the results, let's jump right into it!

Tests were done on one of my Gigabyte Windforce R9 290 OC cards, water cooled with an EK full cover block and a beefy custom water loop. The second GPU was unplugged = off. Picture of the PCB (I originally took this picture when I upgraded the VRM thermal pads, to see how the TIM had spread, so it's not cleaned..):

This GPU board design is very similar to the AMD R9 290 reference design, the main difference being the choice of chokes and capacitors used in the VRM. It uses the same 5-phase design with IR6811 & IR6894 FETs. The IR3567B controller is also used on some newer GPU generations and has been used on some motherboards.

I used a basic digital multimeter to probe the voltages. Here's a picture of the card in my system and some of the voltage probing spots I used at the back of the card:

My methods & terminology (don't sweat it):
  • Vcore = GPU core voltage, measured with DMM over a ceramic capacitor on the back side of the GPU core (I used the one shown in the picture).
  • VRM Vout = VRM output voltage for the GPU core, measured with DMM over a capacitor right after the VRM (I used the one shown in the picture).
  • VID = Voltage Identification, basically the voltage programmed in the BIOS. I also refer to it as the result of core VID + core voltage offset. VID was set to 1250 mV in the BIOS and the voltage offset was applied with software tools (from -100 mV to +400 mV).
  • Vdroop = Processor power delivery design feature intended to increase stability and lifespan of the product under normal operation. Here I understand it as VID - Vcore.
  • Vdrop = Voltage loss that results from current passing through a resistive medium. Here referred to as VRM Vout - Vcore.
  • LLS = Load Line Slope, used to define the load current to Vcore relationship in VR design. Different levels of "Load Line Slope Trim" were applied to the VRM controller via BIOS modification (+40%, default = +0%, -40%, off).
  • LLC = Load Line Calibration, a feature found on modern motherboards to adjust Vdroop away from specification, or as Asus has called it, "CPU Voltage Damper". Early models only had an on/off setting; implementations vary.
  • AX860i Power = Corsair AX860i digital ATX power supply was used to power the system. Two separate PCIe power cables (18 AWG wires) were used for the GPU's 8-pin and 6-pin connections. The amperages were logged with Corsair Link software and summed to estimate the total GPU PCIe power draw (limited accuracy).
  • GPU BIOS = Original Gigabyte BIOS, modified where needed for this test, OCP, OVP, TDP etc limits were modified, GPU core @1000 MHz, memory @1250 MHz, VID 1250 mV, AUX 1000 mV, Vmem 1500 mV, unless otherwise stated.
  • Stress test program: Furmark (v1.11.0), resolution 1024x768. Gives about the hardest load one can encounter with a GPU; it's fast, repeatable, and doesn't load the CPU.
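To make the terminology above concrete, here's a tiny Python sketch using the definitions from this list. The voltage readings are made-up example numbers for illustration; the real values in this post came from DMM probing.

```python
# Sketch of the terminology above, with made-up example readings
# (real values come from DMM probing, not software).
vid_mv = 1250       # VID programmed in the BIOS
vrm_vout_mv = 1160  # hypothetical DMM reading over the capacitor after the VRM
vcore_mv = 1100     # hypothetical DMM reading over the capacitor behind the core

vdroop_mv = vid_mv - vcore_mv       # Vdroop = VID - Vcore
vdrop_mv = vrm_vout_mv - vcore_mv   # Vdrop  = VRM Vout - Vcore

print(f"Vdroop: {vdroop_mv} mV")  # 150 mV, roughly what default LLS showed
print(f"Vdrop:  {vdrop_mv} mV")
```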
Here come the results.

The measured Vcore over VID range with different Load Line Slope Trim settings:

Here's basically the same data, but turned into more easily visualized Vdroop (shown as negative difference):

Okay, even at the default (@1250 mV) setting we see a pretty noticeable Vdroop of about 150 mV. As the voltage is increased, the Vdroop increases along with the increased current. As expected, trimming the LLS lower reduces the Vdroop, and vice versa.

Disabling the LLS outright makes the voltage controller target only the set VID, ignoring load variation. As we'll see, it will do whatever it takes to keep Vcore as close to the set value as possible.
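The behavior can be sketched with the textbook resistive load-line model, where the controller's target output falls linearly with load current. The slope and current values below are assumptions for illustration only, not the IR3567B's actual register settings.

```python
# Textbook load-line model (not IR3567B register-level behavior):
# the controller's target Vcore falls linearly with load current.
def target_vout_mv(vid_mv: float, i_load_a: float,
                   slope_mohm: float, trim_pct: float = 0.0) -> float:
    """Target Vcore with a resistive load line; trim_pct scales the slope.
    slope_mohm is an assumed value, not the card's actual setting."""
    slope = slope_mohm * (1.0 + trim_pct / 100.0)
    return vid_mv - i_load_a * slope

vid = 1250.0        # mV, as set in the BIOS
i_load = 150.0      # assumed Furmark-class core current, amps
base_slope = 1.0    # assumed 1.0 mOhm load line

for trim in (40.0, 0.0, -40.0):
    print(f"trim {trim:+.0f}%: {target_vout_mv(vid, i_load, base_slope, trim):.0f} mV")
# With the load line disabled (slope 0) the controller targets VID itself,
# regardless of load:
print(f"LLS off:   {target_vout_mv(vid, i_load, 0.0):.0f} mV")
```

With these assumed numbers, the default slope gives 150 mV of droop at load, which lines up with what the charts show, and "off" holds the target at 1250 mV.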

Here's measured VRM Vout difference to the set VID:

Because the voltage drops by the time it reaches the GPU core, the VRM output obviously needs to be higher than VID if we want the Vcore to actually reach VID.

From these we can put together a Vdrop chart at different VRM outputs:
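As a back-of-the-envelope illustration of what a Vdrop chart implies: treating the VRM-to-core path as a simple resistance, Vdrop divided by load current gives an effective path resistance. Both numbers below are assumptions, not measurements from this test.

```python
# Rough estimate (assumed numbers, not measurements): modeling the
# VRM-to-core power path as a plain resistance, R = Vdrop / I_load.
vdrop_mv = 60.0     # hypothetical Vdrop under load
i_load_a = 150.0    # assumed core current under Furmark
r_path_mohm = vdrop_mv / i_load_a
print(f"Effective path resistance: {r_path_mohm:.2f} mOhm")
```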

Here's a combined chart that shows Vcore, VRM Vout and Power:

VID 1250 mV + 250 mV = 1500 mV, default LLS, scaling with GPU core frequency:

VRM switching frequency impact at two voltage levels, default LLS, default setting is 500 kHz, tested 200 - 2000 kHz (over 800 kHz seriously not recommended for use):

I didn't put the software GPU voltage readings in most of the charts, because they actually follow the measured Vcore surprisingly closely in this case.
(However, be aware that if you use a tool like VRMTool to tamper with the "Load Line Slope" on an R9 290, it will mess up your voltage readings and do no actual good. The same happens with current scalers; you need to compensate if you want accurate software readings...)

Additional raw data I'll just throw in here:
  • Total system draw at idle is ~120 W.
  • Max PSU reported power draw @ VRM Vout 1620 mV: 1089 W from wall, 988 W out, 948 W 12 V rail out, 29 A (6-pin) + 41 A (8-pin) = 70 A to GPU PCIe 12 V.
  • Measured idle PCIe Voltage at connector pins, both: 11.99 V.
  • Measured under load (1250 + 300 = 1550 mV setting) PCIe voltage at connector pins, 6-pin: 11.94 V, 8-pin: 11.80 V.
  • VRM Phase in voltages, [email protected]: 11.88, 11.88, [email protected]: 11.74, 11.74, 11.74, GPU-z report 11.50 V
  • VRM Vout phase voltages, [email protected]: 1438, 1437, [email protected]: 1435, 1433, 1432
  • VRM Vmem_out, idle: 1500, load: 1550
  • Vmem differences, closest chips: 1520, furthest: 1495
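As a rough cross-check of the cable numbers above (it mixes readings from two different runs, so ballpark only), the cable-side power works out like this:

```python
# Cross-check of the raw data above: logged cable currents times the
# measured loaded connector voltages. Note: currents are from the max-power
# run, voltages from the 1550 mV run, so this is only a ballpark figure.
i_6pin_a, i_8pin_a = 29.0, 41.0   # from Corsair Link logs
v_6pin, v_8pin = 11.94, 11.80     # loaded pin voltages, DMM

i_total_a = i_6pin_a + i_8pin_a
p_cables_w = i_6pin_a * v_6pin + i_8pin_a * v_8pin

print(f"Cable current: {i_total_a:.0f} A")   # 70 A, matching the log sum
print(f"Cable power:   {p_cables_w:.0f} W")
# The PSU reported 948 W total on the 12 V rail; the difference is the rest
# of the system plus the PCIe slot's own 12 V feed, which the cables don't see.
```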
Raw Excel:

Excel of earlier tests (not directly comparable, different BIOS, software Vcore messed up!), also shows other loads than Furmark:

Well there you have it! It's getting late, so no further analysis from me now.

ProTip: When probing Vcore from the back of the core, use steady hands, or you'll see some fine sparks.