Originally Posted by robertr1
In my oc and learning adventures, it seems there's a relationship between VID and oc potential. I've been messing around oc'ing my 9900k (with HT off) and noticed something interesting over the past 2 days. Running the same tests and daily usage, my VID doesn't really go up much in relation to the vcore bump. I've gone from 5GHz to 5.3GHz and vcore bumped from 1.25 to 1.38 with LLC High. So everything in the system has stayed pretty constant aside from the bump in frequency and vcore in BIOS. This is using static vcore.
Questions I'm trying to answer:
Why is adding vcore not making the VID go up in a similar fashion? In the screenshots below, I went from 1.34v to 1.38v vcore (in BIOS with LLC High) but max VID only jumped up by .010v in relation, running the same workloads.
Is LLC a static offset rather than percent based? The vdroop difference is exactly the same ratio as the difference in the vcore bump I give it each time.
Screenshots below are HWinfo @ 5.2 and 5.3 after putting different loads on the system (real bench, cinebench, heavy desktop usage, cpuz, other benches and stress tests). Cinebench R15 @ 5.3 showing that system is scaling appropriately.
cpu z validation so you can see the full specs: https://valid.x86.fr/p1617w
Base VID is based on the CPU core ratio (assuming cache is 300 MHz lower than core, but not closer). Base VID stops scaling once the highest turbo boost (1/2-core) multiplier is reached, because the CPUs are not calibrated to exceed the max turbo boost (so this is x50 for the 9900K and x49 for the 9700K). At lower multipliers, the base VID decreases sharply at most steps, down to 800 MHz. Setting cache closer than 300 MHz to core, or even higher than the core ratio, skews this.
VID also decreases by 1.5mv for every 1C temp drop below 100C, giving a 150mv range between 0C and 100C. This is called "Thermal Velocity Boost" (not the same thing as the laptop version, which affects turbo boost rather than VID), and it stops happening below a x40 multiplier.
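The thermal scaling described above can be sketched as a simple linear model (the function name and the clamping behavior are my own illustration, not anything Intel documents):

```python
def thermal_vid_offset_mv(temp_c, multiplier):
    """Approximate VID reduction (mV) from the thermal scaling described
    above: VID drops 1.5 mV for every 1C below 100C, and the behavior
    stops below a x40 multiplier. Illustrative sketch, not Intel's model."""
    if multiplier < 40:
        return 0.0
    temp_c = max(0.0, min(100.0, temp_c))  # clamp to the 0C..100C range
    return -1.5 * (100.0 - temp_c)

# At 60C the VID request sits ~60 mV below its 100C value:
print(thermal_vid_offset_mv(60, 50))   # -60.0
print(thermal_vid_offset_mv(60, 39))   # 0.0 (below x40, no scaling)
```

The full 0C-to-100C swing is 1.5 mV * 100 = 150 mV, matching the range stated above.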
AC Loadline is the CPU's power-supply resistance (in mOhms), which is used as the VRM target voltage signal on Auto (or DVID) voltages. It is based on load (current), so the VID target request is boosted higher at load than at idle, and the heavier the load, the higher the target voltage request. This is limited to a maximum of 1.520v (Max VID on the Intel spec sheet). If this sounds like loadline calibration in reverse, it does indeed seem that way. But the difference is that it operates on the CPU's *requested* voltage, not on the vdroop of the voltage the VRM is actually supplying (which is called VRM loadline, or "DC Loadline"). AC Loadline (as far as I know) does not carry a transient response (voltage spike/drop) penalty the way VRM loadline does, because this is just a base voltage request. It's probably impossible to read the data the VRM is receiving from the CPU in real time (oscilloscope speeds here), but it can't be anywhere near what happens on the VRM voltage line (with all the transients at higher loadline calibration). Think of this as the "Fixed" voltage changing dynamically based on current draw. Since vdroop will be at a nice, huge, healthy level (if LLC is left on the lowest level), you gain a huge stability benefit here.
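The AC Loadline boost above is just V = I * R on top of the base VID, capped at the 1.520v spec limit. A small sketch (the base VID and current figures are made-up illustration values):

```python
VID_CAP = 1.520  # max VID per the Intel spec limit mentioned above

def vid_request(base_vid, ac_loadline_mohm, current_a):
    """Target voltage the CPU requests from the VRM: base VID boosted
    upward by AC loadline resistance * load current, capped at 1.520 V.
    Illustrative sketch of the behavior described above."""
    boost = (ac_loadline_mohm / 1000.0) * current_a  # mOhm -> Ohm
    return min(base_vid + boost, VID_CAP)

# Idle (~10 A) vs heavy load (~150 A) at AC Loadline = 1.6 mOhm:
print(round(vid_request(1.25, 1.6, 10), 3))    # 1.266
print(round(vid_request(1.25, 1.6, 150), 3))   # 1.49
print(round(vid_request(1.35, 1.6, 150), 3))   # 1.52 (hits the cap)
```

Note how a heavy load at 1.6 mOhm adds 240 mV to the request -- which is why the 1.520v cap comes into play so quickly at load.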
The VRM ignores this on fixed override voltages, but VID is still affected. The voltage after AC Loadline is factored in is the target voltage the CPU will request from the VRM directly. A DVID offset (+/-) is then applied directly on top of this, if used.
If you want to see the theoretical VID request the CPU is making (there are no drawbacks to this--it's fully safe), set DC Loadline manually to 1 (0.01 mOhms) in Internal VR Settings. You may be shocked at what you see.
While VRM Loadline is "DC" Loadline, the Internal VR "DC Loadline" DOES NOT CONTROL VDROOP FROM THE VRM!! It only controls VID droop on the CPU VID! The VRM IGNORES THIS VALUE! That's because DC Loadline is used for POWER MEASUREMENTS. The Intel spec documentation sheet states this. Basically, DC Loadline affects "VID" in the exact same way that VRM Loadline affects Vcore (VR VOUT). DC Loadline is used to calculate CPU Package Power because CPU Package Power is equal to VID * Amps.
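Since DC Loadline droops the reported VID in the same V = I * R fashion and that VID feeds the power calculation above, the relationship can be sketched like this (the 1.35v request and 100 A load are illustration values):

```python
def vid_after_dc_loadline(vid_request_v, dc_loadline_mohm, current_a):
    """DC Loadline droops the *reported* VID (not the VRM output) by
    R * I, mirroring how VRM loadline droops the actual Vcore.
    Illustrative sketch of the behavior described above."""
    return vid_request_v - (dc_loadline_mohm / 1000.0) * current_a

def package_power_w(vid_v, current_a):
    """CPU Package Power as described above: VID * Amps."""
    return vid_v * current_a

# 1.35 V requested, DC Loadline = 1.6 mOhm, 100 A of load:
vid = vid_after_dc_loadline(1.35, 1.6, 100)
print(round(vid, 3))                        # 1.19  (drooped VID, V)
print(round(package_power_w(vid, 100), 1))  # 119.0 (reported watts)
```

This is also why the trick of setting DC Loadline to 1 (0.01 mOhms) works: with effectively zero droop, the reported VID is the raw request itself.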
VRM DC Loadline is actually "Loadline Calibration". That controls vdroop from the VRM and is shown accurately as VR VOUT. Unfortunately, there are only presets, which don't display their mOhms values, but mOhms values are what they are. Standard and Normal are 1.6 mOhms. If VRM DC Loadline is set to the same value as DC Loadline (VR settings), then on auto vcore, VID and VR VOUT will be within 5 mv of each other. Using DVID offsets will change VR VOUT but not VID (remember I told you that DC Loadline does not affect the VRM at all?).
As you can see, it is VERY unwise to combine a high AC Loadline (1.6 mOhms is the maximum specification for 8-core CFL) with a high VRM loadline (loadline calibration)--this will cause dangerous voltages on auto vcore. If you want to see why, do what I said above--set DC Loadline to 1 (0.01 mOhms) and watch the CPU VID.
On auto vcore, assuming you were smart and kept maximum vdroop enabled (1.6 mOhms, Loadline Calibration=Standard/Normal), you may notice that the CPU VID and VR VOUT drop at load compared to idle, and as the temps rise (temps rise = current goes up), VID and VR VOUT drop even more. This assumes you set DC Loadline to match VRM Loadline (so DC Loadline=1.6 mOhms (160) if you have Vcore Loadline Calibration=Standard/Normal; LLC=Low is 1.3 mOhms, equal to DC Loadline=130).
But what was that about VID rising 1.5mv with every 1C temp increase? And VR VOUT should match VID on auto voltage--so if you set Vcore=Auto, LLC=Standard, AC Loadline=160, DC Loadline=160 (1.6 mOhms), why are VID and VR VOUT dropping here?
That's because the VID is capped at 1.520v (again, check the experiment with your fixed vcore and DC Loadline set to 1), and it's reaching that cap as soon as a load is put on the processor. This is BEFORE DC Loadline drops the VID afterwards. So the VID cannot rise past 1.520v as temps go up. Only vdroop (1.6 mOhms * Amps = vdroop in millivolts) keeps dropping the VID and VR VOUT if on auto voltage.
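Putting the pieces above together, a sketch of why the reading falls as load (and therefore current) climbs on auto voltage--the request pins at the cap while vdroop keeps growing (base VID and current values are made up for illustration):

```python
VID_CAP = 1.520  # max VID, as described above

def auto_vcore(base_vid, ac_mohm, dc_mohm, current_a):
    """Boost the request by AC loadline, cap it at 1.520 V, then apply
    vdroop (loadline resistance * current). Once the cap is hit, only
    the growing vdroop term moves the reported value. Illustrative sketch."""
    request = min(base_vid + (ac_mohm / 1000.0) * current_a, VID_CAP)
    return request - (dc_mohm / 1000.0) * current_a

# With AC=DC=1.6 mOhm: hotter CPU -> more current -> lower reading,
# because the request is stuck at 1.520 V while vdroop keeps growing.
print(round(auto_vcore(1.30, 1.6, 1.6, 150), 3))  # 1.28
print(round(auto_vcore(1.30, 1.6, 1.6, 180), 3))  # 1.232
```

Both cases hit the 1.520v cap before droop; the 30 A of extra current costs another ~48 mV of vdroop, which is the drop you see at load.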
If SVID OFFSET is enabled, VID is allowed to exceed 1.520v by up to 200mv (Intel doc sheets specify this as "Offset capability" via an MSR--do NOT confuse this with DVID!). If this were enabled, you would see VR VOUT and VID *RISE* at load as temps go up, since the 1.520v cap is removed, so VID could slowly increase as temps go up, up to 1.72v. SVID Offset is only useful when AC Loadline is set to a low value (like 90). That lets you reduce your idle voltage quite a bit while keeping the load voltage the same as, or similar to, having SVID Offset disabled and AC Loadline=160. Obviously it is DANGEROUS (once again) to use any sort of Vcore Loadline Calibration on auto voltages when AC Loadline is doing the voltage-boosting work for you.