Originally Posted by XGS-Duplicity
Heads up: I don't know why it works now when it didn't in the past, but 500 khz switchrate is now working extremely well for 5.3 ghz HT off. My required load voltage went from 1.38v in cbr15 to 1.36v, regardless of whether the water is 23c or 28c. I'm still on the same F9 bios and the same exact profile minus the switchrate change. I don't know what changed, but it works well now. If Gigabyte people did something behind the scenes, cool and thank you. If not, cool anyway, because my OC now runs cooler: still below 80c at 1.39v load voltage in cbr20 on just an AIO and liquid metal. In the past, 500 khz was just automatically unstable for me, but with medium LLC and AC/DC 1/1 it's working well for pushing the limits. Other users on DVID mode may want to revisit 500 khz switchrate. I haven't been able to verify whether it behaves differently in manual vcore mode, because I'm afraid to change my OC at this point since it's performing up to expectations. Just wanted to share my findings.
What BIOS version are you using? Did you shut off AC power and re-test it? I was on F11e.
My Z390 Master is unhooked, with no heatsink or power, because I have no place to hook it up; all the parts and two PSUs are on the Z490 Master and the M12E.
Can you try to replicate this after turning off all AC power? Because if you can't, this is some sort of bug, or something strange the VRM is doing. I saw it happen myself, once or twice, after switching profiles. It happened once after playing Battlefield 5 at 5.2 ghz: I changed to 4.7 ghz and 500 khz (on purpose), ran LinX at a voltage I knew was not LinX-stable, and suddenly all residuals matched (it usually required 300 khz). Then later it was unstable again during LinX at 500 khz (I was using LinX 0.9.6 residual matching to determine stability). After that I saw consistent results, with more residual stability, when testing 300 and 500 again. I also don't know whether the IR 35201 controller or the IR 3555 power stages are responsible for this.
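For anyone unfamiliar with the residual-matching method I mentioned: LinX reports a residual value for each pass, and a run is usually only trusted as stable when every pass produces the exact same residual. Here's a minimal sketch of that check (the residual strings below are made-up placeholders, not real LinX output):

```python
# Sketch of the LinX residual-matching idea: a run counts as stable
# only if every pass reports an identical residual value.
# NOTE: these residual strings are invented examples, not real LinX logs.

def residuals_match(residuals):
    """Return True if all residuals are identical (the 'stable' signal)."""
    return len(set(residuals)) == 1

# Stable-looking run: every pass produced the same residual.
stable_run = ["2.551323e-02"] * 5

# Unstable-looking run: one pass drifted, so the residuals don't match.
unstable_run = ["2.551323e-02"] * 4 + ["2.551897e-02"]

print(residuals_match(stable_run))    # → True
print(residuals_match(unstable_run))  # → False
```

A mismatched residual means the CPU produced a different numeric answer on one pass, which is why people treat any mismatch as an instant fail even if the run completes without errors.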
Since I don't have an oscilloscope, I have absolutely no idea what the transient voltages were doing. What I *do* know is that people on the Maximus XI Apex and Gene tested 500 khz vs 800 khz, and they all found 500 khz needed a lower vmin than 800 khz. (The Gene and Apex use an ASP controller (a relabeled IR 35201) with IR 3555 power stages, while the Aorus Master and Xtreme, which both benefit from 300 khz, use the IR 35201 with IR 3599 power stages.)