CPU-Z seems to be reading the actual vcore pretty accurately - I set it at 1.505v, and my board kicks it to 1.5072v....Why wouldn't I game on it? That's the major purpose for my PC.... Also, it's like stress testing, but quite a bit more entertaining....Tomorrow, I'll probably run some Far Cry 3, since it seems to be pretty good at finding instabilities in overclocks.
Check it with a DMM, you'll be nearer to 1.55v than 1.5v. Why wouldn't you game on it? Because going from 4.6ghz to 4.8ghz won't make much of a difference for gaming but running 1.5v+ 24/7 will kill the chip. I'm sure Klepp's already killed one running 1.5v+. On the flip side, if you've got the Intel warranty plan then no worries and you might even get a better chip in return! That H100i must be on steroids!
A 1.5v VID on my chart would mean you have the highest VID setting out of everybody. I need that input voltage. I found my evidence that input voltage matters at higher Vcore by testing over and over again via x264: I logged the average time in minutes until BSoD at 1.85, 1.95, 2.05, and 2.15v input voltage, and the results consistently showed stability increasing with each higher setting. That said, I didn't test combinations of input voltage and VID - that would take eons to get enough test runs for a meaningful analysis.
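The averaging described above can be sketched roughly like this - note the run times and the `runs` table below are made-up placeholders for illustration, not the actual logged data:

```python
# Sketch: average time-to-BSoD per input voltage setting across x264 runs.
# All numbers here are hypothetical, not real test results.
from statistics import mean

# input voltage (V) -> minutes until BSoD for each test run
runs = {
    1.85: [12, 9, 15, 11],
    1.95: [22, 18, 25],
    2.05: [41, 37, 44],
    2.15: [90, 85, 102],
}

for vrin in sorted(runs):
    avg = mean(runs[vrin])
    print(f"{vrin:.2f}v input: avg {avg:.1f} min to BSoD over {len(runs[vrin])} runs")
```

With enough runs per setting, a monotonic rise in the averages is the kind of signal being described, though a proper comparison would also want variance per setting.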
I found this as well: x50 rock solid needed 2.15vrin. Getting x51 stable for all benching needed 2.23vrin; x52+ I can only use on a small handful of benches, as I refuse to set vrin higher or take vcore above 1.45v. 2.23vrin with 100% vdroop compensation runs nearer 2.28vrin under load.
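The last point is worth spelling out: with vdroop compensation (LLC) at 100%, the board overshoots the BIOS-set input voltage under load, so the set value understates what the chip actually sees. A quick illustration of that arithmetic, using the figures quoted above:

```python
# Illustrative arithmetic only: overshoot of load vrin vs BIOS-set vrin
# when vdroop/LLC compensation is maxed out (figures from the post above).
set_vrin = 2.23            # value entered in BIOS
load_vrin = 2.28           # roughly what's observed under load
overshoot = load_vrin - set_vrin

print(f"overshoot: {overshoot:.2f}v")  # prints "overshoot: 0.05v"
```

That 0.05v gap is why a multimeter reading under load matters more than the BIOS setpoint when judging how hard the chip is really being pushed.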
Edited by Doug2507 - 12/22/13 at 2:23am