6,421 - 6,440 of 23,546 Posts
I am already using the Ultimate Performance power plan in Windows, every single eco setting is off, and my OC is completely manual. I have disabled the Windows Defender stuff including VBS, and completely disabled real-time protection, which steals performance since it runs all the time. Almost everything is off in the Performance Options of System Properties. On top of that I am using latency-reducing commands. In the background only the very important apps are running, which are like 3.

It doesn't get any better than that without seriously playing with the hidden CPU settings within Windows, or going custom OS as you mentioned.

The only thing I didn't try is raising the priority.



I am already using the Ultimate Performance power plan and every single eco setting is off, plus many things on top of that, as I explained above to Yzonker, that increase performance. This tool seems useful, I will give it a try and report back.
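For reference, the Ultimate Performance plan can be unhidden and enabled from an elevated Command Prompt with powercfg (the GUID below is the stock Microsoft one), and start /high is one quick way to test the priority idea. This is just a sketch of the standard commands, not the exact tweak list from the post; benchmark.exe is a placeholder name:

```shell
:: Unhide/duplicate the stock Ultimate Performance plan (built-in Microsoft GUID)
powercfg -duplicatescheme e9a42b02-d5df-448d-aa00-03f14749eb61

:: List all plans, then activate the new one using the GUID printed above
powercfg /list
powercfg /setactive <GUID-from-previous-command>

:: Launch a benchmark with raised process priority (placeholder exe name)
start /high benchmark.exe
```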



You can try OCing the card further, or if you have already OCed it, downclock the card to see if there is a performance drop and spot whether there is a bottleneck.

Also change your monitor's resolution to 1080p, look for a "no scaling" setting in your graphics settings (scaling may reduce performance), and then run the benchmark in Fullscreen, not Borderless.
There are lots of options to play around with in the power plan alone!
I wouldn't really recommend installing a "custom OS" because it's all closed source; you never know what BS could be installed.

My advice, as someone who builds his own Windows, is that you should do it yourself with NTLite.
It will take some time to find out what works best, but it's worth it in the end.
You have full control over what you want and don't want, plus you can manage the services in case you just want a bench OS. You can get down to like 26-30 services this way.

Also, how do you start the benchmark in the demo? I haven't found the option to do so yet. I only have a 6900 XTU, but I'd like to see how many FPS I get with my Windows + 14900K.
 
TBH you shouldn't base your results only on what a few people are posting. I mean, we see people running 8600/8800 on the Apex. Does that mean everyone can? No, of course not.
I have no interest in the Apex thread myself yet, since I don't own it, but I'm sure not everyone is posting 8600+ results.
The fact that a 4-DIMM board can even do 8400 to begin with is insane, so even if it doesn't always do 8400, it seems like a worthwhile board for half the price of the Apex if you're looking for a good RAM OC.
 
I have no interest in the Apex thread myself yet, since I don't own it, but I'm sure not everyone is posting 8600+ results.
The fact that a 4-DIMM board can even do 8400 to begin with is insane, so even if it doesn't always do 8400, it seems like a worthwhile board for half the price of the Apex if you're looking for a good RAM OC.
I don't disagree with you, I just wouldn't expect that 8K is a given, even on the Nova.
 
Going from a 12600 to a 14700 is better all around; I almost went with the 12600K. Yeah, if the 14700K were $100 less, it would be a lot more tempting to me. The 14900K is too much, and then the total power if I OC goes up a lot, and my CPU AIO cooler can't handle over ~250W as it is.

The last few gens, at any point in their for-sale cycle, except maybe used, the xx900K's have always been too much money for me. I had an i7-2700K; I guess that was the last top-tier CPU I had.
 
There are lots of options to play around with in the power plan alone!
I wouldn't really recommend installing a "custom OS" because it's all closed source; you never know what BS could be installed.

My advice, as someone who builds his own Windows, is that you should do it yourself with NTLite.
It will take some time to find out what works best, but it's worth it in the end.
You have full control over what you want and don't want, plus you can manage the services in case you just want a bench OS. You can get down to like 26-30 services this way.

Also, how do you start the benchmark in the demo? I haven't found the option to do so yet. I only have a 6900 XTU, but I'd like to see how many FPS I get with my Windows + 14900K.
I played a bit with the Windows power plan settings explorer utility, but I couldn't get anything more out of the CPU, and I still have the same latency issues while idling.

I don't know what custom OSes are doing that solves these latency issues; I have tested a few in the past and they were latency-perfect. I believe the issue lies in the CPU idling settings in default Win 11, as under load the latency massively improves and becomes acceptable.

You just press Play within Steam and then it lets you choose what to run; just pick the CPU benchmark.
 
Going from a 12600 to a 14700 is better all around; I almost went with the 12600K. Yeah, if the 14700K were $100 less, it would be a lot more tempting to me. The 14900K is too much, and then the total power if I OC goes up a lot, and my CPU AIO cooler can't handle over ~250W as it is.

The last few gens, at any point in their for-sale cycle, except maybe used, the xx900K's have always been too much money for me. I had an i7-2700K; I guess that was the last top-tier CPU I had.
The 14700K and the 14900K generate the same heat, but the 14700K does less work.

The 14900K is more efficient and also undervolts better. What I am trying to impress on you is that if you believe the 14900K is too much for your AIO, the 14700K will be too.

A 14700K is a 14700K because it runs hotter than a 14900K: they have to fuse it to a slower speed and disable 4 E-cores, because if they didn't, it would run crazy hot and generate too much heat.

A 14700K is a failed 14900K that they had to cripple. But they sell it at a $200 discount and it is still a formidable CPU. It still requires a strong AIO for sustained multicore loads, though.
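To put the perf-per-watt argument in rough numbers: the scores and the power figure below are made-up placeholders purely to illustrate the "same heat, less work" reasoning, not measured results.

```python
# Hypothetical multicore scores at the same sustained package power.
# Both numbers are illustrative assumptions, not benchmarks.
chips = {
    "14700K": {"score": 35000, "watts": 253},
    "14900K": {"score": 40000, "watts": 253},
}

for name, spec in chips.items():
    # Efficiency = work done per watt of heat the AIO must remove.
    eff = spec["score"] / spec["watts"]
    print(f"{name}: {eff:.1f} points/W")
```

Same watts into the loop, more points out: that is the sense in which the 14900K is the more efficient chip even though both load the AIO equally.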
 
Okay guys, anyone have any idea WTF I killed in my Windows?


What could lead AIDA64 and HWiNFO to read the wrong CPU usage while Task Manager is accurate?
 
Instead of going through the thread yourself, you "call bullshit" on the first screenshot you see? :ROFLMAO: Well then.
Probably because the first screenshot shows the test setup as "Intel i9 14900K (ES), 2x24GB DDR5 Corsair Dominator Titanium 8000MT/s CL38 and Watercool Heatkiller IV CPU water cooler with MO-RA3 420 radiator," which is a 4-DIMM board being run with 2 DIMMs on a custom loop with a MO-RA3 420, and it doesn't show a TM5 or y-cruncher pass.

So 'reliably hitting 8000', which is what @Mylittlepwny2 called bullshit on, does seem to need an asterisk, because the first screenshots don't show a 4-DIMM board using all 4 slots to hit that XMP, and they don't show the heavy-hitting RAM stability testing that would further validate those screenshots.
 
Guys, I'm having an interesting issue with my 14900K overclock, 5.7GHz all-core at 1.39V, on my Asus TUF Z690 WiFi D4. Games that use DX12 or Unreal Engine 5 are constantly crashing on launch and I can't play them at all. The only way I can play is when I revert my CPU overclock to default, but the temps and voltage at default are terrible. Is there any way I can figure out exactly what it is in my OC settings that is throwing this off? It only happens with these games. When I load MW3 or Dota 2 or World of Warcraft I never have these issues, even with my OC.
Sounds like your CPU voltage is too low under load (vMIN). Crashes to desktop without error messages, or crashes to desktop with "out of video memory" messages, point towards the CPU not getting enough voltage under load. However, if you get crashes with DirectX errors, it's likely the RAM/memory.
 
My temps before, with my underclock, were 58-59C while gaming at 5.7GHz all-core, but at default it's 82C. The problem is it doesn't boot the games; they just keep crashing on launch.
CPU voltage is too low, then. I had the same issue with games crashing; I had lowered the CPU voltage too much. It seems some games are more sensitive than some benchmarks for testing vMIN under load.
 
After delidding I got my vcore down to 1.172 @ 57/44/48, but something interesting I had to do, and want to test more, was Core PLL: before, I had to use 1.020 for Core PLL, but in order to not get TLB errors in HWiNFO I had to bring Core PLL up to 1.030. (I'm on an AIO and delidded now; before this, it would get too hot to drop the vcore below 1.190, and Core PLL had to be exactly 1.020, nothing above or below would work.) Core PLL does something useful for sure, but what exactly, I have no clue 💩
Does tuning the Core PLL help with lowering CPU voltage? If so, how do you know what to set the Core PLL to? 🤔 I was stable in benchmarks at 1.275V vMIN under load, for 5.9GHz up to 68C and 5.8GHz after, but as soon as I started trying to play games they would crash. I had to increase my voltage back up.
 
Does tuning the Core PLL help with lowering CPU voltage? If so, how do you know what to set the Core PLL to? 🤔 I was stable in benchmarks at 1.275V vMIN under load, for 5.9GHz up to 68C and 5.8GHz after, but as soon as I started trying to play games they would crash. I had to increase my voltage back up.
Start with Core PLL at 1.02, then 1.035 and up, to see if it lets you lower the vcore.
On my 13900, 1.035 was the sweet spot; 1.05 on both 14900s.
 
Start with Core PLL at 1.02, then 1.035 and up, to see if it lets you lower the vcore.
On my 13900, 1.035 was the sweet spot; 1.05 on both 14900s.
What's the default Core PLL voltage?
 
CPU voltage is too low, then. I had the same issue with games crashing; I had lowered the CPU voltage too much. It seems some games are more sensitive than some benchmarks for testing vMIN under load.
That is why you need to use the proper stress-test tools for that, and time. An edge OC is probably not the best daily setting for gaming.

What's the default Core PLL voltage?
Falkentyne said it was 0.9V, if I remember correctly.
 
That is why you need to use the proper stress-test tools for that, and time. An edge OC is probably not the best daily setting for gaming.


Falkentyne said it was 0.9V, if I remember correctly.
It's 0.9V, I just looked. I'm trying 1.05V and it raised my CPU temps under full load. Is that normal?

Edit: now I'm not sure whether that's what's causing my temps to go up or not.
 
This is my 4080 + 13900KS (7800) at 5.7GHz.

View attachment 2639020
P6.0 / E4.7 / R4.9, HT off, DDR5 8200 CL38


OK, I gave it a go, and HemuV2, your 4080 and 13900KS at 5.7 is blazing fast.

There were some variations in score, like if you had browser tabs open it was lower, and there must be a few extra things to bump it, but yeah, I can't wait for the 5090 to replace it.
 
It's 0.9V, I just looked. I'm trying 1.05V and it raised my CPU temps under full load. Is that normal?

Edit: now I'm not sure whether that's what's causing my temps to go up or not.
Start with 1.0, then +20mV, then +35mV, then +50mV.

So the logic is to start from lower and work up to higher.
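If it helps to see that stepping written out: assuming a 1.000V starting point and the +20/+35/+50mV offsets above (which line up with the 1.02 / 1.035 / 1.05 values quoted earlier in the thread), the sweep is just:

```python
# Sketch of the Core PLL sweep described above: start low, step up.
# Baseline and offsets are taken from the posts; sweet spot varies per CPU/board.
baseline_v = 1.000
offsets_mv = [20, 35, 50]

for off in offsets_mv:
    pll = baseline_v + off / 1000
    print(f"Try Core PLL = {pll:.3f} V, then retest whether vcore can come down")
```

Step up one value at a time and retest stability at each point; keep the lowest Core PLL that actually lets the vcore drop.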
 
It's 0.9V, I just looked. I'm trying 1.05V and it raised my CPU temps under full load. Is that normal?

Edit: now I'm not sure whether that's what's causing my temps to go up or not.
+1 @StreaMRoLLeR

Yes, it is 0.9V.
I can't tell, as it is harder to notice from my side.

By how much?
Could it be a slightly different ambient temp as well?

Try from 1.02 up to 1.05.
The sweet spot varies between CPUs, and between mainboards as well (not sure between an OG Apex and an Encore).

Sometimes it helps, giving around a 15mV improvement; it could also give nothing.
PLL tweaking was mandatory on my low-SP 13900 to get 60x all-core (1.035 on the Apex, 1.05 with the Elite AX).
 