21 - 40 of 77 Posts
Discussion starter · #21 ·
In photo number 5 you can see Min CPU Cache Ratio and Max CPU Cache Ratio; that is your mesh frequency. Go in small steps and test for some time at each step, as it can take a while for errors to pop up when overclocking the mesh, and loading Prime95 probably won't find them. Just using the PC for gaming etc. is probably the best way to spot them, at least as far as I know. Maybe someone else has a better way to test mesh; I do not.

Edit: you could start by doing 25, then 26, then 27 fairly quickly. 28 will probably work without much extra voltage, but after that you might have to start adding a fair bit.
I understand. Do I set both min and max to, for example, 25? Or just max?
 
I have owned four X299 chips and all of them do mesh (uncore) 30 with 1.15 V. You can adjust from there: either take the multiplier higher at that voltage, or take the voltage lower at that frequency. One thing to note, mesh overclocking is very sensitive; you will start having blue screens and can easily corrupt the OS if it's unstable. I have min and max set to the same number.
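For reference, the mesh frequency is just the cache ratio multiplied by BCLK. A quick Python sketch, assuming the stock 100 MHz BCLK (adjust if you run a non-default base clock):

Code:
# Mesh (cache) frequency is the cache ratio multiplied by BCLK.
# Assumes the stock 100 MHz BCLK; adjust if you run a non-default base clock.
BCLK_MHZ = 100
for ratio in range(25, 31):
    print(f"cache ratio {ratio} -> {ratio * BCLK_MHZ} MHz mesh")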
 
Discussion starter · #23 ·
So, I have set my RAM to run at 3800 MHz and upped the mesh to 30. It does seem to have gained some frames, but not a lot. I tried to lower the timings, but it crashed even when I lowered the main four timings by just one step :/
 

Attachments

Nice. Your average frames will scale best with core clocks, but your minimum FPS should go up a fair bit with lower latency. 3800 from 3200 with the same timings is a pretty good improvement in your overall latency; coupled with the cache speed bump, your minimums "should" be much higher. Again, you can't really use a multiplayer game to see the improvements easily, as every run will have variance in it. The game should FEEL smoother with the faster (lower latency) memory and cache speed, and if the game already feels great, who cares what the number is. Doing a delid and liquid metal is a huge upgrade to your temps, letting you run basically another 100~200 MHz at the same temperatures. That would improve your average FPS the most, since it's raw clock speed, but it's questionable how much better the game will feel.

As for your RAM timings, I'm sure you can do some tweaking to the secondary and tertiary timings over time, but often just setting the primary numbers to what the kit is rated at and then pushing the frequency a bit is about all the memory will do (depending on the memory, of course). This is why B-die is so great: you can get a 4400 MHz kit rated at CL 19-19-19 and run it at 4400 MHz CL 17-17-17. If you want to try running your RAM faster at the same timings, bump the DRAM voltage (VDIMM) and then add some SA and IO voltage to help keep the memory controller stable at that speed. From my findings, 3800 is really easy for the IMC to run; over 4000 MHz it gets a bit harder.

Edit: you could start by trying to lower your tRFC; it might run down to around 280. Just take your time, small steps down until it doesn't like it. Often I will just save a BIOS profile, then work down by 10 until it won't boot, then add 20 back and see how things go.
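If you want to put numbers on the latency side of this: first-word latency in nanoseconds works out to 2000 * CL / data rate (the 2000 is because DDR transfers twice per clock), and tRFC is programmed in cycles, so the same number takes less real time at a higher data rate. A rough Python sketch; the 560 starting tRFC is just an illustrative auto-trained value, not your board's actual number:

Code:
# First-word CAS latency in nanoseconds: 2000 * CL / data rate (MT/s).
def cas_ns(data_rate, cl):
    return 2000 * cl / data_rate

print(f"3200 CL16: {cas_ns(3200, 16):.2f} ns")   # ~10.0 ns
print(f"3800 CL16: {cas_ns(3800, 16):.2f} ns")   # ~8.4 ns, same CL but faster clock

# tRFC is set in cycles, so converting to time shows what lowering it buys you.
def trfc_ns(data_rate, trfc_cycles):
    return 2000 * trfc_cycles / data_rate

print(f"tRFC 560 @ 3800: {trfc_ns(3800, 560):.0f} ns")   # example auto-trained value
print(f"tRFC 280 @ 3800: {trfc_ns(3800, 280):.0f} ns")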
 

Attachments

My apologies, I wasn't able to respond earlier. Looks like you have some good help on overclocking the CPU/RAM/mesh. I missed that you said Warzone in your earlier post; I was generalizing in terms of CPU/GPU usage meters. If you heavily depend on a single-threaded game workload, then that thread would be sitting at 90+% steadily. It's not foolproof, but it's a good sign. In your case I don't see any thread staying in that range, probably because the game is fairly multi-threaded. That said, your GPU usage is very low, indicating a CPU bottleneck.
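If you want something more repeatable than eyeballing Task Manager, here's a rough Python sketch (needs psutil installed; just an illustration, not a proper profiler) that logs per-core usage while the game runs so you can see whether one thread is pinned:

Code:
# Log per-logical-core CPU usage once a second; a single core pinned near 100%
# while the others idle is a hint of a main-thread (single-thread) bottleneck.
# pip install psutil
import psutil

for _ in range(60):  # ~60 one-second samples; run while the game is loaded
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    print(f"busiest core {max(per_core):5.1f}%   average {sum(per_core) / len(per_core):5.1f}%")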

I am not familiar with the game in question, but I saw a review and I am not sure it makes sense to me; they are only using up to 2080 Ti GPUs, and they seem to be getting higher FPS on an older CPU setup (6850K). Anyway, overclocking will help you out, but I suppose temps and stability may be your next issue to tackle.
Call of Duty: Warzone PC Performance Review and Optimisation Guide | 1440p Performance | Software


If you are into tweaking and overclocking, X299 can be a good gaming platform (if you're using the older CPUs like the 78xx/79xx). You're already tackling the more straightforward things folks have suggested. If you're into it, you might also consider delidding your CPU and using liquid metal thermal paste. It requires a bit of work: getting a delid tool, cleaning the old thermal paste out (pigeon poop), putting a sealant over all the SMDs on the naked CPU, applying liquid metal, then remounting the IHS, also with liquid metal. There are many guides out there on how to do it, including videos from folks like GN.

You would probably benefit most from CPU core clocks, but as advised, mesh/RAM will help your minimums, etc. Anyway, good luck. In the end you can always get something newer (maybe a 10xxx?).
 
(I don't play Warzone.) But perhaps it just has bad Ampere scaling? I know that without going up to 4K, in some games I see very minimal gains from a 2080 Ti to a 3090; 1440p isn't really where Ampere flexes. Have you checked whether your PC is scaling well in other titles?
It's probably not a very accurate analogy, but I explain it to customers like this: Ampere is more of a high-torque diesel engine than an F1 engine; it goes the same speed no matter how hard you load it up. I play at 3440 x 1440, and there's barely any difference between low and high settings FPS-wise. Even going up to 4K on the LG CX 55 barely drops the FPS in most titles.

Then again, I might be talking out of my ass because I haven't used HEDT since X99 and the 5960X, and it could be your mesh latency as others here have suggested... but just a thought.
 
Oh, forgot to mention: you MUST use the High performance power plan in Windows or it kills your FPS. Lol, whoops. Yeah, I'm using my 3600 + Radeon VII rig right now and it runs 120 FPS on average (give or take), and the 1% low is around 97 FPS. I know for a fact my 7980XE is much faster than this 3600. I am not a huge Warzone guy, but I know a bit about it. You can also disable C-states in the BIOS; there is probably one other power-saving thing you can disable as well. Sorry, it's been over a year since I have looked at the BIOS in my 7980XE rig. But yeah, I'm really thinking you must have some power management issues going on here. I'm assuming you have your GPU configured to prefer maximum performance and have your pre-rendered frames set, etc.

Also disable any software you do not need; who knows what it will do to your performance. WZ has an in-game FPS counter and ms readouts for CPU and GPU; that's really all you need to see what's going on, at least for right now.
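If you'd rather script the power plan check than dig through Control Panel, something like this works on Windows (powercfg's SCHEME_MIN alias is the built-in High performance plan; take the snippet as a sketch and double-check the output on your machine):

Code:
# Print the active Windows power plan, and optionally switch to High performance.
import subprocess

out = subprocess.run(["powercfg", "/getactivescheme"], capture_output=True, text=True)
print(out.stdout.strip())  # should mention "High performance" once you're set correctly

# Uncomment to switch: SCHEME_MIN is the alias for the High performance plan.
# subprocess.run(["powercfg", "/setactive", "SCHEME_MIN"], check=True)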
 
Did some testing with my 7820X in Warzone. I would say our CPU settings are pretty close: 4.7 GHz, 3.0 mesh, 3800 CL 16-16-16. The average FPS was around 155, give or take, and the 1% low around 125 FPS. Your CPU has 2 more cores (and this game likes more cores), so you should be very close to this as well. And I know my 7980XE is faster, but that computer is at a different location right now.

The only thing I have different, that I know of, is that I use AMD GPUs: the 7820X has a Vega 64 Liquid and the 7980XE has a Radeon VII, as does my 3600. But I don't think the 3090 is the cause of the low-ish FPS you are getting; it must be a power setting of some sort, or maybe a software conflict. You did a DDU in safe mode before you installed the new GPU?
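For anyone wanting to reproduce the average / 1% low numbers: they come from a frametime log (CapFrameX, Afterburner, etc. can export one). Rough Python sketch using one common definition of "1% low" (FPS at the slowest 1% frametime); your logging tool's own math may differ slightly:

Code:
# Average FPS and "1% low" from a list of frametimes in milliseconds.
def fps_stats(frametimes_ms):
    avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
    worst = sorted(frametimes_ms)[int(len(frametimes_ms) * 0.99)]  # 99th-percentile frametime
    return avg_fps, 1000 / worst

# Made-up example data, not a real capture:
times = [6.5, 7.0, 6.8, 9.5, 7.2, 6.9, 12.0, 7.1, 6.7, 7.0] * 100
avg, low1 = fps_stats(times)
print(f"avg {avg:.0f} fps, 1% low {low1:.0f} fps")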
 
Discussion starter · #29 ·

Will try the tRFC steps, ta. Also, my CPU has already been delidded.
 
Discussion starter · #30 · (Edited)
High performance plan is set, yes, and I will disable C-states. Maximum performance is set and pre-rendered frames are set to 1; not sure about that one?
Yes, I used DDU in safe mode.
 
Discussion starter · #31 · (Edited)
What settings did you have set in-game? Were you running at 1080p or 1440p? Maybe it's your tighter RAM timings that are giving you more frames. I'm honestly tempted to sell up and go with an i9-11900K and some super-fast, low-latency RAM (B-die). I only needed the high core count CPU while I was studying, and I have finished now; 8 cores and 16 threads is all I need, and I'm pretty sure you can reach an OC of 5.5 GHz on the 11900K. I could also get a 5950X, which has high core clocks and more cores and threads than I have now :).
 
I would take the 5950X before any Intel chip; the Intel 11-series is very meh, IMO. Your current 10-core is a beast, something is just not configured correctly. Warzone likes the core counts: my 32-core Threadripper gets better frames at much lower clocks than my Intel CPUs have, and I would say the game makes use of up to 16 cores. But that's a lot of cash to spend on something that will mostly just give you less power draw from the wall; I would not expect better performance. That said, your current system is not working correctly, at least as far as I can tell. And yes, when I said low, I meant low or disabled on everything; 1080p or 720p made no difference. Looking at your RAM timings, while they are not quite as good, they are fine at 3800 MHz. Again, timings are mostly for your minimums.
 
OK, good. I think you might want pre-rendered frames at 3 in the driver. Also, you might need to tweak your config for Warzone; it has a "thread" count you can change, and it can have a huge effect on your in-game performance.
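If you want to script that tweak, the setting being talked about is (as far as I remember) RendererWorkerCount in the game's adv_options.ini; treat the path and the setting name below as assumptions and check them against your own install, and back the file up first. Rough sketch:

Code:
# Bump RendererWorkerCount in Warzone's adv_options.ini.
# PATH and SETTING NAME are assumptions based on the common MW/Warzone tweak;
# verify both on your install and back the file up before running this.
from pathlib import Path
import re

cfg = Path.home() / "Documents" / "Call of Duty Modern Warfare" / "players" / "adv_options.ini"
workers = 9  # often suggested as physical cores minus one; experiment

text = cfg.read_text()
cfg.write_text(re.sub(r"RendererWorkerCount\s*=\s*\d+",
                      f"RendererWorkerCount = {workers}", text))
print("RendererWorkerCount set to", workers)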
 
As for power, you have disabled all the power limits in the BIOS, yes?

Keep in mind that if you do remove the power limits, things can get very toasty on the VRM; these CPUs really do pull a ton of power.

If you run Prime95, does it stay at 4.6 GHz, or, since you're using AVX, does it stay at your AVX offset, or does it throttle? Use HWiNFO64 with the sensors-only box checked on loading, monitor temps, and check power draw. Run it for a few minutes, like 10~15, or until the temps stop rising, and see if all the clocks stay put the whole time. Don't let it cook the CPU at 110°C, though; if it's hitting over 105, we can use something else to test with, maybe just looping Cinebench or something.
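If you want a quick-and-dirty log of whether the clocks hold while Prime95 runs (HWiNFO64 is still the better tool, especially for temps, and psutil's frequency reading can be coarse on Windows), a sketch like this will do:

Code:
# Log the reported CPU clock and load once a second while Prime95 runs,
# so you can see whether it holds 4.6 GHz / the AVX offset or sags over time.
# pip install psutil   (frequency reporting on Windows can be coarse)
import psutil, time

for _ in range(600):  # roughly 10 minutes
    freq = psutil.cpu_freq().current
    load = psutil.cpu_percent()
    print(f"{time.strftime('%H:%M:%S')}  {freq:.0f} MHz  {load:.0f}% load")
    time.sleep(1)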
 
Discussion starter · #36 ·
I disabled everything shown in the attached screenshot, but I had to re-enable 'Turbo', as my overclock reverted to stock.
 

Attachments

I'm sorry if I missed it, but what kind of RAM IC are you running? I have played with all the major ones and pretty much know the safe numbers.
 
If I recall correctly, I had to set my X299 Apex to static vcore mode in order to achieve full performance, whereas on my ASRock board I do not have to do this. I can't recall what happens when I just run offset or the normal overclocking mode, but I do recall that I really hated the fact my system used an extra 100 W at idle.
 
Discussion starter · #40 ·
Corsair Dominator Platinum DDR4 32 GB (4 x 8 GB) 3200 MHz C16 XMP 2.0 Enthusiast Desktop Memory Kit with Dominator Airflow RGB LED Fan Kit (Corsair CMD32GX4M4B3200C16 )
 