Originally Posted by wingman99
The CPU will run Prime95 alongside the game without much performance loss because the game mostly uses integer calculations and Prime95 mostly uses floating-point calculations. Those are handled by separate parts of the processor.
I would not use processor utilization to gauge gaming performance; it is not accurate for that. Utilization is Windows' measure of how long the game's instructions take to process, so it also takes into account memory time, delays waiting on the video card, and bus transfer time. When you see 80% while gaming, that figure includes processor time, video card time, memory time, and bus transfers. The processor by itself is only around 40% busy and can do plenty more work at the same time. Most folks notice there is not much processor heat while gaming compared to other programs, because it's doing less work.
So another way to describe that 80% gaming utilization: the processor is idle a lot of the time, waiting for video card and memory instructions to be sent and received. Most background programs mainly need just the processor and run intermittently, in quick bursts maybe 2-5% of the time.
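If it helps to see what that percentage actually is, here's a minimal sketch (Python with the psutil package, purely for illustration; none of the monitoring tools mentioned in this thread necessarily work this way): utilization is just non-idle CPU time divided by the total CPU time available over some sampling window, averaged across every logical core.

```python
# Rough sketch of what a CPU utilization % means (assumes the psutil package).
# It is just "non-idle CPU time / total CPU time available" over an interval,
# averaged across all logical cores -- it says nothing about how much useful
# work those cores got done while they were counted as busy.
import time
import psutil

def utilization_sample(interval=1.0):
    t1 = psutil.cpu_times()          # cumulative system-wide CPU times
    time.sleep(interval)
    t2 = psutil.cpu_times()
    total = sum(t2) - sum(t1)        # all CPU time that elapsed, busy + idle
    idle = t2.idle - t1.idle
    return 100.0 * (total - idle) / total

print(f"overall: {utilization_sample():.1f}%")

# Per-core view: a game can be held back by one maxed-out core even while
# the overall average still looks moderate.
print("per core:", psutil.cpu_percent(interval=1.0, percpu=True))
```

The per-core line is the more interesting one for games, since most of them lean hard on one or two threads.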
The best way to see if the processor is holding you back in gaming is to overclock and see if the FPS increases, or underclock and see if the FPS decreases.
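A rough way to read the result of that test, as a sketch (the function and the example numbers below are made up for illustration, they're not from this thread or from any tool):

```python
# Sketch of interpreting an overclock/underclock FPS test:
# if FPS moves roughly in step with the clock, the CPU is the limit;
# if FPS barely moves, the limit is elsewhere (GPU, engine cap, vsync...).
def clock_scaling(clock_low_ghz, clock_high_ghz, fps_low, fps_high):
    clock_gain = clock_high_ghz / clock_low_ghz - 1.0
    fps_gain = fps_high / fps_low - 1.0
    return fps_gain / clock_gain      # ~1.0 = CPU-bound, ~0.0 = not

# Hypothetical numbers, purely as examples:
print(clock_scaling(4.0, 4.5, 100, 112))  # ~0.96 -> mostly CPU-limited
print(clock_scaling(4.0, 4.5, 100, 101))  # ~0.08 -> something else is the limit
```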
Okay, this makes sense. Thank you.
I actually said that when I'm gaming, CPU temps are LOWER than when I'm running Prime95. It's just that near-100% CPU usage is worrying me; like I said, I didn't have those "issues" on the FX8300. Yes, the i5 6600K is A LOT better than the FX8300, it's just, you know... old habits.
Also, if the CPU is idling and waiting for the GPU and RAM... shouldn't CPU usage then drop? For example, I play Apex Legends (kind of like Fortnite, if anyone hasn't played it) and my CPU is almost always near 100% usage WITH a GTX 950... So if I upgrade to, let's say, a GTX 1080, won't there be A LOT more graphics work FOR the CPU to process?
Now I know this is not a TRUE comparison, but on the FX8300 I was playing Apex at, let's say, 40% CPU, so in my head I saw plenty of room before being bottlenecked by the CPU, but here... man, it's cutting it close (based on the numbers). As for the OC, I did overclock from 3.5 GHz to 4.8 GHz and there was a huge increase in FPS in CS:GO; other games are mostly locked at 60 fps with vsync.
I tried testing Warframe with and without the OC, and with FPS unlimited (vsync off). I put up 4 pictures with all the info from HWiNFO, if you take a look:
3.9 GHz default clock, 60 fps vsync: Task Manager = 47% CPU; HWiNFO = 38% CPU; XTU = 41% CPU (60 fps)
3.9 GHz default clock, unlimited FPS: Task Manager = 74% CPU; HWiNFO = 62% CPU; XTU = 68% CPU (68 fps)
4.8 GHz overclock, 60 fps vsync: Task Manager = 57% CPU; HWiNFO = 35% CPU; XTU = 38% CPU (60 fps)
4.8 GHz overclock, unlimited FPS: Task Manager = 89% CPU; HWiNFO = 61% CPU; XTU = 58% CPU (68 fps)
Which one is accurate? If I overclock, I get the SAME fps but higher CPU usage (at least according to Task Manager); if I leave it at the default clock, I get lower CPU usage and the SAME fps...
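Not an answer to which tool is right, but one reason monitors rarely agree exactly: each one averages over its own polling window, and they don't necessarily read the same Windows counter either. A tiny sketch of how much the window alone matters, again just Python with psutil for illustration:

```python
# Same machine, same moment -- different averaging windows give different
# numbers (assumes the psutil package; this is only an illustration, not
# what Task Manager, HWiNFO or XTU actually do internally).
import psutil

print(psutil.cpu_percent(interval=0.1))  # short window: jumpy, catches bursts
print(psutil.cpu_percent(interval=2.0))  # long window: smoother, lower peaks
```

For what it's worth, plugging the Warframe numbers above into the earlier clock_scaling sketch, clock_scaling(3.9, 4.8, 68, 68), gives 0.0, which at least for that idle open-world scene fits wingman99's point that the CPU isn't the limit there.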
This is in an open-world area but SOLO, no other players; the map is HUGE and I'm just standing there "idling", which is why all 4 pictures are basically the same apart from the different CPU and vsync settings. Also, the pictures are a bit large because I'm using a dual-monitor setup and the info is right there (it's not just for this test, I actually use dual monitors all the time).