
Intel Core i3 vs Core i5 vs Core i7: Gaming Performance with GeForce GTX 970

Test System

[CPU] Intel Core i7 4790K
[GPU] Gigabyte GTX 970 mini-ITX
[MOB] Gigabyte Z97-D3H (F7 BIOS)
[RAM] G.Skill Trident X 4GB x 2 DDR3 @ 2400 MHz 10-12-12-31 2T
[HSF] Noctua NH-U12S

Windows 7 64-bit with Service Pack 1
GeForce 355.82



CPUs Tested

The Core i5 and Core i3 configurations were all simulated using the Core i7; Hyper-Threading and the active core count were configured in the BIOS. Turbo was disabled for all CPU configurations.

Core i5 - Core i7 with Hyper-Threading disabled
Core i3 - Core i7 with Hyper-Threading enabled but 2 cores disabled

Take note that the Core i3 4160K is just a theoretical unlocked Core i3. I included it due to the news that there will be a Core i3 6320 at 3.9 GHz. Since that is a Skylake CPU, I bumped the speed of the Core i3 4160K to 4.2 GHz to take into account Skylake's performance advantage over Haswell in a clock-for-clock comparison. The Core i5 4590 has a base clock speed of 3.3 GHz, but it boosts up to 3.5 GHz when all 4 cores are loaded. (A quick way to sanity-check each configuration is sketched after the list below.)

Core i7 4790K @ 4.4 GHz
4 cores / 8 threads

Core i5 4690K @ 4.4 GHz
4 cores / 4 threads

Core i5 4590 @ 3.5 GHz
4 cores / 4 threads

Core i3 4160K @ 4.2 GHz
2 cores / 4 threads
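
As a sanity check before benchmarking, the active core and thread counts can be verified from within the OS. Here is a minimal sketch in Python; it assumes the third-party psutil package, which is not part of the original test setup:

    # Verify that the BIOS core / Hyper-Threading settings took effect.
    # Requires the third-party psutil package (pip install psutil).
    import psutil

    cores = psutil.cpu_count(logical=False)   # active physical cores
    threads = psutil.cpu_count(logical=True)  # hardware threads
    print(f"{cores} cores / {threads} threads")

    # Expected: "4 cores / 8 threads" for the Core i7 config,
    # "4 cores / 4 threads" for the Core i5 configs, and
    # "2 cores / 4 threads" for the simulated Core i3.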



Test Methodology

Gaming performance was measured using FRAPS to record the frame rate and frame time. For those who are new to frame times: frame time is the time taken to render a single frame, measured in milliseconds (ms); the lower the frame time, the better. The 99th percentile frame time is a measure of the overall "smoothness" of game play. For example, a 99th percentile frame time of 20 ms means that 99% of the frames were rendered within 20 ms. A more detailed explanation of frame time benchmarking can be found here: http://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking
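
For illustration, here is a minimal sketch of how the average FPS and the 99th percentile frame time can be computed from a FRAPS frametimes log, assuming a two-column CSV (frame number, cumulative elapsed time in ms) with one header row; the filename is hypothetical:

    # Compute average FPS and 99th percentile frame time from a FRAPS
    # "frametimes" CSV (columns: frame number, cumulative time in ms).
    import numpy as np

    def frame_stats(path):
        timestamps = np.loadtxt(path, delimiter=",", skiprows=1, usecols=1)
        frame_times = np.diff(timestamps)      # per-frame render time in ms
        avg_fps = 1000.0 / frame_times.mean()  # average frame rate
        p99 = np.percentile(frame_times, 99)   # 99% of frames finish within this
        return avg_fps, p99

    avg_fps, p99 = frame_stats("crysis3 frametimes.csv")  # hypothetical filename
    print(f"Average FPS: {avg_fps:.1f} | 99th percentile: {p99:.2f} ms")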

For those who want to replicate my gaming benchmarks, check the YouTube links below to see which part of each game I benchmarked. The settings used for each game are also indicated below. All games were updated to the latest version as of September 9, 2015.

The Witcher 3 https://www.youtube.com/watch?v=w_a-r8HMeqU
Resolution: 1920 x 1080
HairWorks: Off
Ambient Occlusion: SSAO
Every graphics option was set to the highest setting available except for Shadow Quality and Foliage Visibility Range, which were both set to High
The following graphics options were disabled: Blur, Motion Blur, and Chromatic Aberration

Grand Theft Auto V https://www.youtube.com/watch?v=7WTKhecDesA
Resolution: 1920 x 1080
FXAA: On
MSAA: 2x
Every graphics option was set to the highest setting available except for Grass Quality, which was set to High

Crysis 3 https://www.youtube.com/watch?v=aX1S4aSJ3lU
Resolution: 1920 x 1080
Texture Resolution: Very High
Anti-aliasing: SMAA T2X
System Spec: Very High



Performance Results

The results below are the average of 3 runs per game.
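
As an aside, here is how the per-run numbers could be averaged, reusing the frame_stats helper sketched in the methodology section above (the run filenames are hypothetical):

    # Average the stats over 3 benchmark runs of the same game.
    import numpy as np

    runs = [frame_stats(f"witcher3 frametimes run{i}.csv") for i in (1, 2, 3)]
    print(f"Avg FPS: {np.mean([fps for fps, _ in runs]):.1f}")
    print(f"Avg 99th percentile: {np.mean([p99 for _, p99 in runs]):.2f} ms")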

[Chart: RjrggfF.png]

[Chart: RTNiFn9.png]



Conclusion

Looking only at the average and minimum frame rates, it seems that the Core i5 and Core i7 do not offer much improvement over the Core i3. But as the 99th percentile frame time results show, there is a clear and sizable performance gap between the Core i3 and the Core i5. Despite the Core i3's 700 MHz clock speed advantage, even the Core i5 4590 offers a better gaming experience. So it's not true that a highly clocked Core i3 will be a better gaming CPU than any Core i5; a quad-core CPU will always be the better gaming option over a dual-core CPU with Hyper-Threading.

Overclocking also helps reduce the CPU bottleneck, which is very evident in Crysis 3. Though not obvious if you just look at the frame rate, Hyper-Threading gives the Core i7 a big advantage over the Core i5 when it comes to gaming experience. Game play is much "smoother" with the Core i7, as shown by the 99th percentile frame times in Crysis 3 and The Witcher 3. If you are building a gaming PC with a high-end GPU, don't hesitate to get a Core i7.

Some may criticize my selection of games, especially Crysis 3. Not all games have a CPU-intensive part. But if you are building a high-end gaming PC, it's better to have a CPU that is prepared for the worst-case scenario than to skimp on the CPU now and realize later that you need an upgrade. I believe upcoming games will be more demanding. Even if DirectX 12 delivers on its promised performance improvements in CPU-bottlenecked situations, a Core i5 or Core i7 will still offer more performance than a Core i3 if game developers really take advantage of the CPU's potential.



== == == ==

Is CPU simulation a valid approach?

Some may question the validity of simulating CPUs. Here is a video by Linus Tech Tips comparing the performance of a real CPU and a simulated CPU: https://www.youtube.com/watch?v=0PhlwNfIUUE. A Core i7 5960X was used to simulate a Core i5 4670K and a Pentium G3258 by configuring the core count, clock speed, and Hyper-Threading in the BIOS. The simulated CPUs were then compared to a real Core i5 4670K and a real Pentium G3258.

The difference between a simulated CPU and a real CPU is the L3 cache size, since the L3 cache does not shrink even when CPU cores are disabled:

Simulated Core i5 4670K - 20 MB L3 cache
Real Core i5 4670K - 6 MB L3 cache
Simulated Pentium G3258 - 20 MB L3 cache
Real Pentium G3258 - 3 MB L3 cache

Based on Linus Tech Tips' results, the simulated Core i5 4670K and the real Core i5 4670K have almost the same performance. However, when it comes to the Pentium G3258, the simulated one was about 10% faster than the real one. This is probably caused by the huge difference in L3 cache size, 20 MB vs 3 MB.


For my tests, here are the L3 cache comparisons between the simulated and real CPUs:

Simulated Core i5 4690K - 8 MB L3 cache
Real Core i5 4690K - 6 MB L3 cache
Simulated Core i3 4160K - 8 MB L3 cache
Real Core i3 4160 - 3 MB L3 cache

While I don't have a real Core i3 4160, I think the simulated one would be about 5 to 10% faster than the real one. This doesn't really invalidate my conclusion about the Core i3 4160 or any Skylake Core i3. Why? Because if the simulated Core i3, which is faster than a real one, is a bottleneck to a GTX 970, then it follows that a real Core i3 would also be a bottleneck to a GTX 970.

By the way, here is an L3 cache comparison test done by DG Lee: http://www.iyd.kr/695

Comments (7)

Something to keep in mind. The theoretical unlocked i3 won't be quite that fast, and the actual 4690K will be slightly slower. The 4790K has 8MiB of L3 cache; even with cores disabled, I believe it stays at that. On the other hand, an i3 has only 4MiB of L3 and an i5 only has 6MiB. It's not a huge difference, but it's noticeable. Linus did a comparison of a 5960X and a fake Pentium G3258 (i.e. a 5960X using two cores with no hyperthreading) found here. In a follow-up video, he acknowledged the screw-up.

Granted, that's a pretty extreme example. It's a difference of 17MiB and a 567% increase, vs your 4MiB and 100% increase or 2MiB and 33% increase.
@ CynicalUnicorn

Thanks for sharing that. Yes, the cache affects performance. There's also a gaming benchmark made by DG Lee which shows that going from 2MB L3 cache to 8MB L3 cache boosts performance by ~10%
http://wccftech.com/intel-amd-l3-cache-gaming-benchmarks/

But that just strengthens my stance on the Core i3: it really is a bottleneck for a GTX 970 or faster GPU.
Nicely done. It really shows that, all other things [i.e. components/OC'ing] being equal, the difference between the i7 4790K and the i5 4690K is negligible in regard to gaming. A lot of people seem to want to argue this point. When you are talking about a ~1-5 FPS difference, this is well within any margin of error.
@ hapkiman

In Crysis 3 and in The Witcher 3, the Core i7 4790K has a far better 99th percentile frame time than the Core i5 4690K.
I wouldn't consider 13.5 ms and 7 ms to be far better in the 99th percentile for the i7. If it was twice as fast or more, then I would say it's a little faster. You've done a good job here, but I think you need to remember we are talking about ms timing, which is very hard to tell in real-world scenarios; the FPS is really the big difference you can see with your eyes.
adogg23 I think you have it backwards, as the slight changes in ms are very perceptible to the eye. It's what we notice as "stutter" or "jerkiness", which doesn't register in the frame rate. Both metrics are noticeable if you have a clue what you are looking at.
Sorry for necroing, but you had absolutely irrelevant games for this test: Witcher 3? A console game lmao.

Test some 1- or 2-core games; that's what matters. Vanilla WoW, StarCraft 2, CS:GO, etc. Of course an i3 will lag in console games.