Join Date: Aug 2011
Location: Within the Milky Way
Overclocking CPUs isn’t necessary anymore.
So, here’s my argument. Processors are getting smarter about thermal and power management, and about managing their own clockspeeds. That, paired with a good fabrication process, lets them ramp clocks far more aggressively out of the box than before.
Multi-threaded workloads like rendering and encoding draw a lot of power when overclocked, on both Intel and AMD, and for critical workloads like that, why risk crashing hours of progress for a few seconds of saved time? Core counts are getting really high now, and keeping all of those cores running at the peak of their capabilities isn’t really practical from a power/thermal, cost, or noise perspective.
Single-threaded applications do benefit more from overclocking, since the core clock is the bottleneck, but the turbo functions on today’s processors already push clockspeeds close to the threshold of what the silicon is capable of out of the box. Besides, lightly threaded applications are typically games that already spit out a pretty high framerate, and there comes a point where the extra frames don’t do anything for you at all. Some game engines hit an FPS wall, or their physics break, if the framerate gets too high. A marginally unstable overclock can even introduce stuttering.
eSports games aren’t that demanding, so the case for overclocking in titles like that is even weaker unless you’re running 240Hz+ monitors, and even then, things like skill, mouse sensor, ergonomics, and keybindings/macros will matter more. We’re not at the point where you can buy enough performance through components to beat pros using regular hardware.
Monitor resolution and response times also limit how much a CPU overclock can do for you. As resolution increases, the bottleneck shifts to the graphics card. Some panels can’t even fully utilize the refresh rates they’re rated for because their response times are too slow; I’m referring to VA panels whose dark transitions are as slow as a 60Hz panel’s, or worse. High-end 165Hz IPS gaming monitors do refresh fast enough to keep up, but actually hitting 165fps at 1440p or on an ultrawide only happens in undemanding games, which raises the question of whether overclocking your processor is even necessary for games like that.
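To put rough numbers on the response-time point: a panel can only fully resolve each frame if its pixels finish transitioning within one refresh cycle. A quick sketch of that budget check (the 15ms and 4ms response times below are illustrative assumptions, not measurements of any particular panel):

```python
def frame_time_ms(refresh_hz: float) -> float:
    """Time budget per refresh cycle, in milliseconds."""
    return 1000.0 / refresh_hz

def panel_keeps_up(refresh_hz: float, response_time_ms: float) -> bool:
    """True if the pixel transition completes within one refresh cycle."""
    return response_time_ms <= frame_time_ms(refresh_hz)

# 165Hz leaves ~6.06ms per frame. A slow VA dark transition (~15ms,
# assumed) blows that budget; a fast IPS transition (~4ms, assumed) fits.
print(round(frame_time_ms(165), 2))  # 6.06
print(panel_keeps_up(165, 15.0))     # False
print(panel_keeps_up(165, 4.0))      # True
```

So even before the CPU enters the picture, a slow panel caps what extra frames can show you.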
Back in the day, you got noticeable improvements from overclocking because base clockspeeds were really low and turbo boost didn’t ramp as close to the processor’s threshold as it does now. 50% overclocks were achievable, and the performance jump was dramatic. Case in point: my 1680 V2 has a base clock of 3GHz and a single-core turbo of 3.9GHz. It’s running at 4.5GHz all-core now, and some of my applications wouldn’t run well at stock clocks. Granted, this is an extreme case, since Xeon chips back then were clocked pretty low, but even the first-gen i7s could reach over 4GHz from a ~3GHz base clock, which is a sizable jump. Doing the same on today’s processors would require sub-ambient cooling, which goes to show how far the fabrication process has come in the past 5-10 years.
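The headroom arithmetic above is easy to sketch. Here are the percentage gains for the two examples mentioned, using the clock figures from the post (the function name is my own):

```python
def overclock_gain_pct(stock_ghz: float, oc_ghz: float) -> float:
    """Percent increase of the overclocked frequency over stock."""
    return (oc_ghz - stock_ghz) / stock_ghz * 100.0

# Xeon 1680 V2: 3.0GHz base -> 4.5GHz all-core
print(round(overclock_gain_pct(3.0, 4.5)))  # 50

# First-gen i7: ~3GHz base clock pushed past 4GHz
print(round(overclock_gain_pct(3.0, 4.0)))  # 33
```

Comparing against base clock flatters the numbers, of course; measured against single-core turbo, the gain is smaller, which is exactly the point about modern turbo eating the headroom.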
Benchmarking, on the other hand, is a solid reason to overclock your processor, as is overclocking as a hobby.
So, what do you think? Has overclocking your new-ish processor (let’s say Coffee Lake and Ryzen+ onward) allowed you to do or achieve things you couldn’t without it (benchmarking being the obvious exception here)?
1680 V2 4.5GHz 1.33v | G.Skill Trident X 4X4GB 2400MHz | R4BE | Titan Xp + Morpheus II
Wasabi Mango UHD550 | Rosewill Rise | Thermalright LGM RT
Logitech G600 | G440 | Tt Meka G-Unit Cherry MX Black+Double O-Rings
JBL LSR308s | Temblor T10 | Custom headphones | TEAC-UD501+Passive Preamp | Emotiva BasX A-100
OCZ Vertex 4 256GB | 51TB DAS | SS Prime Ultra Titanium 1KW