Originally Posted by Blameless
Those that are seriously looking for 120 fps + as a minimum
frame rate are absolutely a niche market.
If I were looking for such minimum frame rates, I'd be running the absolute fastest 6700k setup I could manage (I'd be shooting for 4.6-4.8GHz and be using the fastest DDR4 I could afford to bin), because I know what I already have isn't quite sufficient.
It doesn't need to be a 120Hz comparison specifically, because most benchmarks published don't feature any sort of frame rate cap.
It's pretty much a given that the CPU that pushes out the most frames in low resolution/CPU bound scenarios is the one that is going to do best for those looking to run high refresh rates.
Back when I was still on CRTs I usually gamed at 120-170Hz. I had a few games that I could run at extreme frame rates, but to be honest, I've always been pretty content with ~60fps, even in those situations where I can perceive a difference between 60 and much higher frame rates. The high refresh rate was simply to avoid flickering, which isn't a problem on LCDs...so I've mostly used 60Hz LCDs since ~2006 when I dumped my last high-end Trinitron.
If you aren't using some form of syncing you will always be experiencing some degree of tearing, and it's generally most noticeable at low frame rates or those close to your refresh rate. Vastly higher frame rates aren't so bad, because the frames aren't likely to be different enough from each other to be really distracting.
I generally don't run any sort of vsync or frame rate cap; tearing doesn't bother me as much as the added input lag and potential hitching that using vsync often entails...though such latency decreases as frame rate increases, so if I did have a 120Hz+ display and a setup that could consistently push 120fps in the games I was playing, I'd probably use vsync in most titles.
I'll be honest. I'm too tired to quote as nicely as you, so I'll just go down the points.
Sure, 120FPS as a minimum is niche, yes. But even 60FPS feels "laggy" on a 120Hz monitor. That's the entire reason I commented: that it's an "academic" distinction whether it's more than 60FPS.
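To put rough numbers on that "laggy" feeling (this is just my own back-of-the-envelope arithmetic, not anything from a review): at 60FPS each frame takes ~16.7ms and, with ideal pacing, sits on a 120Hz panel for two full refresh cycles.

```python
# Rough frame-time arithmetic behind "60FPS feels laggy on a 120Hz monitor".
def frame_time_ms(fps):
    """Time each rendered frame represents, in milliseconds."""
    return 1000.0 / fps

def refreshes_per_frame(fps, refresh_hz):
    """How many panel refreshes each frame covers, assuming ideal pacing."""
    return refresh_hz / fps

print(frame_time_ms(60))              # ~16.7 ms per frame
print(frame_time_ms(120))             # ~8.3 ms per frame
print(refreshes_per_frame(60, 120))   # each frame is shown for 2 refreshes
```

Halving the frame time to ~8.3ms is exactly the step change people report noticing when they move from 60 to 120.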
Sure. I didn't mention anything about your setup (or wait, did I? I didn't mean to); I don't know your needs or preferences for games.
True, they usually don't cap the frames. But here's the issue: many benchmarks are done in GPU-bound situations. That may be intentional, but I'm surprised not even one review site caters to the 120/144Hz crowd, which is almost always a CPU-bound situation. That's all I'm saying: sure, let 95% of review sites show how this $450 GPU is faster than that $250 GPU. I'm cool with that. But I'd like more reputable testing, even just 5%, of CPU-bound situations.
These GPU-bound situations are exactly where that "RAM doesn't matter" mantra began. It was largely true at 60Hz, but not at 120Hz.
Err, see, that's what I'm talking about: "It's pretty much a given that the CPU that pushes out the most frames in low resolution/CPU bound scenarios is the one that is going to do best for those looking to run high refresh rates." Two issues: first, I don't think many PC gamers realize when they have a CPU bottleneck versus a GPU bottleneck. Second, CPU bottlenecks are becoming more common as GPUs target 4K while most people are still on lower-resolution monitors.
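For anyone reading along who isn't sure which bottleneck they have, the usual rule of thumb is: drop the resolution way down and see if FPS moves. A tiny sketch of that heuristic (the function name and 10% tolerance are my own assumptions, not any review site's methodology):

```python
def likely_bottleneck(fps_native, fps_low_res, tolerance=0.10):
    """Rough heuristic: run the same scene at native resolution and at a
    much lower one. If FPS barely changes, lightening the GPU load didn't
    help, so the CPU was the limiter all along."""
    if fps_low_res <= fps_native * (1 + tolerance):
        return "cpu"   # FPS didn't scale with resolution -> CPU-bound
    return "gpu"       # FPS scaled with resolution -> GPU-bound

# e.g. 90FPS at 1080p and 92FPS at 720p: the GPU wasn't the problem.
print(likely_bottleneck(90, 92))   # cpu
# e.g. 60FPS at 1440p jumping to 110FPS at 720p: GPU-bound at native res.
print(likely_bottleneck(60, 110))  # gpu
```

That first case is exactly the 144Hz-monitor scenario below: a faster GPU won't get you past 90FPS if the CPU is the wall.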
I'm not saying you don't know this. You get it, maybe because of your previous experience with CRTs, which was the last time people could actually play around with refresh rates. I'm talking about the general "PC hardware buying populace", where people upgrade to GTX 1080s (on "awesomely fast i5-6600Ks") with 144Hz 1080p monitors and wonder why they still only get 90FPS.
Sure, we could tell them, "upgrade to a 1440p monitor", but I think that presumes they prefer resolution over motion clarity.
Sure, but I think you're possibly in a minority if you've actually experienced both 60Hz and 120Hz yet prefer 60Hz. See this article:
It became clear pretty fast during the event that most people were able to see the difference between 60 Hz and 120 Hz monitors. In the course of the afternoon there were enough participants that came to the conclusion on their own that it was high time to invest in a 120 Hz display. More than a few only needed a few seconds to say whether they preferred this or the previous monitor.
In the end, 43 out of the 50 participants (86%) indicated in the questionnaire that they preferred gaming on a 120 Hz monitor. Several of the seven gamers that said otherwise used the justification that they had achieved a higher kill-ratio on the 60 Hz screen. The people who did prefer 120 Hz, tended to call the experience smoother and more fluid. Many also noticed fewer instances of tearing, described in a number of different ways.
Nothing wrong with you preferring 60Hz, though. I trust you came to that conclusion from your own experience, as you've actually used 120Hz.
Right, tearing is less common at high frame rates, agreed. I run a frame-rate cap, though, even at 120FPS @ 120Hz, because, IIRC, V-Sync limits a few other things (like mouse polling rate? I feel like I read that somewhere) to 120Hz.