Overclock.net › Articles › The Truth About Fps

The Truth About FPS

2nd Revision Purpose:
Now that I am being quoted around the internet and this article is approaching 90,000 views, I find some of my typos and grammar appalling. I have gone back, edited some content, and fixed some glaring mistakes. I took feedback from here and Reddit and tried to address the points raised by certain more knowledgeable readers, who were definitely correct. Some ideas, especially in the math, implied concepts that readers weren't picking up on, so I have added addenda where I thought they would best illustrate the implications.

I have also been told this article is "long winded", so I have added [TL;DR] sections; just CTRL + F for TL;DR and it will navigate you through the article much faster. Each section is now broken into sub-sections with spoilers. If something piques your interest, you can read that full section, or just settle for the TL;DR.


I was inspired to write this article by outright lying from certain developers recently. There really isn't a whole lot of material out there about FPS, and I found that odd. Why do we care so much about 60 FPS? Personally, more has always been better, but there haven't been very good ways to illustrate this to other people besides showing them the difference. So I sought ways to describe this issue visually and mathematically. These ideas will be tweaked over time, and I appreciate criticism and feedback. We all need a little more education about these frame rate numbers: why more is usually better, why less can be bad, and, most importantly, how to cut through the FUD.

I am a nobody, so don't take anything you read here as gospel, but I have done research both online and offline to write portions of this article.
Unequivocal Proof You (a Human) Can See Beyond 30 FPS (Click to show)
I took a page from Bill Nye: sometimes you just need to refute a complex logical inconsistency with a simple, easy to understand, yet highly factual counter-example.

Can you see faster than 30FPS?
Yes.

Don't believe you can see beyond 30 FPS?
One word: Lightning. Average duration: 30 ms (some shorter, some longer). 30 FPS equates to a frame time of 33.333ms. You have seen lightning before, yes?
Visual Demonstrations of 30 vs. 60 FPS (Click to show)
Visual 30/60 FPS
First of all, here is a cool visual demonstration (as long as you have at least a 60 Hz monitor) that illustrates one way 30 FPS and 60 FPS visually differ to a user in real time. If you can't tell the difference in the example below, you are lying to yourself.
http://www.30vs60fps.com/

Here is another that lets you do a variety of FPS.
http://frames-per-second.appspot.com/


This image is from the above website.


Notice the vast amount of visual detail missing due to motion blur. The object is not only blurred but distended; its physical size is wrong. There is a loss of visual fidelity simply from changing the frame rate from 60 FPS to 30. A BBC white paper that discusses this effect is linked at the end of this article.
History of Frame Rate In Film - Part One: 24 FPS (Click to show)
Where does the magic 24 FPS in film come from?
The actual number 24 (FPS) came from averaging 22 and 26 FPS, two film frame rates that were common in movie theatres of the silent film era. Earlier silent films ran as low as 16 FPS. Silent films generally had no set frame rate because it varied during recording with the hand crank. When sound was introduced in 1926, variations in frame rate altered the audio frequencies, and people found that unacceptable; humans, apparently, are more sensitive to frequency changes in audio than to frame rate variations. So the industry came together, and a static 24 FPS was agreed upon as the standard for film, to maintain a steady video and audio experience. Believe it or not, both 24 FPS and 48 FPS were already possible in film at the time. In 1926! The reason they went with 24 FPS over 48 FPS (using the two-shutter trick) is simple: it was cheaper. People also seemed about as comfortable viewing 24 FPS as 48 FPS, so they made a business decision to save money, despite Thomas Edison recommending higher.

Edison recommended 48 FPS as more enjoyable primarily because it had better fluidity of motion and reduced eye strain. Watching something at an unnatural frame rate, too high or too low, forces us to focus harder. This induces eye strain, eye fatigue, facial muscle fatigue, and general discomfort. We simply "can't get into" whatever is on the screen. Now think about watching this screen. You probably have a nicely focused concentration on your monitor with a relaxed forehead and jaw. Your setup probably allows for this "extended visual concentration". You probably just now realized you have been staring at your monitor for the last three sentences and are now conscious that you haven't blinked in a while. You're welcome. It truly is easier for you to relax at a higher frame rate (as long as it's matched by a corresponding refresh rate, of course.)

There is another "special" feature of 24 FPS. Through surveys conducted at the time, 24 FPS was determined to be the absolute minimum for a human being to perceive film motion as seamless.

As technology progressed, 48 FPS became cheaper and even 72 FPS became feasible; however, they never took a huge foothold because 24 FPS is simply what society was used to. People are resistant to change, and there is an "adaptation" period to get used to a higher frame rate of film. A lot of people attribute higher frame rate film to the Soap Opera effect: soaps generally run at a higher 60i (30 FPS) on cheaper cameras than the standard, more expensive 24p (24 FPS) cameras used in movies. Certain modern-day directors, Cameron and Jackson, have experimented with 60 FPS and 48 FPS respectively (and should be commended for it, too). However, those films weren't fully well received. The truth, though, is that we are not biologically programmed to find 24 FPS more enjoyable. Disagreement over the enjoyment of these newer frame rates can be attributed partly to viewers' preferences, yes, but mostly to the fact that all of us have been conditioned for decades to a slower speed of film. This has made "changing" or "updating" cinematography to a higher standard nigh impossible. The film community was very split on the Hobbit playing at 48 FPS, and that is fine, but a lot of people thoroughly enjoyed it. So why not allow an option for consumers?

[TL;DR] 24 FPS in movies comes from about 1926, the average of two other numbers (22 and 26) required to sync with audio. Humans are no more biologically programmed to find 24 FPS more enjoyable than 48 FPS. After 90+ years though, we are used to it. 24 FPS is also about the bare-minimum frame rate a human needs for motion to appear seamless.

History of Frame Rate In Film - Part Two: 30 FPS, 60 FPS, 1080P (Click to show)
Where does 30 FPS come from?
30 FPS (i.e. 60i) in TV broadcasting was mandated by the FCC in 1941. The 30 FPS in TV is actually just shy of 30: 29.97 FPS. That slight reduction came with the NTSC color standard in 1953, chosen so the chroma (color) carrier would not interfere with the sound carrier. The i in 60i stands for our friend, interlaced. Interlaced and progressive are now common buzzwords thanks to the HDTV industry, but these words should never have been attached to resolution; they really describe the fields per frame. When somebody states that something is broadcast in 1080P, what they are actually stating is that the resolution is 1920x1080 @ 60 FPS without any over-scanning, under-scanning, or reinterpreting of the signal to a lower resolution. The standard was later modified to accommodate 24 FPS (cinema) and 50 FPS (Europe's standards). We almost set the industry standard to 60 FPS!

Where does 60 FPS come from?
The ATSC set the first US standards for 1080P: 1080P @ 24 FPS, 1080P @ 25 FPS, 1080P @ 30 FPS, and 1080P @ 50 FPS. In 2008, they added 50P and 60P as the high standard for 1080P. The entire TV industry was pushing hard for a 1080 @ 60 FPS standard, and, as per usual, consumer hardware was designed for backwards compatibility first. This means the many claims that video games can be 720P and 30 FPS because the TV/film industries aren't moving forward are hogwash. The mega-billion dollar TV industry has itself been pushing higher resolutions and frame rates since 2008. Nobody is even supposed to advertise "Broadcast in 1080P" without displaying 1920x1080, 60P (60 FPS.)
Video Game Frame Rate - Part One: 60 FPS (Click to show)
60 Hz? 60 FPS!
60 FPS basically came about because of flat panel monitors and TVs, almost all of which operate at 60 Hz. Gaming-wise, there is no scientific or industry standard that says games should be 60 FPS, other than it being the most common and "cheapest" refresh rate on flat screen monitors/TVs. A stable, natural 60 FPS would be considered the best experience on a 60 Hz monitor.

It's funny to me that the origins of 60 FPS are so obscure. That is partially because there is no overlord of video gaming that mandates industry standards. We have some standards for film, art, software, and APIs, but not one single standard for video games or video game engines. I think there should be one with plenty of clout, so we could prevent console ports to PC from sucking, establish graphical quality minimums, set minimum frame rates, etc. I have always thought about getting the veterans of gaming into an Elders of Gaming Council: if they spoke, people would listen. My nomination for the council would be the Carmack, of course; god knows GabeN doesn't need another HL3 detour.
Video Game Frame Rate - Part Two: 30 FPS (Click to show)
30 is half of 60, and I can count to Potato!
30 FPS in video games, quite literally, has nothing to do with TV standards. It is simply half of 60 FPS. This is a very important distinction to make because TV frame rate gets lumped into the argument all the time.

It is far less taxing on developers and hardware to use 30 FPS. Add the benefits of being able to use VSYNC and double buffering, and 30 FPS has always been modern game developers' "fall back" frame rate to address performance issues, hardware limitations, bad tearing, and frame rate fluctuations. Not to mention, for any physics or animations that are directly tied to a game engine designed around a static number such as 60 FPS, the frame rate simply has to have a multiplier applied to it in calculations.

For example:
CURRENT_FPS * MULTIPLIER = RAG DOLL PHYSICS (horribly gross unrealistic equation, but it represents the idea well: your frame rate unchecked would break physics)
60 * 1 = 60
30 * 2 = 60

The rag doll physics are kept in check by a simple multiplier. When an engine was absolutely planned to always run at 60 FPS, funny things happen when you work around it, à la Skyrim physics.
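The same idea is usually expressed as frame-rate-independent physics: instead of a hand-picked multiplier per frame rate, the engine scales each update by the frame time ("delta time"). Here is a minimal Python sketch (my own illustration, not code from any actual engine) showing why that scaling keeps the simulation consistent:

```python
# Sketch of frame-rate-independent movement: each update is scaled
# by the time one frame takes, so the result matches at any FPS.

def integrate_position(position, velocity, fps):
    """Advance position by one rendered frame, scaled by frame time."""
    dt = 1.0 / fps  # seconds per frame: 1/60 at 60 FPS, 1/30 at 30 FPS
    return position + velocity * dt

# One second of simulation at two different frame rates:
pos_60 = 0.0
for _ in range(60):
    pos_60 = integrate_position(pos_60, velocity=10.0, fps=60)

pos_30 = 0.0
for _ in range(30):
    pos_30 = integrate_position(pos_30, velocity=10.0, fps=30)

print(pos_60, pos_30)  # both end up ~10.0 after one simulated second
```

Without the `dt` scaling, the 60 FPS player would move twice as far per second as the 30 FPS player, which is exactly the kind of breakage described above.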
Video Game Frame Rate - Part Three: Refresh Rates vs. FPS (Click to show)
Why are computer monitors mainly 60 Hz?
The quick answer: it is cheap and easy. Monitors have not always been “stuck” to 60 Hz. CRTs operated at a variety of frequencies, some as high as 240 Hz, years before we had our first 120 Hz TVs in the living room. It was this new LCD technology that influenced gaming. CRTs are in fact often considered superior to LCDs by many older gamers.

Flat screen computer monitors exploded in popularity in the early 2000s. The main reason we had 60 Hz refresh rates on LCDs was the oscillation of US alternating current, 60 Hz. It was simply a design choice that was considered good enough. I am not going to get into the technical aspects of tying monitor refresh rate to alternating current, but what is important to know about the origin of the number 60 is that it isn't important. That is to say, 60 FPS doesn't come from a human perception limitation, or some psychological study, or scientific experts, or a standardization body... It came from a design/business meeting on how to keep costs down on LCD flat screen monitors as they were becoming popular.

What is Refresh Rate in layman’s terms?
First, it is good to note that refresh rate and frame rate are two important but completely different variables. They are independent from one another (usually), but you depend on both of them for your gaming experience.

Okay, so we know 60 Hz does not equal 60 FPS, but they are related. Your FPS is the performance your machine achieves in a game. Your refresh rate is your visual window into your game's frames. The ideal scenario is when your viewing window and frames per second are equal, with little variation or fluctuation: a stable 60 FPS occurring naturally at a native 60 Hz. Vertical Sync, or VSYNC, forces this to occur, but VSYNC has a few other issues that impact users, like input latency (not always, but often.) This is a good alternative reason to get upset with developers if you don't care about the visual inaccuracies. Most of our TVs are at least true 60 Hz TVs, meaning the games "set" to 30 FPS are using VSYNC without giving the end user a choice. You will always have more input latency with VSYNC than without, which can make games feel sluggish or slow to respond to your controls.

[TL;DR] 30 FPS in gaming is half of 60 FPS, mostly to make the math easier and to be the least distorting on a 60 Hz TV. 30 FPS in gaming is not because of film, TV broadcasts, or early 1080@24P, 30P, or 60P tech advancements. It exists because the "unofficial standard" was set at 60 FPS. That standard (60 FPS) is, in turn, due to the mass adoption of LCD flat panels: 60 FPS matches the 60 Hz refresh rate, and the 60 Hz refresh rate was set by the hardware industry attaching it to alternating current (60 Hz). There is no standardization when it comes to video games themselves; hardware and APIs have standards, but not games or game engines. FPS is not the same thing as refresh rate, but they are intimately linked, as one exists as the window to the other. The ideal situation is when your video game's frame rate naturally matches your refresh rate.
Human Biology / Perception (Click to show)
What do we know about human perception?
This isn't a biology/psychology/neurology post, but I will post some important scientific information that will aid in separating conscious reaction times from sub-conscious processing. The eye constantly streams information to the brain, and the brain constantly processes all of it. The eye itself can detect a color pigment change in approximately 200 femtoseconds. The brain and the eye are biologically engineered marvels, each in its own right.

Unfortunately, many people mistakenly confuse conscious actions with subconscious processing, and use this as grounds for defending lower FPS.
Example Argument.) Because I can move a mouse only so fast my brain is only communicating so fast.

Our conscious mind, which still operates in the realm of milliseconds, does in fact operate much slower than the physiology of the brain. We have certain bottlenecks when processing conscious thoughts and activities. There are no such bottlenecks for a healthy brain simply doing its processing.

Human response times include this "conscious mind" bottleneck. They still range from as low as 50 ms to anything higher, really. A lot of people use this to argue that 30 FPS is fast enough because its frame time (33.33 ms) is technically shorter than normal human response times. That's not how it works, though.

There are other important things to know; it would take years for me to cover all of the neurology, so let me just re-iterate that it is much more complicated than I make it out to be. There are factors such as motion interpolation in devices and our own image persistence. I just want you to lean into the idea that the human brain is more than capable of perceiving image transitions at and beyond 30/60 FPS.

Revision Addendum:
These next few examples were intended to be exaggerated, yet mathematically manageable examples, to show end-users there is complexity in frame rates and benefits to higher frame rates. While I was reading some of the feedback to the article, some users mentioned that the reaction times were insanely unrealistic. This is true. Very true. Image persistence was mentioned as well, definitely a factor for humans differentiating the images we are seeing.

What I intended in this section was for the reader, who may not know much about FPS, to understand that higher frame rates better visually represent what is going on in the game engine, therefore allowing us to react faster to the issue at hand. Higher frame rates also provide more image fidelity in motion by showing us more frames during transitions, allowing us to be more accurate in our responses.
Human Perception Example: Perfect (Click to show)
Let's Play an FPS:
Conscious: I see movement, I determine it's an enemy. It could take as little as 50 ms.
Conscious: The motion to point your gun at someone. It could take as little as 50 ms.
Conscious: Click my mouse. It could take as little as 50 ms.
The combined actions above took 150ms for a complete response (example.)
Conscious: Track targets adjusted movement, firing into the targets intended path could take as little as 150 ms (Re-aiming, clicking, consciously determine targets direction).
Here we see a variety of scientifically measurable pieces of data of an extremely fast player, possibly a StarCraft II fan, playing Call of Duty.
Human Perception Example: Perfect - Analysis (Click to show)
All of those conscious thoughts and actions occur based on some of the following semi-measurable pieces of information.
Sub-conscious: Brain registering motion with your eyes could occur around 3.33 ms (humans can detect flicker, color changes, shape changes still at 300 FPS.)
Sub-conscious: Eyes registering a simple color pigment may occur around 2.0 × 10 ^ (-10) milliseconds (200 femtoseconds.)

Check out this human benchmark!
http://www.humanbenchmark.com/tests/reactiontime/
Also check out 9,436,379 test result statistics:
http://www.humanbenchmark.com/tests/reactiontime/stats.php

Note: There are not many numerical facts regarding the intricacies of the brain; I mean, how does one measure the reaction time of the sub-conscious brain? Thankfully, through oddities, neural mapping, neural imaging, and high frame rate cameras/computers, we have measurable pieces of data that at least differentiate our physical responses from our conscious responses. We might not have the full picture, but we know one is faster than the other.

Truly then, that 33.33 ms is technically the delay between the game engine and the image reaching your brain. Since the human mind is able to use and tell the difference between frames down to 3.33 ms, 30 FPS is a factor of 10 slower. It's good to point out that this is also where we start to see diminishing returns. Weighing manufacturing cost against end results, there does appear to be a good enough (for now) set of numbers. The brain can detect differences down to 3.33 ms, but it's really expensive (and unrealistic on today's hardware) to get 300 FPS out of a game, not to mention there isn't a true 300 Hz monitor (that I know of) to actually perceive 300 FPS naturally. If you look at 200 FPS (5 ms per frame), the frame time is only 50% longer at 2/3 the frame rate. 200 FPS is still unrealistic both in gaming and in hardware. Let's look at the current high-end LCD refresh rate, 144 Hz. At 144 FPS, the frame time (6.94 ms) is just over 100% longer than at 300 FPS and just under 2 ms longer than at 200 FPS. We have the hardware and game engines capable of rendering it, it sounds good, so let's aim for that! Well, we are not. It is 2014, and due to design choices in hardware, the target constantly ends up being 1920x1080 (or less) @ 30 FPS. We are not even aiming for the 1080P standard anymore, which was established in 2008.
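All of the frame times quoted in this section come from one simple relation: frame time in milliseconds is 1000 divided by the frame rate. A quick Python sketch of that conversion:

```python
# Frame time (ms per frame) for the frame rates discussed in this article.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 120, 144, 200, 300):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):.2f} ms per frame")
# 30 FPS -> 33.33 ms, 144 FPS -> 6.94 ms, 300 FPS -> 3.33 ms, etc.
```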
Human Perception Example: More Realistic (Click to show)
So let's re-look at the above conscious action scenario at 30 FPS:
Game Engine Latency: The game has a 33.33 ms frame time, meaning that the "game world" has a latency of 33.33 ms from calculation to display.
Display: There is minor overhead from the monitor displaying the frame as well (~3 ms).
Image to Eye to Brain: Let's call this zero.
Conscious: I see movement, I determine it's an enemy. It could take as little as 50 ms.
Game Engine Latency: The game has a 33ms frame time.
Display: There is minor overhead from monitor displaying the frame as well (~3 ms).
Conscious: The motion to point your gun at someone. It could take as little as 50 ms.
Game Engine Latency: The game has a 33ms frame time.
Display: There is minor overhead from monitor displaying the frame as well (~3 ms).
Conscious: Click my mouse. It could take as little as 50 ms.

So with instantaneous information from the game world, no latency and infinite FPS, we would be limited only by human reaction time. In the real world, though, we have to add about 36 ms for each of the three frames on which we made the judgement calls to aim and shoot. Obviously this is simplified; a complete reaction would probably base the aiming trajectory on more than 3 points of data. This is just to illustrate that we see first, then react, and to give an example of how big a deal 30 FPS is compared to, say, 120 FPS or even 144 FPS.

30 FPS: 150 ms (conscious reaction time) + 3 × 36.333 ms (33.333 + 3) = 150 + 109 = 259 ms
60 FPS: 150 ms + (3 × 19.667) = 150 + 59 = 209 ms
90 FPS: 150 ms + (3 × 14.111) = 150 + 42.33 = 192.33 ms
120 FPS: 150 ms + (3 × 11.333) = 150 + 34 = 184 ms
144 FPS: 150 ms + (3 × 9.944) = 150 + 29.833 = 179.833 ms
200 FPS: 150 ms + (3 × 8) = 150 + 24 = 174 ms
300 FPS: 150 ms + (3 × 6.333) = 150 + 19 = 169 ms

As you can see, this is in the realm of holding back the player; being 1/10 of a second slower can be extremely noticeable, and that's on every interaction with the game world.
Human Perception Example: More Realistic (Alternative Analysis) (Click to show)
An alternative calculation to explain the difference in data perceived is to keep the total time fixed at 259 ms and see how many new/extra data points you visually get in the same time span in which 30 FPS gives you only 3.

30 FPS: 150 ms (conscious reaction time) + 3 × 36.333 ms (33.333 + 3) = 150 + 109 = 259 ms, 3 total frames viewed.
60 FPS: 259 - 209 ms = 50 / 19.667 = 2.54 extra frames
90 FPS: 259 - 192.33 ms = 66.67 / 14.111 = 4.72 extra frames
120 FPS: 259 - 184 = 75 / 11.333 = 6.62 extra frames
144 FPS: 259 - 179.833 = 79.167 / 9.944 = 7.96 extra frames
200 FPS: 259 - 174 = 85 / 8 = 10.63 extra frames
300 FPS: 259 - 169 = 90 / 6.333 = 14.21 extra frames

30 FPS = 3 frames viewed
300 FPS = 17.21 frames viewed in the same span.

While it isn't 10 times more data (that holds only on paper), you are still getting nearly 6x the visual data at 300 FPS that you get at 30 FPS.
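The two tables above follow the same formula, so they can be reproduced programmatically. This Python sketch uses exact, unrounded frame times throughout, so its totals can differ by a millisecond or so from hand-rounded figures:

```python
# Model from the tables above: 150 ms conscious reaction plus three frames,
# each costing one frame time plus ~3 ms of display overhead.
REACTION_MS = 150.0
DISPLAY_MS = 3.0

def total_response_ms(fps, frames=3):
    return REACTION_MS + frames * (1000.0 / fps + DISPLAY_MS)

budget = total_response_ms(30)  # the 30 FPS player's total, ~259 ms
for fps in (60, 90, 120, 144, 200, 300):
    frame_ms = 1000.0 / fps + DISPLAY_MS
    extra = (budget - total_response_ms(fps)) / frame_ms
    print(f"{fps} FPS: {total_response_ms(fps):6.1f} ms total, "
          f"{extra:5.2f} extra frames in the same span")
```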

That is the difference between this:


And This:


Now imagine that those dots represent the visible points of fire from a gun. One image lets you see the inaccuracies of your aim, and therefore adjust; the other appears perfect when it is not, and you miss targets because you can't really see where you are aiming.

Another Example Of Data Processed In a given second:
30 FPS gives you 30 frames.
60 FPS gives you 60 frames.
120 FPS gives you 120 frames.

For the same scene played in all three cases, the gamer at 120 FPS gets 4 times as much visual data to process, but more importantly, roughly four times the seamlessness of motion. The human mind perceives seamless motion much like a Riemann sum approximates an area: the more points of data you have, the more accurate your brain's representation of bullet trajectory, 3D depth perception, and motion prediction.

At 30 FPS you get 30 points of data. At 120 FPS you get 120 points. You literally get 4 times the accuracy of the in-game image when motion is involved.

If you want to get an example of a Riemann Sum with more or less data points check out:
http://demonstrations.wolfram.com/RiemannSums/
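To make the analogy concrete, here is a small Python sketch (an illustration of Riemann sums generally, not of the Wolfram demo itself) approximating the area under a curve with 30 versus 120 samples; the denser sampling tracks the true value far more closely, just as more frames track motion more closely:

```python
# Riemann-sum analogy: approximate the area under f(x) = x^2 on [0, 1]
# (true area 1/3) with 30 vs. 120 rectangles.

def riemann_sum(f, a, b, n):
    """Left Riemann sum of f over [a, b] with n equal subintervals."""
    width = (b - a) / n
    return sum(f(a + i * width) for i in range(n)) * width

f = lambda x: x * x
err_30 = abs(riemann_sum(f, 0.0, 1.0, 30) - 1 / 3)
err_120 = abs(riemann_sum(f, 0.0, 1.0, 120) - 1 / 3)
print(err_30, err_120)  # the 120-sample error is roughly 4x smaller
```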
Human Perception: Meaning, Conversation, and Conclusion (Click to show)
What should I take away from this?
Well, there is no physical limitation of the brain that would keep frame rates to any specific number, 60 FPS included. In fact, the evidence suggests that the faster information is physically processed, the faster it can be consciously processed and subsequently reacted to. The eyes constantly stream an enormous amount of information to the brain, and we consciously adjust our thoughts and motion according to that data stream. All of this "cinematic" appeal is crap, primarily because the cinematic frame rate was established 90 years ago. If anything, we are just conditioned to low frame rate film. Furthermore, film frame rates do not equate to video game frame rates. You aren't just sitting down watching your game take place; you are interacting with and reacting to the feel of the world, and you are truly handicapped at low frame rates.

Well I play just fine at 30 FPS.
This is not an argument. It is a statement that is neither true nor false, but qualitative to your experience. Statistically, non-casual gamers play more skillfully at higher resolutions, with higher fidelity, and, above all, with higher FPS. Just ask any PC gamer whether he would rather game at 30, 60, or 120 FPS (all with identical in-game quality) on a matching refresh rate monitor. The majority would pick 120 FPS @ 120 Hz. It's definitively smoother, both quantitatively and qualitatively.

Well I don't care about accuracy, I just want to play.
It won't faze you either way if the majority of gamers get what we want, because we have been advocating the choice of higher frame rates. Also, you really can't complain, or get frustrated, when you swear you shot the AI in the head but the shot obliviously and frustratingly missed.

Here is a 30 FPS vs. 60 FPS example of image distortion caused by motion inaccuracy at lower frame rates:

[TL;DR] You are missing out on a wealth of visual processing the brain can take advantage of, up to 300 FPS and beyond. Human response time does not equal brain processing time. 30 FPS generally gives us blurry images in motion. If you really don't care, then please avoid siding with the pro-lower-FPS crowd, or if you do, fully realize what you are claiming. There are those of us who want the higher standards, the higher frame rates, and essentially more bang for our buck and our tech.

[TL;DR]Article Recap
  • 24 FPS comes from 1926 silent film era, even though Thomas Edison recommended 48 FPS through science. 24 FPS was just cheaper.
  • 30 FPS in TV comes from the 1941 NTSC broadcast standard mandated by the FCC. 30 FPS in gaming comes from designing gaming experiences around 60 FPS on hardware that wasn't up to snuff, so 30 FPS became the fall back.
  • 60 FPS comes from matching the refresh rate of LCDs (60 Hz) in the early 2000s. The best gaming experience is when the frame rate matches the refresh rate. LCDs were new but extremely popular, and their 60 Hz refresh rate matched US alternating current (AC.)
  • The true 1080P certification is literally 1920x1080@60p.
  • Motion image accuracy is tied directly to the frame rate.
  • Motion prediction is directly tied with the fluidity or the seamless transition between frames.
  • Many gamers would be satisfied having the option to choose the frame rate (if the hardware was even capable of rendering at higher FPS).

The Ongoing Developer Discussion:
There is still more to discuss, though. It is true that a higher FPS can sometimes give us a headache or motion sickness, but for the same reason, you can get one from watching a frame rate that is too low. We are conditioned to perceive it a certain way, and those who know the truth know there is no shame in admitting that higher frame rates will appear odd at first. It can often feel surreal, but let's be absolutely clear: it is not because of some limitation of the brain struggling with a higher FPS.

I have personally argued that setting the cap to 30 FPS is a cop out, which needs to be owned as one, with game developers/publishers taking responsibility for it. Some developers claim it's more "cinematic", which I have shown is a lie or ignorance. Other developers are clearly more honest:
Quote:
Originally Posted by Paul Rustchynsky, Drive Club Director 
"We chose a locked frame-rate for this very reason and with 30fps we don't have to hold back any of the obsessive visual detail."
Quote:
Originally Posted by David Polfeldt, Massive Entertainment's Managing Director 
Comments About The Division being only 30 FPS.

I think we're shooting for 30fps because it's a trade-off, right? Graphical fidelity and immersion are more important to us than the frame rate. If we go for [60fps], we'll have to make a trade-off on fidelity and other things. But because we want to have very, very complex destruction and extremely detailed environments; a complete weather system, full day/night cycle...at some point you have to make up your mind: where do you invest? And for us, it's going to be 30 FPS.
Response To Drive Club Director (with Math) (Click to show)
This is an admission that even the PS4's hardware was not capable of running the game at the quality they wanted at the frame rate we have come to expect (60 FPS.) It is honesty, which is respectable, but at 30 FPS in a racing game, I believe it is a bad choice, and here is why.

A car traveling 120 MPH is moving at 2 miles per minute, or 0.03333 miles per second.
Metric conversion: 0.0536448 kilometers per second, or about 53.645 meters per second.

At 60 frames per second, the distance traveled between each frame is 53.645 meters per second / 60 frames per second = 0.8941 meters per frame. That means that between each frame, you travel a distance of about 0.9 meters. Doesn't sound very smooth, does it? Wait till you see what 30 FPS does in this same situation: 53.645 meters per second / 30 frames per second = 1.7882 meters per frame. That is the difference between a successful overtake and crashing into a barricade. How is that lack of accuracy better for gamers?
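The arithmetic above generalizes to any speed and frame rate. Here is a small Python sketch of the same calculation (pure unit conversion, nothing game-specific):

```python
# Distance covered between consecutive frames by an object at a given speed.
MPH_TO_MPS = 1609.344 / 3600.0  # miles per hour -> meters per second

def meters_per_frame(speed_mph, fps):
    return speed_mph * MPH_TO_MPS / fps

print(meters_per_frame(120, 60))  # ~0.894 m between frames at 60 FPS
print(meters_per_frame(120, 30))  # ~1.788 m between frames at 30 FPS
```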
CAUTION - We Just Entered the Difference of Film and Gaming! (Click to show)
Now I want you to imagine projectile physics. Bullets travel faster than cars right?

The M4A1 has a muzzle velocity of 3020 ft/s. 920.496 m/s.

920.496 m/s / 60 fps = 15.3416 m / frame
920.496 m/s / 30 fps = 30.6832 m / frame

Wow, big numbers, right? Luckily, this has nothing to do with accuracy! While the bullet really is travelling that distance per frame, you are not in control once you have fired, so there is no real margin for error; even with bullet drop, the path is predetermined once the shot leaves the barrel. This example is meant to clearly illustrate that what matters is the decisions we make based on visual processing of the motion of in-game objects. Your brain is what makes you crash into the barricade, not the car being in the wrong place.

We do adapt, get better, and get "used" to these settings, that much is true. But just as Edison claimed 48 FPS film is easier for us to process, 60 FPS is easier on the mind, giving you a more accurate image to interact with. So it's easier on your brain/eyes visually, and easier for you to be better at the game.

This is clearly where frame rate affects game play. If your brain is making a decision (reaction) based on visual motion, then frame rate affects the accuracy of that decision. THIS is the difference between film and video games. Sitting on the couch watching a movie and eating Cheetos? 24 FPS will suffice all day long. Driving a car? An absolutely terrible experience, both quantitatively and qualitatively, as I have tried to show.
More Eye Rolling Dev Quotes! (Click to show)
Thank You For My Material Internet:
Quote:
Originally Posted by Dana Jan, game director on The Order: 1886, is committed to 30 fps 
60 fps is really responsive and really cool. I enjoy playing games in 60 fps," Jan told me. "But one thing that really changes is the aesthetic of the game in 60 fps. We're going for this filmic look, so one thing that we knew immediately was films run at 24 fps. We're gonna run at 30 because 24 fps does not feel good to play. So there's one concession in terms of making it aesthetically pleasing, because it just has to feel good to play.

There is so much that is just bizarre about this. Okay, you know movies run at 24 FPS, but you failed to look up that 24 was chosen as the absolute bare minimum back in 1926. The year is 2014, in case anyone has forgotten. There is no such thing as a "filmic" look created by frame rates in a video game. Just as we determined there is a difference between a bullet and a car, there is a difference between a film and a video game. Suddenly 24 FPS doesn't feel good, but 30 FPS feels great? Or is this an admission that 30 is just another bare minimum, one that happens to work with console VSYNC technologies?

Quote:
"Higher framerate doesn't equate to better," Weerasuriya insisted. "The framerate has to satisfy the experience you want to have."
Source: http://kotaku.com/the-order-1886-is-30-frames-per-second-and-darn-proud-1524644901

This is just a bald-faced lie. What this person probably meant is that, subjectively, a high frame rate doesn't make a game good; but the reverse certainly holds: low frame rates ruin games. Ask any gamer.

Source: http://kotaku.com/a-developers-defense-of-30-frames-per-second-1580194683

Where is the downside to running at least 60 FPS?
To users? Really nothing. It is more expensive for developers to optimize games to get that much out of the hardware. The hardware in both the PS4 and Xbox One (both CPU and GPU) was predicted to struggle to render high-quality games and textures at a solid 60 FPS. Nobody wants to say this directly in an official capacity, of course. It is true that more can be done with this hardware in the future, and we will see it, but it won't be anything like the last generation's leaps in fidelity. These are not unknown architectures or exotic configurations, meaning there isn't magical performance to unlock as developers gain experience. We will see gains from optimized game development, though: knowing what works, and what pushes the envelope a little too much. These two boxes are extremely close to PCs, and developers have tons of experience with both PC game development and the AMD hardware powering them.
Research Articles (Click to show)
I will add these as found.

White Paper from BBC R&D about 300 FPS
http://downloads.bbc.co.uk/rd/pubs/whp/whp-pdf-files/WHP169.pdf
Quote:
Originally Posted by BBC's Conclusion 
The spatial resolution of broadcast television cameras and displays has reached the point where the temporal resolution afforded by current frame rates has become a significant limitation, particularly for fast-moving genres such as sport. BBC Research has successfully demonstrated that increasing the frame rate can significantly improve the portrayal of motion even at standard definition. If the spatial resolution of television standards continues to increase, raising the frame rate to maintain the balance between static and dynamic resolution will only become more important. Even at the spatial resolutions of SD and HDTV, the motion artefacts associated with 50/60Hz screen refresh rates will become increasingly apparent as television display sizes continue to grow.

Even for television pictures transmitted and displayed at conventional frame rates, capturing at high frame rates can offer some improvement to picture quality through temporal oversampling, giving better control over temporal aliasing artefacts and offering a choice of "looks" to the director at the post-production stage. It also offers improved compatibility with the different conventional frame rates adopted internationally.

We assert that a higher capture and display frame rate leads to a step change in picture quality regardless of the spatial resolution.

Not only does a higher capture rate more faithfully reflect true motion, but post-production and motion-blur techniques are also greatly enhanced by having access to those extra frames.

Comments (45)

Good article. The history was very interesting.
Thank you for taking the time to write it.

The processing power of the human mind is fascinating.
Our mind makes adjustments and fixes what we see.
Higher frame rates are less fatiguing, because our minds have to fix and adjust less.
I think the whole "the human eye can only see 30 FPS" myth comes from "30 FPS is the minimum required for something to look like motion", but people misread that and think "minimum required for motion" is the same as "you can't see more frames than that". No one is conditioned to anything; they just don't read carefully and repeat untrue "facts".
I have updated the article slightly, mostly additions. I was less rushed on the editing today. Still working on typos as time allows.
In a similar discussion on another forum, I posted about this. My thought is that the main differences between film FPS and game FPS are that film is pre-rendered/filmed with natural motion blur and effects, and that it is linear rather than actively controlled by the viewer (as you mentioned, 'motion image accuracy is tied directly to the frame rate').

In games the viewer needs immediate control over the camera angle at any given moment, so there naturally needs to be a higher frame rate to refresh the view. I see them as two different mediums with different viewer considerations, and obviously the 24/30 FPS argument doesn't fit games for those reasons.
Yes, I sort of touch on that in the bullet/car distance-per-frame example. One path, the bullet's, is predetermined once fired: basically, film. The other, the car's, is directly affected by the frame rate, because the brain has to compensate during visual processing, making accurate turns and overtakes more difficult due to the immense amount of data simply missing between frames. If you were just watching, you would never need to make a decision off that bad information.

It's really as simple as that.
Extremely well-structured discussion of how FPS works; nothing to say except thank you for taking a truly logical approach to this argument! It was really nice to read, and I like how you brought in some physiological elements to support what you were saying. Given your evidence, you've certainly convinced me to side with you. I was always unsure how frames really affected accuracy and whether 60 FPS was "better" or more beneficial, and I think this answered all of my questions.

Thanks mate!
Thanks guys, keep this article in mind, that it is living (I will be adding more). Next up, I will be addressing GPU framerates vs input latency not just visual inaccuracies. I will also correct more grammar and typos as I go along!
Very interesting article. Educational and informative.
This is a good article but I think you missed one point:
The image on screen is (1000/FPS) ms behind the game engine. E.g., at 30 FPS: the computer updates the game world (10ms), renders the frame (10ms), waits until the monitor is ready to display the frame (13ms), and then the frame is displayed on the monitor (even the monitor has lag; a low figure would be 5ms to change all the pixels from one frame to the next). At this point the player can finally see the new frame (or rather, the differences between the old and new frames), and at least 33ms have passed.

This is/was very noticeable on BF4 with its 10Hz tick rate. It meant that (again, assuming 30 FPS) what the player could see was actually frame time + tick time, or (1000/30)+(1000/10) = 133ms behind the game world on the server. 133ms is, as you mentioned above, more than enough time for a player to react to the new information.
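The frame-time-plus-tick-time figure above can be sketched in a couple of lines (the function names are my own; client-to-server network latency is deliberately left out, as in the example):

```python
def frame_ms(fps):
    """Interval between displayed frames, in milliseconds."""
    return 1000.0 / fps

def visible_delay_ms(fps, tick_hz):
    """Rough age of on-screen information: one frame interval plus one server tick."""
    return frame_ms(fps) + 1000.0 / tick_hz

print(visible_delay_ms(30, 10))  # the BF4 example above: ~133.3 ms
print(visible_delay_ms(60, 60))  # higher FPS and tick rate: ~33.3 ms
```

The takeaway is that the two delays simply add, so a slow server tick can dwarf whatever you gain from a high frame rate, and vice versa.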

You are right, BruceB; it is simplified. Tick rates in BF4 add a whole other layer of complexity, hence their netcode hit-detection fix being... psychotically hard. Don't forget the latency to and from the server as well.

It's very good to analyze single-player events rather than multiplayer, but to be clear, multiplayer clearly benefits the most from high frame rates!
I was just using BF4 as an extreme example, I just wanted to say that with lower FPS, by the time you see something on screen it can be too late to do anything!

P.S. A bit off topic: They didn't change the hit detection or any 'netcode' in BF4, all they did was up the tick rate! Just goes to show what a bit of latency between the game world and the image on your screen can do!
BRAVO GOOD SIR BRAVO!

This article brings UBISOFT to mind, as they seem to be actively trying to dumb down PC graphics & frame rates to make console look better/good by comparison. It's kinda like when the not so attractive girl hangs out with an extremely unattractive girl to look more attractive by contrast.
I learned some things, I was inspired to look some things up, and I applaud you for being the cause of this. Well written.
Thank you, gentlemen. I try to give the impression that I am not a know-it-all; I did the research concurrently with writing it because I was missing pieces of information myself, and I needed good ways, via static images, to display the concept of "image accuracy" under motion, as well as give users a way to experience it themselves.
I have added to the article, including more developer feedback and fixes for a few unintended paths of logic that I originally wrote incorrectly. I also added two visual latency effects to the "Let's Play A FPS" section.
Good job man, you took me back to school and I actually paid attention lol. I've got to say the only people making a case for low FPS are corporation CEOs, definitely not gamers, and DEFINITELY NOT ATI or Nvidia; you know they don't want us dusting off those 8800 GTs hehe.
Thank you so much for this great article! I wish there were a way to bookmark this into my account or something. I've always had people argue with me when I tell them I DO notice the difference between 30 and 60 FPS, even in MMORPGs, but they think I'm just obsessed with FPS and the obsession is creating a placebo effect. Anyway, thanks again!