Originally Posted by ramicio
We only need blur at low frame rates below the threshold of "reality," which Edison discovered was around 40 FPS.
Right, so that is why video filmed at 60 fps looks just as smooth and unnatural as the interpolated garbage, as you put it.
When was the last time you filmed anything at 60 fps, personally? I bet never. I do it constantly, and I assure you I am not putting any effects on the video. But it still looks unnatural as hell to anyone viewing it. Even when I compare a video of my monitors, with my hand in the frame, to simply looking at my hand in front of me, the video looks far less natural... why? Because as I move my hand around in front of my face, my brain adds blurring to compensate for its inability to keep up with the changing input.
With the video, my brain is only seeing 60 fps instead of a constant stream of input. That is vastly easier to keep track of, so the brain has no need to add motion blurring. And when you don't see any blurring, it looks unnatural.
That continuous stream is the missing piece of the puzzle, the input your brain doesn't have, and the reason it doesn't apply any blurring to the scene.
This is also why you are right about needing hundreds of fps for the brain to add its own blurring. You need to overload the brain's capacity to process the input in order for blurring to occur. This will never happen at 48, 60, or even 120 fps.
It has long been established that your brain can process things a hell of a lot faster than our slow nervous system will ever be capable of reacting to them. But with the eyes, the input is continuous, not frame-rate limited like your monitor. The screen's image is actually changing too slowly, even at 60 fps.
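Here is a toy sketch, in Python, of the point being argued: blur is what you get when motion is integrated over a span of time rather than sampled at a single instant. This is my own illustration, not anything posted in the thread, and it is emphatically not a model of the eye or brain; every name and number in it is made up.

```python
# Toy illustration: motion blur as temporal integration (not a vision model).
# A bright point moves across a 1-D strip of pixels. Compare an instantaneous
# sample (one short-shutter 60 fps frame) to an average over the full frame
# interval (continuous motion integrated over time).

import numpy as np

PIXELS = 100                  # width of the 1-D "image"
SPEED = 600.0                 # object speed, pixels per second (made up)
FRAME_TIME = 1.0 / 60.0       # one 60 fps frame interval
SAMPLES = 1000                # sub-frame samples approximating continuous motion

def render(position):
    """Light up the single pixel the point occupies at this instant."""
    img = np.zeros(PIXELS)
    img[int(position) % PIXELS] = 1.0
    return img

# Instantaneous sample: the point lands on exactly one pixel, so no smear.
instant = render(0.0)

# Temporal integration: average the point's image over every instant of the
# frame interval. Its energy spreads across every pixel it crossed, a smear
# roughly SPEED * FRAME_TIME pixels long.
integrated = np.mean(
    [render(SPEED * t) for t in np.linspace(0.0, FRAME_TIME, SAMPLES)],
    axis=0,
)

print("pixels lit, instantaneous sample:", np.count_nonzero(instant))     # 1
print("pixels lit, integrated exposure: ", np.count_nonzero(integrated))  # ~11
```

The instantaneous sample stays pin-sharp no matter how fast the point moves, while the integrated exposure smears in proportion to the motion; that is the difference between a short-shutter 60 fps frame and input that is continuous in time.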