Originally Posted by bombastinator
Everything makes sense but that bit. The world is 60Hz for humans; that's just how fast our brains process stuff. If it's invisible, it's invisible. The argument sounds more epistemological than scientific.
Human eyes don't process in frames.
However, humans see several side effects of finite frame rates.
Let's cover a few things:

The wagon wheel effect:
A wagon wheel spinning at 400 spoke-angles per second looks stationary under a xenon strobe flashing at 400Hz. If the wheel has 8 spokes and spins 50 times a second (so 400 spokes pass the same position every second), the wheel looks stationary because every flash catches the wheel in exactly the same rotational configuration. A minimal sketch of this aliasing arithmetic is below.
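Here is that arithmetic as a tiny Python sketch (the numbers are just the example above; the variable names are my own):

spokes = 8
revs_per_sec = 50.0               # wheel speed: 50 rotations/second
strobe_hz = 400.0                 # xenon strobe rate

spoke_angle = 360.0 / spokes                        # 45 degrees between identical spoke positions
deg_per_flash = 360.0 * revs_per_sec / strobe_hz    # rotation between consecutive flashes
apparent_step = deg_per_flash % spoke_angle         # all the eye can distinguish

print(apparent_step)   # 0.0 -> every flash catches an identical configuration; wheel looks frozen

Any strobe rate where apparent_step comes out to 0.0 freezes the wheel; slightly off, and the wheel appears to crawl forwards or backwards.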
The effect of persistence and sample-and-hold:
See the animation of eye-tracking based motion blur: www.testufo.com/eyetracking
View this on a non-strobed LCD display. The motion blur you see occurs because your eyes are in a different position at the beginning of a refresh than at the end of a refresh, which smears each static frame across your vision. Even at 120Hz, an object moving at 960 pixels per second on an LCD moves 8 pixels between frames, creating 8 pixels of motion blur on a flicker-free display. Double that to 240Hz, and an object moving at 960 pixels/second still creates 4 pixels of motion blur because of this effect.

Mathematically, 1ms of persistence = 1 pixel of motion blur at 1000 pixels/second.
(Assumes squarewave persistence, a la strobe backlight / OLED / anything that turns on/off nearly instantly.)
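In Python form (a tiny sketch of the same rule of thumb; the function name is mine):

def motion_blur_px(persistence_ms, speed_px_per_sec):
    # 1ms of persistence = 1 pixel of blur per 1000 pixels/second of motion
    return persistence_ms * speed_px_per_sec / 1000.0

print(motion_blur_px(1.0, 1000.0))   # 1.0 pixel, the rule of thumb above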
For flicker-free displays:
Minimum persistence always equals the refresh cycle length (regardless of how fast GtG is).
960pix/sec moving object on flickerfree 480fps@480Hz = 2 pixels of motion blur (960 / 480 = 2)
1920pix/sec moving object on flickerfree 240fps@240Hz = 8 pixels of motion blur (1920 / 240 = 8)
480pix/sec moving object on flickerfree 120fps@120Hz = 4 pixels of motion blur (480 / 120 = 4)
For light-modulated / strobed / scanned displays:
The visible flash length defines the motion blur instead of the refresh rate:
1000pix/sec using a 4ms squarewave strobe (at any refresh rate) = 4 pixels of motion blur
2000pix/sec using a 6ms squarewave strobe (at any refresh rate) = 12 pixels of motion blur
4000pix/sec using a 2ms squarewave strobe (at any refresh rate) = 8 pixels of motion blur

Note: The math is simple for squarewave strobes. The visible frame time period is the window of opportunity for eye-tracking to smear the static frame across your retinas as your eyes continually track moving objects. CRT/plasma/etc. use phosphor that decays gradually in a curve rather than as a squarewave, which mathematically complicates things, but the effect is essentially the same -- it dictates ghosting/motion blur up to the point where the phosphor is too faint to be dominant. That's why slow-fading phosphor has lots of blur (radar displays, with that spinning green line), while fast-fading phosphor has less blur/ghosting. A small calculator sketch below reproduces the numbers in both lists above.
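Here it is as a short Python calculator (the helper names are my own), under the same squarewave assumption:

def flickerfree_blur_px(speed_px_per_sec, refresh_hz):
    # Flicker-free persistence = full refresh cycle length (1000/Hz milliseconds)
    return speed_px_per_sec / refresh_hz

def strobed_blur_px(speed_px_per_sec, strobe_ms):
    # Strobed persistence = visible flash length, regardless of refresh rate
    return speed_px_per_sec * strobe_ms / 1000.0

print(flickerfree_blur_px(960, 480))    # 2.0 pixels
print(flickerfree_blur_px(1920, 240))   # 8.0 pixels
print(flickerfree_blur_px(480, 120))    # 4.0 pixels
print(strobed_blur_px(1000, 4))         # 4.0 pixels
print(strobed_blur_px(2000, 6))         # 12.0 pixels
print(strobed_blur_px(4000, 2))         # 8.0 pixels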
Vision researchers have been able to see continued human-visible benefits of going from 60fps -> 120fps -> 240fps -> 480fps -> beyond, because of indirect effects such as wagonwheel artifacts as well as motion blur (for continuous-light-output displays -- displays that are flicker-free even under a high speed camera). You can add artificial motion blur to eliminate wagonwheel artifacts, but some of us hate adding artificial GPU motion blur effects. (However, it could be practical when you only need to add an ultra-tiny amount on an ultrahigh refresh rate display, to fix any remaining stroboscopic / wagonwheel effects.)

Educational Motion Animations
-- 30fps versus 60fps. Observe that 60fps only has 50% less motion blur than 30fps on a 60Hz LCD.
-- Demonstration of a 50%:50% black frame insertion to reduce motion blur by 50%
-- Motion blur caused by persistence, rather than GtG. Only fixed via strobing or ultrahigh framerates.
Therefore, as you can see, going far beyond 60Hz still has plenty of human-visible benefits.

TRUE:
"Humans cannot see *individual* frames beyond ~30fps" (you can't count the frames anymore, but that doesn't mean you can't SEE other benefits)FALSE:
"Humans can't tell apart 480fps@480Hz and 960fps@960Hz for certain kinds of motion effects."
The above two are two completely different statements.
There are always stroboscopic-creating side effects or motion-blur-creating side effects, even far beyond 60Hz.
The perfect display never adds motion blur or strobing. 100% continuous light. 100% natural motion blur (from the human brain, or as a carefully-used artistic special effect; but never a forced display limitation or a forced strobing band-aid, and never forced to be always on).
THEREFORE, for the theoretical Holodeck display, you cannot run such a system at only 60Hz or things will look "odd". If you run at only 60Hz, you can never simultaneously solve strobing *and* motion blur.

Externally-injected motion blur can easily eliminate stroboscopic effects, but you have to add so much motion blur that it becomes objectionable: it will make a "Holodeck" look off, because the motion blur is not 100% naturally generated (inside the human brain). Humans can tell apart photographs taken with a 1/250sec camera shutter versus a 1/1000sec camera shutter during fast sports. (The amount of motion blur needed to erase stroboscopic effects is the length of movement during one refresh cycle -- so you need to add more motion blur at 250Hz than at 1000Hz to erase stroboscopic/wagonwheel effects. A worked example is below.)

If you avoid externally generated motion blur, the stroboscopic effect problem comes back: e.g. wagonwheel effects, stop-motion effects, dropping effects, and staring at crosshairs while seeing the stroboscopic stepping effect of panning (even at 200Hz). It's all already scientifically proven. Read the scientific references, and do your own motion tests (above), too.
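As a rough worked example in Python (with a hypothetical pan speed of my choosing), the minimum added blur needed to mask stroboscopic stepping equals the movement during one refresh cycle:

speed_px_per_sec = 2000.0   # hypothetical fast pan

for refresh_hz in (60, 250, 1000):
    # blur needed to mask stroboscopic stepping = movement during one refresh cycle
    needed_blur_px = speed_px_per_sec / refresh_hz
    print(refresh_hz, round(needed_blur_px, 1))

# 60Hz needs ~33 pixels of added blur, 250Hz needs 8, while 1000Hz needs only 2 --
# the higher the refresh rate, the less objectionable the blur band-aid becomes.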
Conclusion: Humans will continue to be able to see visible benefits far beyond 60Hz.