Originally Posted by ToTheSun!
Just out of curiosity, what's that protocol like? Does it involve any sort of persistence, like a CRT would?
The one I did was a simple black/white flicker with a tunable frequency. I can't actually remember the display used, because it was a while ago, but I believe it was LCD-based; no idea on the transition time. As for the test imagery itself, brightness, contrast, color, and the content all play a role, which is why such a test is useful for comparing hardware against each other but not particularly useful for determining the upper limit of where the hardware should operate. You get one data point in the spectrum of "things a human could look at." There's a lot more involved in generating good test imagery. A rough sketch of that kind of test is below.
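For anyone curious, a minimal sketch of that kind of flicker test could look like this (Python with pygame; FLICKER_HZ and everything else here is illustrative, not the original harness):

```python
import time
import pygame

# Hypothetical minimal flicker test: alternate a full-screen black/white
# fill at a tunable frequency. Press Esc to quit. Parameters are
# illustrative only.
FLICKER_HZ = 30.0                      # full black->white->black cycles per second
HALF_PERIOD = 1.0 / (2 * FLICKER_HZ)   # time to hold each of black/white

def run_flicker():
    pygame.init()
    screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN)
    colors = [(0, 0, 0), (255, 255, 255)]
    idx = 0
    next_swap = time.perf_counter()
    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT or (
                event.type == pygame.KEYDOWN and event.key == pygame.K_ESCAPE
            ):
                running = False
        now = time.perf_counter()
        if now >= next_swap:
            screen.fill(colors[idx])   # swap between black and white
            pygame.display.flip()
            idx ^= 1
            next_swap = now + HALF_PERIOD
    pygame.quit()

if __name__ == "__main__":
    run_flicker()
```

One caveat: pygame.display.flip() typically waits for vsync, so the effective flicker rate is capped at half the display's refresh rate; a 60 Hz panel can't show more than a 30 Hz full black/white cycle no matter what you set FLICKER_HZ to.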
There ARE upper limits to perception, but there are no hard edges. We have truly analog detectors (photon quantum mechanics notwithstanding, it all just turns into shot noise), and we're using them to perceive digital signals. There's bound to be some mismatch.
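To make the shot-noise point concrete, here's a tiny simulation (Python with NumPy, purely illustrative): photon arrivals are Poisson-distributed, so a detector collecting an average of N photons per sample sees noise of sqrt(N), giving an SNR that scales as sqrt(N). There's no hard cutoff, just statistics that degrade smoothly as the light gets dimmer.

```python
import numpy as np

# Photon counting is Poisson: mean N, standard deviation sqrt(N),
# so SNR = N / sqrt(N) = sqrt(N). Verify numerically.
rng = np.random.default_rng(0)
for mean_photons in (100, 10_000, 1_000_000):
    samples = rng.poisson(mean_photons, size=100_000)
    snr = samples.mean() / samples.std()
    print(f"N={mean_photons:>9}: measured SNR = {snr:8.1f}, sqrt(N) = {mean_photons**0.5:8.1f}")
```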