Originally Posted by superstition222
Quote: Originally Posted by Zen00
21.6" 4K? Can you even see the desktop icons at that small?
The point is to have pixel pitch well beyond diminishing returns, so everyone can pretend they have 20/9 vision.
Jaggies are easily visible at pixel densities far beyond this.
Human vision has different sensitivities to different types of patterns. “Edge Recognition” is one of the specialties of your eyes, but high-frequency visual data overwhelms the photoreceptor structure relatively quickly.
The way you “See” is a heavily processed image, built from motion and data gathered over time by two sensors, that pops up in your head. Yet for the last 20 years, all the HDTV marketing trying to say what pixel density is “Worth It” has relied on a high-frequency line-pattern test that specifically defeats all of the mechanisms your eyes use to perceive fine detail.
It was effective marketing for selling corporate bigwigs on enormous upgrade budgets and selling consumers on the new “High Definition” TV format that was supposed to be the best thing possible, but anyone who has studied how your eyes work would have known very well that the marketing was misleading.
The practical way to decide on the value of increased pixel density is as a scale of diminishing returns.
Below 60 Pixels Per Degree you perceive 100% of the detail the display can show; above that you gradually stop perceiving details that are too low in contrast or too high in frequency. The point where a person with 20/20 vision actually hits the theoretical wall of zero gain from higher density is around 300 Pixels Per Degree.
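To see where a given setup lands on that scale, here's a minimal Python sketch of the pixels-per-degree math (the ppd() helper, the ~24-inch viewing distance, and plugging in the 21.6" 4K panel from the quote are my own assumptions for illustration; it approximates the angle at the center of a flat screen):

Code:
import math

def ppd(diagonal_in, res_h, res_v, distance_in):
    # Physical pixel pitch (inches per pixel) from the diagonal size and native resolution.
    diag_px = math.hypot(res_h, res_v)
    pitch = diagonal_in / diag_px
    # Angle one pixel subtends at the eye, in degrees; invert to get pixels per degree.
    pixel_angle = math.degrees(2 * math.atan(pitch / (2 * distance_in)))
    return 1 / pixel_angle

# The 21.6" 4K panel from the quote, viewed from ~24 inches:
print(round(ppd(21.6, 3840, 2160, 24)))  # ~85 PPD - past 60, nowhere near 300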
In general I base my numbers off the data here: https://michaelbach.de/ot/lum-hyperacuity/index.html
Or you can extrapolate your highest possible perceivable resolution by looking at a lower-resolution display from a greater distance than normal (which for most people gives results a bit above 300 Pixels Per Degree).
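If you want to put numbers on that test, the same math can be run in reverse to find how far back you'd have to sit for a given display to reach a target density (the distance_for_ppd() helper and the 24" 1080p example are just assumptions for illustration):

Code:
import math

def distance_for_ppd(diagonal_in, res_h, res_v, target_ppd):
    # Viewing distance (inches) at which the display reaches the target pixels per degree.
    diag_px = math.hypot(res_h, res_v)
    pitch = diagonal_in / diag_px            # inches per pixel
    pixel_angle = 1 / target_ppd             # degrees subtended per pixel at the target
    return pitch / (2 * math.tan(math.radians(pixel_angle) / 2))

# Distance at which a 24" 1080p panel reaches 300 PPD:
print(round(distance_for_ppd(24, 1920, 1080, 300)))  # ~187 inches, roughly 4.8 m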