Originally Posted by superstition222
8K for television and home movie viewing is ridiculous.
Not only is it ridiculous from the point of view of human vision, it's ridiculous in terms of:
A) Not needing to see extra detail of people's nose hairs, wrinkles, and moles
B) Ridiculous bandwidth requirements. Good luck watching even plain non-HD Netflix on capped broadband without paying ridiculous amounts (see the rough numbers sketched after this list). Where I live, my choices are capped satellite or capped cellular. Don't ask how much 30 gigabytes per month costs, let alone the 60 GB per month we averaged at our old place when we had cable, and that was with very little Netflix, practically no HD YouTube, etc.
C) Ridiculous encoding times
D) Ridiculous file sizes
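To put rough numbers on the bandwidth point in item B, here's a back-of-the-envelope sketch in Python. The bitrates are my own ballpark assumptions based on commonly cited streaming figures, not official numbers, and the 8K figure is pure guesswork since no mainstream 8K streaming tier exists:

[CODE]
# Back-of-the-envelope monthly data usage for streaming video.
# Bitrates below are ballpark assumptions, not official figures.
BITRATES_MBPS = {
    "non-HD SD stream": 3.0,
    "HD 1080p stream": 5.0,
    "4K HDR stream": 16.0,
    "hypothetical 8K stream": 50.0,  # pure guess; no real 8K tier exists
}

HOURS_PER_DAY = 2
DAYS_PER_MONTH = 30

for label, mbps in BITRATES_MBPS.items():
    gb_per_hour = mbps * 3600 / 8 / 1000  # megabits/s -> gigabytes/hour
    gb_per_month = gb_per_hour * HOURS_PER_DAY * DAYS_PER_MONTH
    print(f"{label:24s} {gb_per_hour:5.2f} GB/h  ~{gb_per_month:5.0f} GB/month")
[/CODE]

Even the plain SD row works out to roughly 80 GB a month at two hours a day, which already blows through a 30 GB cap; the hypothetical 8K row is more than an order of magnitude worse.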
16K is needed for perfect VR because the screen is extremely close to the eyeballs. For television, 8K is not needed. There are things that are needed: wide color gamut, high contrast, lack of motion blur, perfect uniformity, input lag as low as possible, lack of reflections, long phosphor life, high-quality audio, etc.
Shrinking OLED pixels for 8K is going to reduce lifespan even further, too. Corporations want everyone convinced that they need 8K because all this pixel-pushing has become their way of accelerating planned obsolescence, to a hysterical degree. We had standard definition forever, but suddenly we're supposed to need a new television set every few years. Instead of going from 720 HD to a clean doubling at 1440, we got the stupid 1080 standard, whose resolution was too low. Now that we're getting 4K, which is more than enough for television and home film viewing, we're supposed to need double that. BS.
People should just use the Blur Busters aliasing test to determine how much resolution they really need.
If you can see any aliasing on your monitor, that means your eyes are resolving more detail than your monitor can display.
All of the classic visual acuity tests only account for part of your visual system: through hyperacuity effects such as vernier alignment (the mechanism behind spotting aliased edges), you can detect detail up to roughly 10x finer than the technical limit of "visual acuity".
There certainly is a point of diminishing returns, but 4K isn't it: chances are 90% of people will be able to fully resolve high-contrast detail on a 4K screen covering the same field of view as what they're using now.
8K will get closer to the limits for a lot of people, but certainly not everyone, and probably not even the majority.
If I'm going to have a monitor with enough pixel density to just barely eliminate static (non-moving) aliasing, it needs to be a 30" 8K monitor.
Really, all the people who measure these things by the minimum required resolution are doing it backwards: if you want "perfect" image quality, then you want to be absolutely sure that your display is giving you more resolution than you can see.
I should be asking for a 27" 8K monitor.
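For anyone who wants to check my arithmetic, here's a quick sketch of the angular-resolution math. The ~60 pixels/degree figure for 20/20 acuity is the standard one; the 24" desktop viewing distance is my own assumption:

[CODE]
import math

def pixels_per_degree(diag_in, h_res, v_res, distance_in):
    """Angular pixel density at the center of a flat screen, viewed on-axis."""
    aspect = h_res / v_res
    width_in = diag_in * aspect / math.sqrt(1 + aspect ** 2)
    px_per_inch = h_res / width_in
    # inches subtended by one degree of visual angle at this distance
    inch_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return px_per_inch * inch_per_degree

# 30" 16:9 monitors at an assumed 24" desktop viewing distance
for label, h, v in [("4K", 3840, 2160), ("8K", 7680, 4320)]:
    print(f'30" {label}: {pixels_per_degree(30, h, v, 24):.0f} px/degree')

# 4K lands right around the classic ~60 px/degree 20/20 limit;
# 8K roughly doubles it, which is the headroom you need once
# hyperacuity (spotting aliased edges) is in play.
[/CODE]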
As for movie applications, apparently IMAX film is equivalent to about 12K resolution (https://en.wikipedia.org/wiki/IMAX), and all you need to do to get the same field of view as an IMAX screen (according to the original specifications: http://www.lfexaminer.com/20090522a.htm) is make your farthest viewing distance equal to the screen width. So if you have an 82" screen and you sit six feet away, you've got the same horizontal field of view as IMAX.
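Sanity-checking that with a couple of lines of Python (the 16:9 aspect ratio for the 82" screen is my assumption):

[CODE]
import math

def horizontal_fov(diag_in, distance_in, aspect=16 / 9):
    """Return (horizontal FOV in degrees, screen width in inches)."""
    width_in = diag_in * aspect / math.sqrt(1 + aspect ** 2)
    return math.degrees(2 * math.atan(width_in / (2 * distance_in))), width_in

fov, width = horizontal_fov(82, 72)  # 82" screen, 6 ft (72") away
print(f'82" screen is {width:.1f}" wide; horizontal FOV at 72" is {fov:.1f} deg')

# Whenever viewing distance equals screen width, the FOV is the same
# no matter the screen size: 2*atan(1/2) ~= 53.1 degrees.
print(math.degrees(2 * math.atan(0.5)))
[/CODE]

An 82" 16:9 screen is about 71.5" wide, so six feet away is almost exactly one screen width, which is why the numbers line up.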
Bring on the 12K TVs.