Originally Posted by Mand12
I will point out though that 4k *is* useless for TVs. They are, actually, far enough away so that you won't be able to see the difference.
How far away do you sit?
From my calculations the absolute maximum a person could make use of at 9 feet is about 200 DPI, but that is a little overkill (beyond the point of just smoothing jaggies). Even if we stick with 100 DPI, half the absolute maximum I can see, a 4K screen should be no smaller than about 44 inches at 9 feet, which is pretty small. When I do jaggie tests on my laptop's 100 DPI screen, 9 feet is just about where the jaggies start to blur. So for more average seating, a 60 inch 4K display at 12 feet is about right.
If you want to avoid aliasing, then really you should say that a 4K screen should be no larger than 60 inches at 12 feet. It's almost backward that everyone seems to want to be able to see the individual pixels on their screen, lowering image fidelity compared to their normal vision of the world around them.
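If anyone wants to check those numbers, here's a quick Python sketch of the arithmetic I'm using. It assumes a 16:9 panel and treats my ~100 DPI at 9 feet figure as the reference point, scaled inversely with distance, so it's a back-of-the-envelope check rather than anything rigorous:

```python
import math

def panel_ppi(diagonal_in, h_pixels=3840, v_pixels=2160):
    """Pixels per inch of a panel from its diagonal size and resolution."""
    aspect = h_pixels / v_pixels                        # 16:9 for 4K UHD
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    return h_pixels / width_in

def target_ppi(distance_ft, ref_ppi=100.0, ref_distance_ft=9.0):
    """DPI target scaled inversely with viewing distance.

    ref_ppi at ref_distance_ft is my assumption (~100 DPI at 9 feet);
    double it if you want the 200 DPI 'absolute maximum' case instead.
    """
    return ref_ppi * ref_distance_ft / distance_ft

print(panel_ppi(44))    # ~100 PPI, so a 44" 4K panel hits 100 DPI almost exactly
print(panel_ppi(60))    # ~73 PPI for a 60" 4K panel
print(target_ppi(12))   # ~75 DPI needed at 12 feet, so 60" at 12 feet lines up
```

Plug in your own seating distance and screen size to see where your setup falls.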
Once these displays and devices become more common, the average person will just love them. Given how close you hold a tablet or phone to your face, I have to agree with the Forbes article that the pixels per degree of 4K mobile devices won't be wasted at all, to the point that we could probably use 8K tablets as well (a 10 inch 8K panel is roughly 800 DPI, which viewed at 1 foot is right on target).
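Same back-of-the-envelope check for the tablet case, again assuming a 16:9 panel and the same 100-DPI-at-9-feet scaling; the exact number depends on aspect ratio, but it lands in the same ballpark as my round 800 DPI figure:

```python
import math

# 10" 8K tablet (7680x4320, 16:9 assumed)
width_in = 10 * (16 / 9) / math.hypot(16 / 9, 1)   # ~8.7 inches of panel width
print(7680 / width_in)                              # ~880 PPI on the panel
print(100 * 9 / 1)                                  # ~900 DPI target at 1 foot viewing
```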
I don't like the idea of 8K on a phone, though; holding a device 6 inches from your face just sounds like too much.
On another subject, my bet is that 8K and 4K will be the division between "premium" and "average" content for the next 50-100 years (until we find better display methods). 4K will be the standard that everyone uses, but high-end movies and such will shoot for 8K. Right now the Japanese are aiming to have 8K broadcasting ready for the 2020 Olympics.

Edited by ILoveHighDPI - 3/7/14 at 9:52am