Originally Posted by phill1978
Just as has been discussed (as per the Vsauce episode on resolution)...
Ugh, he brought up some interesting information, but still missed the boat.
It seems like there's a segment of the population that views it as a tragedy that they can't distinguish individual pixels on their TV, as though seeing each one individually is what they paid for. In reality, sitting "too far away" is exactly what you want: it ensures that the information-gathering ability of your eye, the most important thing in this whole situation, is actually saturated. You have to sit "too far" away to get the most realistic image.
Next are the bogus numbers people throw around for that limit. I've done my own tests with my own eyes, and 100 DPI at 9 feet doesn't sound unreasonable (my vision is somewhere between 20/20 and 20/15, just above average, confirmed at a driver's exam a few weeks ago). My test used the highest-contrast pattern I could come up with, which may sound biased at first, but do you actually want a screen you know isn't good enough under ideal conditions? And that 9 feet is just where pixels start to blur together, with the maximum usable distance going out to 18 feet.
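For what it's worth, the arithmetic behind figures like these is easy to check yourself. Here's a quick sketch (assuming the textbook "20/20 resolves 1 arcminute" figure; the function names and the 65" example are mine, not from any standard):

```python
import math

def pixel_angle_arcmin(dpi, distance_in):
    """Visual angle, in arcminutes, subtended by one pixel of pitch 1/dpi
    inches when viewed from distance_in inches."""
    return math.degrees(math.atan((1 / dpi) / distance_in)) * 60

def dpi_at_limit(distance_in, limit_arcmin=1.0):
    """DPI at which one pixel subtends exactly limit_arcmin arcminutes
    at the given viewing distance (the textbook 20/20 limit by default)."""
    return 1 / (distance_in * math.tan(math.radians(limit_arcmin / 60)))

# 100 DPI viewed from 9 feet (108 inches): about 0.32 arcmin per pixel,
# well below the textbook 1-arcmin limit.
print(round(pixel_angle_arcmin(100, 108), 2))

# DPI at which pixels hit the 1-arcmin limit at 9 feet: about 31.8.
print(round(dpi_at_limit(108), 1))
```

The point of the numbers: resolving 100 DPI at 9 feet means picking out detail around 0.3 arcminutes, finer than the textbook limit, which is exactly what high-contrast edges let your eye do and why the usual "you can't see it from there" figures undersell things.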
In other words, people should just use their TV like normal. The main reason people like having a bigger TV is the extra immersion from filling a larger field of view, and that's no more or less true now than it was before.
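If you want to put a number on that immersion, the field of view is just trigonometry on the screen width and seating distance. A minimal sketch (the 65"-at-9-feet example and function name are my own illustration):

```python
import math

def horizontal_fov_deg(diagonal_in, distance_in, aspect=16 / 9):
    """Horizontal field of view, in degrees, for a flat screen of the
    given diagonal and aspect ratio viewed from distance_in inches."""
    width = diagonal_in * aspect / math.hypot(aspect, 1)
    return math.degrees(2 * math.atan(width / (2 * distance_in)))

# A 65" 16:9 TV from 9 feet (108 inches) fills roughly 29 degrees.
print(round(horizontal_fov_deg(65, 108), 1))
```

Whatever angle you find immersive, resolution doesn't change it; only screen size and distance do.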
Thankfully we're actually getting demo units set up so people can look and decide for themselves, rather than relying on a bunch of useless figures from people pushing an anti-4K agenda.
I can definitely agree that bandwidth is an issue, but you have to start somewhere, and hopefully the Blu-ray consortium gets their 4K-spec discs out soon.
Next we need Hollywood to actually care, which may end up being the longest-lasting problem.