Originally Posted by serp777
So a lot of people on here always say moar pixels are better, but at some point soon the human eye will reach a limit. I'd argue that you only notice half the improvement each time you double pixel density.
So jumping from 2K to 4K gives you twice as much perceived improvement as going from 4K to 8K, assuming a similar screen area. Same thing for 16K or 32K.
1080p on my desktop already makes movies and games look basically real, and although an absurd amount of resolution is cool as a novelty, it doesn't really offer substantial aesthetic or practical benefits. It simply requires more processing power, which needs more electricity, etc. My point is that it isn't worth the electricity increase. I'd rather have better power savings than moar pixel density beyond 1080p.
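To put the quoted claim in concrete terms, here's a toy calculation. The 0.5 decay factor and the step labels are just serp777's assumption restated, not measured perceptual data:

```python
# Toy model of the quoted claim: each doubling of pixel density is
# assumed to be only half as noticeable as the doubling before it.
# The 0.5 factor comes from the post itself, not from vision science.
steps = ["2K -> 4K", "4K -> 8K", "8K -> 16K", "16K -> 32K"]

gain = 1.0   # call the perceived improvement of the first doubling "1.0"
total = 0.0
for step in steps:
    total += gain
    print(f"{step:>10}: marginal gain {gain:.3f}, cumulative {total:.3f}")
    gain *= 0.5  # each further doubling assumed half as noticeable
```

Under that assumption the marginal gains form a geometric series, so the total perceived improvement can never exceed twice what the first doubling bought you, which is the whole "diminishing returns" argument in one number.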
As much as I love it when people talk about the possibility of screen resolutions higher than 8K, I actually think we should all stop there. While it's bizarre that all the "experts" seem to think 1080p is enough to match human vision, I definitely think 16K is overkill unless you have a wall-sized screen.
I'm basing this on my own visual acuity test, which I encourage other people to do so that we all have a better idea of how far resolution is worth pushing.
I'm using a 100 DPI monitor right now; at 18 feet away I can still see a single white pixel on a black background in a dark room. At 20 feet, the pixel just blends in with the signal noise in my eyes, i.e. I can't see it.
That's my baseline for the absolute maximum amount of resolution I could possibly use. Required density scales inversely with distance, so at 9 feet, half that distance, I would want at most double the density: 200 DPI, which works out to about a 45-inch panel at 8K (7680x4320). Since I don't think I actually "need" a screen that perfectly matches my visual system, a 60-80 inch screen will probably still look fine, and since my vision is a little better than most people's, I assume that's a good maximum resolution for most other people as well.
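If anyone wants to plug in their own numbers, here's the same arithmetic as a quick sketch. The 100 DPI / 18 ft baseline is just my measurement above; swap in whatever your own test gives you:

```python
import math

BASELINE_DPI = 100.0         # density of the monitor used for the test
BASELINE_DISTANCE_FT = 18.0  # farthest distance a single pixel was still visible

def max_useful_dpi(viewing_distance_ft: float) -> float:
    """Required DPI scales inversely with distance: halve the distance,
    double the density, for the same angular pixel size."""
    return BASELINE_DPI * BASELINE_DISTANCE_FT / viewing_distance_ft

def diagonal_inches(h_px: int, v_px: int, dpi: float) -> float:
    """Panel diagonal (in inches) that puts h_px x v_px at the given DPI."""
    return math.hypot(h_px, v_px) / dpi

dpi_9ft = max_useful_dpi(9.0)
print(f"Max useful density at 9 ft: {dpi_9ft:.0f} DPI")
print(f"8K (7680x4320) at that density: {diagonal_inches(7680, 4320, dpi_9ft):.1f} inch diagonal")
```

That prints 200 DPI and roughly a 44-inch diagonal, which is where the "45-inch panel at 8K" figure comes from; a bigger screen at the same resolution just means a lower DPI, which is fine as long as you sit proportionally further back.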