I'm someone who really hates aliasing, which was my main motivation for moving to 4K in the first place, and to me, 4K seems to be enough resolution. The jump from 1080p to 4K is huge, in no small part because 4K is enough to basically eliminate aliasing issues from games entirely with no AA or only light AA. The jump from 4K to 8K would be massively less noticeable, and the benefit in fidelity versus the GPU render cost just isn't there like it is for 4K, and it never will be. 8K gaming will not be as common in ten years as 4K gaming is now, because going from 4K to 8K does not solve the aliasing problem; it's already solved at 4K. 8K gaming won't be a thing until GPU power and 8K screen manufacturing costs are so low that there isn't much reason not to do it.
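To put rough numbers on the render-cost side of that argument, here's a quick back-of-the-envelope sketch (standard resolution dimensions; the exact GPU cost per pixel varies by game, so this only shows the raw pixel-count scaling):

```python
# Pixel counts for the three common resolutions discussed above.
resolutions = {
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

# Each step up quadruples the number of pixels the GPU has to shade.
print(pixels["4K"] / pixels["1080p"])  # 4.0
print(pixels["8K"] / pixels["4K"])     # 4.0
print(pixels["8K"] / pixels["1080p"])  # 16.0
```

So 8K asks for another 4x the work of 4K (16x that of 1080p), while the visible payoff at the second jump is far smaller than at the first.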
For all of those reasons and more, 4K is going to be the standard for a long, long time. 1080p is too low to eliminate artifacts, but 4K isn't, and 8K is excessive and wasteful, with a terrible cost-benefit ratio.
8K is a novelty, and it's going to remain that way for far longer than 4K did. Most movies today still aren't even filmed in 4K, and it wasn't until the last few years that movie theaters upgraded to 4K projectors. If you saw the first Avengers movie in theaters, it was probably shown at well below 4K at the time, and the newest Avengers movies aren't even truly 4K in most of their shots, for a variety of reasons.
That said, you could use upscaling for PC games if you happened to have an 8K monitor and wanted to render a game at 4K. DLSS probably wouldn't work, but some of the other upscalers would probably work just fine today. Still, seeing that there's going to be basically zero 8K content for a long, long time, I don't see 8K becoming common enough for that situation to come up often enough to even be worth mentioning 5+ years from now. I'd bet the PS6 won't even have a strong 8K focus. The need just won't be there.
i7 7700K @ 4.2GHz
16GB DDR4 3200MHz
GeForce 1080 Ti
Last edited by UltraMega; 07-31-2019 at 11:12 PM.