Originally Posted by bigjdubb
Originally Posted by ILoveHighDPI
Maybe your version of "photorealism" comes off a Polaroid, but the photography industry seems to have settled on 20 megapixels as the standard for most applications; that's approximately 6K resolution (if you squish the same number of pixels into a 16:9 aspect ratio).
I'm not suggesting that 5K120hz immediately become the standard target resolution for games either, but with every new generation of hardware, old software becomes easy to adapt to high resolutions and framerates, and with every year that goes by we have more and more amazing games that fit into that category.
The practical reality is that the vast majority of any gaming library will soon be playable at 5K-8K resolution, and the lack of high-resolution monitors means the potential of our new hardware is being wasted when applied to old games.
Well, no game in existence will look photorealistic at 6K, 8K, or even 100K. Resolution has little to do with photorealism.
So, is a cluster of four pixels photorealistic?
Resolution has to be part of the equation somewhere.
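For what it's worth, the quoted 20 MP figure does check out. Here's a quick back-of-the-envelope sketch (the 16:9 reshaping is the assumption from the quote above):

```python
import math

# Check the quoted claim: ~20 megapixels reshaped into a 16:9 frame.
# With total pixel count P and aspect ratio 16:9, the width is
# w = sqrt(P * 16/9) and the height is h = w * 9/16.
pixels = 20_000_000
width = math.sqrt(pixels * 16 / 9)
height = width * 9 / 16

print(f"{width:.0f} x {height:.0f} = {width * height / 1e6:.1f} MP")
# -> 5963 x 3354 = 20.0 MP, i.e. roughly a 6K-wide frame
```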
Traditionally, achieving photorealism wasn't much of a question; the technology was so far from the ideal that no one really wondered about the specifics of that goal.
Then, when "HD" came along, the people pushing the new technology used whatever means necessary to convince bigwigs and the mass media that this was the greatest thing that could ever happen, and to invest in changing standards.
The marketing was so effective that people now reject newer technology, because it was drilled into everyone's mind that "HD" is the best thing that could ever possibly exist.
The technical capabilities of new standards aren't the problem; changing established standards is just a huge problem economically.
Realistically, both should exist simultaneously.
1080p graphics are fine for some things, and I would still argue that higher resolution always makes everything look better, but the economic reality is that 4K has four times the pixels of 1080p, so it needs roughly 4X as much processing power to render an image.
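Just to put numbers on that (a back-of-the-envelope sketch, assuming render cost scales linearly with shaded pixels, which is a simplification):

```python
# Rough sketch of the "4X as much processing power" claim, assuming
# rendering cost scales linearly with the number of shaded pixels
# (a simplification: geometry, CPU, and memory costs don't all scale this way).
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_4k    = 3840 * 2160   # 8,294,400 pixels
print(pixels_4k / pixels_1080p)  # 4.0 -- exactly four times the pixels
```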
Some applications should sacrifice graphical fidelity for the absolute highest resolution and framerate possible, while others are mostly unaffected and would be better served not going that route.
For example, I think DOOM is best played at 4K120hz. If the biggest attraction of that game is the challenge it presents, then graphics are practically meaningless; I just want the most precise "pointing" mechanism I can possibly get, and the only two metrics that really matter are framerate and resolution. You could probably change all the enemies to sprites and it would barely affect gameplay (come to think of it, that would be an awesome bonus mode if they could turn all the enemies into sprites).
id Software correctly prioritised 1080p and 60fps as critical points in making DOOM the best shooter they could possibly produce for consoles. On PC we can have more, and for this kind of game there really isn't any good reason not to.
Conversely, something like Resident Evil probably doesn't "need" any more resolution or framerate than is currently available on consoles, because the gameplay design is all about using ambience to create suspense, and the combat mechanics are very methodical.
The point is that people should stop trying to apply graphical standards universally, as though everything is best played at a given specification, and instead try to apply technology where it is relevant to a given game.