Originally Posted by ILoveHighDPI
Here's where it gets frustrating seeing everyone ask for lower resolutions.
Four out of the five items on your next-gen wish list are CPU bound; dropping resolution provides no tangible benefit for them whatsoever.
Some people love SSAO and bokeh DOF (the "Effects" category is the only one that is GPU bound), but I have always turned those off in every game: on top of the massive framerate boost, you also gain a clean aesthetic, which again is just better for gameplay in shooters.
What's more, in the case of consoles it's not even a case of allocation of silicon.
The advancement of CPU technology is severely limited; there is just nothing more that can be done to improve most of the aspects of games that people are asking for (improved physics, draw distance, and item/character population). The PS5/XSX have done practically everything they can to improve the CPU.
I'd appreciate more info or sources for such metrics, but I'll also try to research on my own.
I assumed that scale (more geometry, larger spaces), density (e.g., more foliage to go with those larger spaces), particles, physics, and effects (shader quality?) can lean, and mostly do lean, toward GPU power. Aren't shadows and lighting more on the GPU side of things too, like ray tracing? As for draw distance, doesn't UE5 help there with how much its LOD handling has improved?
Isn't the UE5 demo a showcase of how dropping resolution can improve image quality on consoles? I was thinking the drop frees up GPU resources for rendering more detailed worlds with better rendering techniques. An extreme example, and not the best one I could think of: Uncharted 4 at 1080p could look better overall to me than a hypothetical 2K or 4K Uncharted 3, especially in motion.
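The arithmetic behind "drop the resolution, free up the GPU" is easy to sketch. Assuming the simplification that GPU cost scales roughly with pixel count (real cost depends on the renderer), a quick calculation shows how much budget 1440p frees versus 4K:

```python
# Rough pixel-count arithmetic behind "drop resolution, free GPU budget".
# Assumption (a simplification): GPU work scales roughly with pixels rendered.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

ratio = pixels["4K"] / pixels["1440p"]
print(f"4K renders {ratio:.2f}x the pixels of 1440p")  # 2.25x
```

So a demo rendering at 1440p has, very roughly, more than twice the per-pixel budget of the same hardware pushing native 4K, which is the headroom that can go into geometry and lighting instead.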
The demo didn't seem to be a test of the CPU, though, but I could be wrong. I assume a heavy CPU workload would mean more, and smarter, AI and better animation. In my opinion, the UE5 demo at 1440p/30 FPS at its peaks looks better than comparable scenes on current-gen consoles at 4K. It's mostly in fairly static environments, though; I think RDR2 and Uncharted have superior animation sets, but that comparison might be unfair to the demo.
Originally Posted by m4fox90
There should be no interest on anybody's part in perfecting a resolution from a decade and a half ago. It's long since time to move on from 1080p.
Originally Posted by dagget3450
Just need blurry filters to make it more blurry and pass it off as artistic. They are perfecting 1080p now, just pretending it's 4K for marketing purposes.
1080p/30 FPS doesn't seem to be perfected by consoles even today: a lot of games can't truly maintain a stable 1080p (some use dynamic resolution) or a consistent 30 FPS, and this is where I see a good tandem with the PC. The FPS side might be alleviated by displays with VRR, though, but I'm not sure.
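The dynamic resolution mentioned above is basically a frame-time feedback loop. Here is a minimal sketch of that kind of heuristic, assuming a simple controller that nudges an internal render scale against a 30 FPS budget (the function, constants, and step size are hypothetical, not any engine's actual implementation):

```python
# Hypothetical dynamic-resolution feedback loop: when a frame blows the
# 30 FPS budget, lower the internal render scale; when there is clear
# headroom, raise it back toward native. Constants are illustrative only.
TARGET_MS = 33.3                 # frame budget for 30 FPS
MIN_SCALE, MAX_SCALE = 0.7, 1.0  # e.g. 756p..1080p internal resolution

def next_render_scale(scale: float, frame_ms: float) -> float:
    """Return the render scale to use for the next frame."""
    if frame_ms > TARGET_MS:          # over budget: render fewer pixels
        scale -= 0.05
    elif frame_ms < TARGET_MS * 0.9:  # clear headroom: sharpen back up
        scale += 0.05
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# A 40 ms spike at native resolution drops the scale one notch.
print(round(next_render_scale(1.0, 40.0), 2))  # 0.95
```

This is why "1080p" on consoles is often a ceiling rather than a constant: the output stays 1080p, but the internal resolution quietly slides around under load.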
Let consoles focus on most things other than resolution, because the design of big games is only uplifted when new generations kick in (see Unreal 4 and 5 arriving with the PS4 and PS5), while the PC keeps its amazing scalability and powers through with a sharper look at higher resolutions and better, more stable framerates.
I do understand that devs focusing on framerate and resolution also helps the PC, since it could make even higher resolutions and framerates achievable on lower-cost hardware, though that might be an easy statement of mine to confuse. I just feel that the PC's headroom for polishing up console graphics can seem almost infinite. I have a feeling I could be very wrong on this one too, as I think even high-end PCs with 12 cores and a 2080 Ti can't easily do 4K/120 FPS/Ultra in a game like RDR2, but I do wonder if a game like Doom Eternal can.
Mostly for marketing, I think, but if the lower-end Xbox is real, I'd assume it would target a 1440p/2K internal resolution, before any upscaling, because 1080p might be an easy tease against it. I do wonder if we'll be seeing 1080p upscaled to 1440p/2K/4K on the supposedly cheaper S, though.