Originally Posted by DNMock
The only problem with that is that AA itself is on its last legs, since the need for AA goes down as resolution increases. For me, at least at 4K, I see either little to no improvement or a decrease in quality, with the image getting fuzzier.
There definitely are a lot of crappy AA methods out there.
Really, you need about 300 pixels per degree (PPD) before jaggies start to blur out; at a 27" screen size that still means 8K or higher resolution for the average person at the average viewing distance.
Yes, it's a scale of diminishing returns: basically your cost-vs-sharpness ratio is linear up to 1080p (~60 PPD), and then the relative cost of sharper imagery starts to rise. I'd say the cost-to-benefit of 4K is still practically the same as 1080p, but going to 8K is definitely a sharp drop in returns.
8K still isn't "no returns" for your effort, but it's getting close enough that I doubt anyone other than IMAX should legitimately consider going ahead to something like 16K.
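For anyone who wants to sanity-check those PPD numbers, here's a quick Python sketch of the underlying math. The 23.5" screen width and ~40" viewing distance are my own assumptions for a 27" 16:9 panel viewed head-on, so treat the outputs as ballpark figures.

```python
# Rough pixels-per-degree (PPD) calculator -- a sketch of the math behind the
# 60 PPD / 300 PPD figures above. Screen width and viewing distance are assumptions.
import math

def ppd(horizontal_pixels, screen_width_in, view_distance_in):
    """Average pixels per degree for a flat screen viewed head-on."""
    # Total horizontal field of view the screen subtends, in degrees.
    h_fov_deg = 2 * math.degrees(math.atan(screen_width_in / (2 * view_distance_in)))
    return horizontal_pixels / h_fov_deg

# A 27" 16:9 panel is roughly 23.5" wide; assume a ~40" viewing distance.
width, distance = 23.5, 40.0
for name, pixels in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f"{name}: {ppd(pixels, width, distance):.0f} PPD")
# -> 1080p lands near 60 PPD, 4K near 115, 8K near 235 at these assumptions;
#    hitting ~300 PPD on a 27" panel takes more than 8K or a longer viewing distance.
```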
The good news is that Variable Rate Shading should start to give us localized super sampling everywhere that it matters. Ideally it would work practically the same as MSAA, except now compatible with modern game engines.
If it's possible to send the full pixel mapping to the display with VRS, then your 4K+AA becomes Native 8K imagery.
If in the near future we see mass adoption of variable-rate resolution to sharpen high-contrast lines, it should mean that 8K could be utilized at zero added cost.
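To make the "spend samples where the contrast is" idea concrete, here's a toy NumPy sketch that assigns a per-tile sampling rate from local edge contrast. The 16-pixel tiles, the gradient-based contrast metric, and the rate thresholds are all illustrative choices of mine, not any real D3D12/Vulkan VRS API.

```python
# Toy sketch of contrast-driven variable-rate sampling: high-contrast tiles
# get more samples, flat tiles get fewer. All parameters are illustrative.
import numpy as np

def shading_rate_mask(luma, tile=16, hi=0.05, lo=0.01):
    """Return a per-tile rate map: 2 = supersample, 1 = native, 0 = coarse."""
    h, w = luma.shape
    rates = np.zeros((h // tile, w // tile), dtype=np.uint8)
    for ty in range(rates.shape[0]):
        for tx in range(rates.shape[1]):
            block = luma[ty*tile:(ty+1)*tile, tx*tile:(tx+1)*tile]
            # Use mean gradient magnitude as a cheap stand-in for edge contrast.
            gy, gx = np.gradient(block.astype(np.float32))
            contrast = np.abs(gx).mean() + np.abs(gy).mean()
            rates[ty, tx] = 2 if contrast > hi else (1 if contrast > lo else 0)
    return rates

# Fake 64x64 frame: flat background with one hard vertical edge at column 40.
frame = np.zeros((64, 64), dtype=np.float32)
frame[:, 40:] = 1.0
print(shading_rate_mask(frame))  # only the tile column containing the edge asks for rate 2
```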