Originally Posted by airisom2
Okay, so you're telling me that the old CRT monitors have variable pixel densities depending on which resolution is chosen, and the higher the resolution, the denser the pixels? On the LCD side, you're saying that almost all monitors have a static pixel density that is common among them, give or take some, and that no matter what resolution it's set to, the density won't change? With that said, does that mean that the higher the pixel density is, the less AA you need to use? What I was basing my previous post on is that a 1920x1200 30in monitor may need more AA than a 30in 2560x1600 monitor because it has fewer pixels in the same area. Am I correct? This is new to me, and I'm trying to understand it.
Well, I don't know of any 30" monitors @ 1920x1200, but yes: if we take a 27" monitor @ 1920x1080 and compare it to a 27" monitor @ 2560x1440, the latter will need less AA. Of course, the latter will also be more GPU intensive because it's a higher resolution, just like cranking up the AA to make 1920x1080 look as smooth and jaggie-free as 2560x1440 (at the same screen size) would be GPU intensive.
But, for instance, if we compare a 23-24" monitor @ 1920x1200 to a 30" monitor @ 2560x1600, we could argue that both need AA about equally, since their pixel densities come out nearly the same (roughly 94-98 PPI vs. ~101 PPI).
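If you want to sanity-check those comparisons yourself, here's a quick Python sketch using the standard PPI formula (diagonal pixel count divided by diagonal size in inches); the sizes are just the ones from this thread:

[code]
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

for w, h, d in [(1920, 1080, 27), (2560, 1440, 27),
                (1920, 1200, 24), (2560, 1600, 30)]:
    print(f'{w}x{h} @ {d}": {ppi(w, h, d):.1f} PPI')

# 1920x1080 @ 27": 81.6 PPI
# 2560x1440 @ 27": 108.8 PPI
# 1920x1200 @ 24": 94.3 PPI
# 2560x1600 @ 30": 100.6 PPI
[/code]

So the 24" 1920x1200 and the 30" 2560x1600 really do land within a few PPI of each other, while the two 27" panels are much further apart.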
Ideally we'd get to a pixel density in the 300+ PPI range (like what we see on smartphones and tablets, or high-end magazine print), where AA would be pretty much irrelevant. Granted, tripling the linear density from ~100 PPI means 3² = 9+ times the pixels per square inch of screen space, which would destroy our current hardware.
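To put that 300 PPI figure in perspective, here's what a hypothetical 27" 16:9 panel at that density (my assumption for the example, not a real product) would have to push:

[code]
import math

diag_in = 27
aspect_w, aspect_h = 16, 9
target_ppi = 300  # the "AA becomes irrelevant" ballpark from above

# Panel width/height in inches, derived from the diagonal and aspect ratio
width_in = diag_in * aspect_w / math.hypot(aspect_w, aspect_h)
height_in = diag_in * aspect_h / math.hypot(aspect_w, aspect_h)

w_px = round(width_in * target_ppi)   # ~7059
h_px = round(height_in * target_ppi)  # ~3971
print(f"{w_px}x{h_px} = {w_px * h_px / 1e6:.1f} megapixels")
print(f"vs 2560x1440 = {2560 * 1440 / 1e6:.1f} MP "
      f"({w_px * h_px / (2560 * 1440):.1f}x the pixels)")
[/code]

That's roughly 28 megapixels per frame, about 7.6x what a 2560x1440 panel pushes, which is why today's GPUs wouldn't stand a chance.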