Originally Posted by Spartan F8
If you use CRU and modify the monitor driver files you can change the native resolution of your display, and it is not downsampling. Whether it's a higher or a lower resolution, you can change it. Just like binning a downclocked CPU, manufacturers may have done this with some monitors: the panel is capable of a certain resolution, but the majority (or a minority) of the panels could not handle that resolution at the standard 60Hz refresh rate, therefore they may have lowered the resolution to accommodate. This has actually been seen with TVs all the time. For example, many Westinghouse TVs are CHEAP, but the three that I have owned would never display the native 1080p resolution at 60Hz; if you set the resolution to something like 1820x980, it would. So in this sense they underclocked the refresh of a bad screen to keep their FHD marketing term. By the same token, a PC monitor may have had its resolution reduced to reach 60Hz.
Another good example: we have a Dell monitor at my office that runs at 1600x900@60Hz, but if you set it in CRU to 1920x1080@47Hz it will work, though with a noticeable sharpness issue in the bottom-left portion of the screen. Using CRU with scaling set to the monitor, there is no way this monitor is downsampling; it is running 1080p, and the panel is clearly capable of it. But given the defect you see at that resolution, I know why they would set it back to 1600x900 instead of letting a whole bin of monitors possibly go to waste.
I am not arguing against the fact that, once a panel is made, the MAX resolution is the max resolution. I am saying the manufacturer may not have set it to that maximum, due to sometimes obvious problems, just like any other computer component. I mean, every 1080p monitor may really be 1440p, and every 1440p monitor could be higher.
I think we have a confusion in terms here.
The way you're using "native resolution" is not what I mean by native resolution.
Here is what I understand:
Native resolution only makes sense in the context of the LCD panel itself: a given panel has a given native resolution, and it cannot be changed. The panel is the panel. You cannot change native resolution via software, since native resolution is a hardware concept, and it applies only to the LCD panel, not the monitor.
For LCDs, no matter what kind of signal you feed them, they will always display at their native resolution; a 1080p panel will display 1080p no matter what.
Let's say you have a 1440p monitor.
You feed it a 1440p signal - things are at size 100%, it displays a 1440p image.
You feed it a 1080p signal - things are at ~178% size by area (larger, less desktop space), and it displays a 1440p image, upsampled from the video signal. This upsampling is done in the monitor scaler.
You feed it a 2880p signal - things are at 25% size by area (smaller, more desktop space), and it displays a 1440p image, downsampled from the video signal. This downsampling is done in the monitor scaler.
There's another option: video card scaling. Here, the video card renders at 720p/1080p/2880p, upsamples/downsamples that to a 1440p signal, then sends the monitor a 1440p signal, which is displayed at 1440p.
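Those size percentages are area ratios - how much screen real estate a desktop element covers when the signal is stretched or shrunk to fill the panel. A minimal Python sketch of the arithmetic (note that 1080p on a 1440p panel works out to 16/9, about 178% by area):

```python
def apparent_size(native, signal):
    """Fraction of on-screen area a desktop element covers when `signal`
    is scaled to fill a panel with `native` resolution, relative to its
    size when the signal matches the native resolution."""
    native_w, native_h = native
    signal_w, signal_h = signal
    return (native_w * native_h) / (signal_w * signal_h)

panel = (2560, 1440)  # 1440p panel

print(f"{apparent_size(panel, (2560, 1440)):.0%}")  # 100% - native signal
print(f"{apparent_size(panel, (1920, 1080)):.0%}")  # 178% - upsampled 1080p
print(f"{apparent_size(panel, (5120, 2880)):.0%}")  # 25%  - downsampled 2880p
```

The same ratio applies whether the monitor's scaler or the video card does the resampling; only where the work happens differs.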
Whatever the case may be, the native resolution never changes. Your Dell may be processing a 1080p signal @47Hz, but its native resolution is still 1600x900. That's how many discrete pixels are on the panel. I do not see by what process a 1080p panel could possibly be sold as a 900p panel; they aren't manufactured that way, they aren't binned that way, and they definitely can't fail the way CPUs do that allows them to be sold as lower-clocked parts.
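On the resolution-vs-refresh tradeoff in that Dell example, the rough arithmetic is a pixel-rate budget. A hedged sketch, counting only active pixels and ignoring blanking intervals (real pixel clocks are meaningfully higher, per the monitor's actual timings):

```python
def pixel_rate(width, height, refresh_hz):
    """Raw active-pixel rate in pixels per second, ignoring blanking."""
    return width * height * refresh_hz

# Stock mode vs. the CRU-forced mode from this thread:
stock = pixel_rate(1600, 900, 60)    # 86,400,000 px/s
forced = pixel_rate(1920, 1080, 47)  # 97,459,200 px/s

# Refresh rate at which 1920x1080 would match the stock mode's raw rate:
matched_hz = stock / (1920 * 1080)   # ~41.7 Hz
```

Note the forced 1080p@47Hz mode's raw rate actually comes out ~13% above the stock 900p@60Hz rate, so the real ceiling is presumably set by the panel and scaler rather than a hard link budget; this sketch only shows the arithmetic, not where any given monitor's limit sits.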
I can see how what you say applies to CRT monitors, which have a "max" resolution as opposed to a native resolution... but LCDs don't work that way.