Contrast ratio is not consistent across brightness ranges.
For example, I would say the black level looks pretty darn good at a 1% brightness setting, but the rest of the screen is so dark that it wouldn't matter even if it still held a 1000:1 contrast ratio; it just doesn't look as good. And in practice there's probably no longer a 1000:1 contrast ratio at that brightness level, because contrast ratio is a function of brightness.
Now it's also important to understand what we are trying to replicate. For example, a lizard on a tree branch during the day has a certain luminance to it, which can be represented more realistically with higher image brightness. Color isn't the only thing that matters; the brightness of those colors also matters for 'immersion'. Looking at things at low brightness settings is more akin to looking at the world through sunglasses. I wouldn't exactly call that accurate or desirable picture quality.
HDR screens, for example, actually have relatively poor black levels; it's just that they're so bright that the contrast between the brightest and darkest parts of the image is greater. Contrast is, in essence, a concept of relativity. These screens put out well over 1000-2000 nits of brightness, at least 3-4 times your average screen. Are they better at blocking light? Not really. They might have some individual backlight zones for local dimming, but even that is hit and miss and highly situational. So even if the blacks bleed twice as much light, the brightest areas are still more than twice as bright, maybe even 5 times as bright, and that difference translates into contrast.
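As a rough sketch of that arithmetic (the nit figures below are made-up placeholders for illustration, not measurements of any particular panel):

```python
# Hypothetical SDR panel: 350 nit peak white, 0.35 nit black -> 1000:1
sdr_white, sdr_black = 350.0, 0.35
print(sdr_white / sdr_black)   # 1000.0

# Hypothetical HDR panel: the blacks "bleed" twice as much light,
# but peak white is ~4x brighter -> the ratio still comes out ahead
hdr_white, hdr_black = 1400.0, 0.70
print(hdr_white / hdr_black)   # 2000.0
```

Even with worse absolute black levels, the much higher peak brightness is what wins the contrast comparison.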
Or to put it simply, one can just look at the very definition of contrast ratio:
"The contrast ratio is a property of a display system, defined as the ratio of the luminance of the brightest color (white) to that of the darkest color (black) that the system is capable of producing."
Reading that, one can immediately see that at a low brightness setting the luminance difference between the brightest and darkest colors really isn't that great, and therefore the contrast isn't there either.
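Here's a minimal sketch of that effect, assuming (purely for illustration) that white luminance scales down with the brightness setting while the effective black level bottoms out at a small floor from panel leakage and ambient reflection; the numbers are made up:

```python
def contrast_ratio(white_nits, black_nits, black_floor=0.1):
    # Effective black can't drop below the floor (leakage + room light)
    return white_nits / max(black_nits, black_floor)

# 100% brightness: 350 nit white, 0.35 nit black -> 1000:1
print(contrast_ratio(350.0, 0.35))    # 1000.0

# 1% brightness: white falls to ~3.5 nits, black hits the 0.1 nit floor -> 35:1
print(contrast_ratio(3.5, 0.0035))    # 35.0
```

Under those assumptions, the ratio collapses as the backlight drops, which is the point about contrast being a function of brightness.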
So you actually need brightness, not only to increase contrast, but also to be able to see it relative to what's being represented, and what's generally being represented is the real world.
While you may prefer a low brightness, that isn't getting the best picture quality out of your monitor's specs, which is really what I was discussing, especially since one would expect such high/odd settings to push the monitor completely outside its gamma and wash out the colors.
I know that for most of the life of this monitor I never bothered tinkering with it much beyond a few ticks here and there on the contrast, and a few ticks here and there on the color temperature to make the picture warmer or get rid of some green hue cast. Even when I did fiddle with these settings, the relationship between contrast and color temp wasn't immediately apparent, so you'd change one, it would start to wash out or give the picture an undesirable color tint, and then you'd back off. Even sites like TFT Central suggested only a few ticks here and there to calibrate the monitor, and the picture always looked okay, maybe acceptable, but nothing really approaching these new settings.
Now maybe (maybe not) there's some Adobe RGB color space inaccuracy being introduced, but the picture undoubtedly looks better. I don't do printing work, so it's not an issue for me either way. I see it more as tapping into a bit of potential in the panel that the manufacturer didn't officially include, kind of like how some GPUs used to have extra locked cores that a simple BIOS update would unlock, basically turning a cheaper card into a more expensive one. And that makes sense; it's safe to say these panels have more capability than they shipped with. The first ones shipped with 100Hz ULMB, the next ones shipped with 120Hz ULMB, and later ones shipped with 165Hz rather than 144Hz, all on the same panel/hardware.