Why do people continue to bash HDR400 monitors?
It's clear that HDR400 monitors aren't that good at HDR, given the low peak brightness, the general lack of FALD, and so on...
...but I think every wide-gamut monitor should now support HDR, because only with HDR support can a wide-gamut monitor correctly manage the color space of HDR content.
Most games/films look far too oversaturated on wide-gamut panels, because without color management the panel stretches sRGB content across its wider native gamut. This is completely solved by using HDR on both the monitor and the content (see the sketch below).
Basically, using HDR with content like films or games is like enabling color management.
So for me, HDR support is really important on every wide-gamut monitor, even if the monitor is only HDR400.
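To make the oversaturation point concrete, here is a minimal Python sketch of the gamut mismatch. The two matrices are the standard linear-RGB-to-XYZ conversions for sRGB and Display P3; everything else (the function names, the choice of pure red) is just my illustration, not anything a real monitor's firmware runs.

```python
# Why unmanaged wide-gamut playback oversaturates: an sRGB red sent
# straight to a P3-native panel lands on the P3 red primary instead of
# the sRGB one. Color management would convert through XYZ first.

# Linear RGB -> CIE XYZ (D65) matrices from the sRGB and Display P3 specs
SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]
P3_TO_XYZ = [
    [0.4866, 0.2657, 0.1982],
    [0.2290, 0.6917, 0.0793],
    [0.0000, 0.0451, 1.0439],
]

def mat_vec(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def chromaticity(xyz):
    """Collapse XYZ to CIE xy chromaticity (hue/saturation, no brightness)."""
    x, y, z = xyz
    s = x + y + z
    return (round(x / s, 3), round(y / s, 3))

red = [1.0, 0.0, 0.0]  # pure red, linear light

# Managed: the content is sRGB, so interpret it with the sRGB matrix.
print("sRGB red, managed:   xy =", chromaticity(mat_vec(SRGB_TO_XYZ, red)))
# Unmanaged: the panel drives its own P3 primaries with the same values.
print("sRGB red, unmanaged: xy =", chromaticity(mat_vec(P3_TO_XYZ, red)))
# The unmanaged red sits at the P3 primary (0.680, 0.320) instead of
# sRGB's (0.640, 0.330) -- the "oversaturated" look described above.
```

An HDR signal sidesteps this because the content declares its color space, so the monitor maps it to the panel's gamut instead of guessing.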
Because a properly calibrated monitor sits around 180 cd/m². There's no need for a monitor as bright as 400 cd/m². Calibrated monitors are accurate as they are; we don't need such extreme brightness levels. We are not operating our computers on the surface of the Sun.
That's all humans need.
HDR ruins calibration, as does the gimmick of reducing blue light to make it "easier" on the eyes. Calibration is about accurate colors and white balance, not colors that "pop out" at you. Monitor calibrators exist for a reason.
What people forget is that we have something in our eyes called a pupil, which adjusts to ambient light. In a darkened room, a dimly lit TV can appear just as bright as an HDR600 television on the showroom floor at Best Buy. It's all about the ambient lighting. This is why movie theaters are dark: so your eyes adjust to the relatively dim bulb of the projector without the screen being washed out by ambient light. Of course, ambient light shouldn't be illuminating a television screen anyway, but at what point will TVs and monitors become TOO bright to comfortably watch in the dark?
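For a rough sense of how little extra perceived brightness those big nit numbers buy, here's a back-of-envelope sketch using Stevens' power law (perceived brightness scaling roughly with luminance^0.33 for a dark-adapted viewer, one commonly cited model). The 180 and 600 cd/m² figures just echo the numbers in this thread; treat this as a crude illustration, not a measurement.

```python
# Diminishing returns of luminance under Stevens' power law.
# Exponent 0.33 is the commonly cited value for brightness perception.
calibrated = 180.0   # cd/m^2, typical SDR calibration target mentioned above
showroom = 600.0     # cd/m^2, an HDR600 panel at full blast

ratio = (showroom / calibrated) ** 0.33
print(f"{showroom:.0f} nits looks only ~{ratio:.2f}x brighter than {calibrated:.0f} nits")
# -> roughly 1.5x perceived, despite 3.3x the luminance; and once the
# pupil adapts to a dark room, the gap narrows further.
```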