
·
Registered
Joined
·
2,556 Posts
Discussion Starter · #1 ·
Why do people continue to bash HDR400 monitors?

It's clear that HDR400 monitors aren't that good at HDR due to the low brightness, generally no FALD, and so on...
... but I think that every wide gamut monitor should now support HDR, because only with HDR support can a wide gamut monitor "manage the color space" of HDR content.

Most games/films look badly oversaturated on wide gamut panels, and this is completely solved by using HDR on both the monitor and the content.
Basically, using HDR on content like films or games is like enabling "color management".

So for me, HDR support is really important on every wide gamut monitor, even if the monitor is only HDR400.
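Just to show what I mean with some numbers (a rough numpy sketch, with Display P3 standing in for the wide gamut panel; this is not the pipeline of any real monitor):

Code:
# Rough sketch of why unmanaged sRGB content looks oversaturated on a
# wide gamut panel (Display P3 used here as a stand-in for the panel gamut).
import numpy as np

# sRGB linear -> XYZ (D65) and Display P3 linear -> XYZ (D65)
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
P3_TO_XYZ   = np.array([[0.4866, 0.2657, 0.1982],
                        [0.2290, 0.6917, 0.0793],
                        [0.0000, 0.0451, 1.0439]])
XYZ_TO_P3   = np.linalg.inv(P3_TO_XYZ)

def srgb_decode(v):   # simple 2.2 gamma approximation of the sRGB curve
    return np.power(v, 2.2)

def srgb_encode(v):
    return np.power(np.clip(v, 0, 1), 1 / 2.2)

srgb_red = np.array([1.0, 0.0, 0.0])          # pure sRGB red, encoded

# Color managed path: decode, sRGB -> XYZ -> P3, re-encode for the panel
managed = srgb_encode(XYZ_TO_P3 @ SRGB_TO_XYZ @ srgb_decode(srgb_red))

print("managed drive values  :", np.round(managed, 3))   # ~[0.92, 0.21, 0.16]
print("unmanaged drive values:", srgb_red)                # [1.0, 0.0, 0.0]
# Unmanaged, the panel shows its own much more saturated red -> "red faces".

The same sRGB red ends up as very different drive values once it is actually mapped into the panel's gamut, which is the "color management" step that HDR playback forces to happen.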
 

·
Registered
Joined
·
3,859 Posts
Why do people continue to bash HDR400 monitors?

It's clear that HDR400 monitors aren't that good at HDR due to the low brightness, generally no FALD, and so on...
... but I think that every wide gamut monitor should now support HDR, because only with HDR support can a wide gamut monitor "manage the color space" of HDR content.

Most games/films look badly oversaturated on wide gamut panels, and this is completely solved by using HDR on both the monitor and the content.
Basically, using HDR on content like films or games is like enabling "color management".

So for me, HDR support is really important on every wide gamut monitor, even if the monitor is only HDR400.

Because a properly calibrated monitor is around 180 cd/m². There's no need for a monitor as bright as 400 cd/m². Calibrated monitors are accurate the way they are. We don't need such extreme brightness levels. We are not operating our computers on the surface of the Sun.

180 cd/m²
Gamma 2.2

This is all humans need.


HDR ruins calibration, and so does the gimmick of reducing blue light to make it "easier" on the eyes. Calibration is about accurate colors and white balance, not colors that "pop out" at you. Monitor calibrators exist for a reason.

What people forget is that we have something in our eyes called a pupil, which adjusts to ambient light. In a darkened room a dimly lit TV can appear just as bright as an HDR600 television on the showroom floor at Best Buy. It's all about the ambient lighting. This is why movie theaters are dark: so your eyes adjust to the relatively dim bulb of the projector without the screen being lit up by ambient light. Of course, ambient light doesn't need to illuminate a television screen, but at what point will TVs and monitors become TOO bright to comfortably watch in the dark?
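For reference, this is all a plain SDR calibration is doing with the signal (a toy sketch using the 180 cd/m² white level and gamma 2.2 from above; the 8-bit input is just for illustration):

Code:
# Toy sketch: what a 180 cd/m2, gamma 2.2 SDR calibration means for the signal.
WHITE_NITS = 180.0   # calibrated peak white from above
GAMMA = 2.2

def sdr_luminance(code8):
    """Map an 8-bit video code (0-255) to absolute luminance in cd/m2."""
    return WHITE_NITS * (code8 / 255.0) ** GAMMA

for code in (16, 64, 128, 192, 255):
    print(f"code {code:3d} -> {sdr_luminance(code):7.2f} cd/m2")
# Everything the display does is relative to that 180 cd/m2 white point;
# an HDR400 panel just pushes the same scale up toward ~400 cd/m2.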
 


·
Vermin Supreme 2020
Joined
·
25,768 Posts
^ One phone up in the mountains is like sitting on the couch watching a 70-inch.

also, glad to see you're alive, AWeir.

HDR is misunderstood because of poor support on the PC side, and too many SKUs/names.

HDR10, HDR400, HDR600, HDR1000, HDR9000

like seriously?

I ended up getting Dell's newest widescreen at almost twice the price ($800 on Black Friday; MSRP $1,199, selling for $999) over the ASUS TUF's cheap widescreen with HDR10, simply because of ASUS's recent QC issues and because I didn't have a clue how HDR actually works.
 

·
Registered
Joined
·
2,556 Posts
Discussion Starter · #4 ·
Because a properly calibrated monitor is around 180 cd/m². There's no need for a monitor as bright as 400 cd/m². Calibrated monitors are accurate the way they are. We don't need such extreme brightness levels. We are not operating our computers on the surface of the Sun.

180 cd/m²
Gamma 2.2

This is all humans need.


HDR ruins calibration, and so does the gimmick of reducing blue light to make it "easier" on the eyes. Calibration is about accurate colors and white balance, not colors that "pop out" at you. Monitor calibrators exist for a reason.

What people forget is that we have something in our eyes called a pupil, which adjusts to ambient light. In a darkened room a dimly lit TV can appear just as bright as an HDR600 television on the showroom floor at Best Buy. It's all about the ambient lighting. This is why movie theaters are dark: so your eyes adjust to the relatively dim bulb of the projector without the screen being lit up by ambient light. Of course, ambient light doesn't need to illuminate a television screen, but at what point will TVs and monitors become TOO bright to comfortably watch in the dark?
But did you read what I wrote?
We are not talking about contrast/brightness but about color management, which is a must-have on wide gamut monitors to avoid oversaturation problems.
 

·
Registered
Joined
·
1,264 Posts
But did you read what I wrote?
We are not talking about contrast/brightness but about color management, which is a must-have on wide gamut monitors to avoid oversaturation problems.
Then HDR is not the solution. Bash Microsoft and developers for not making color managed applications, and monitor makers for not providing an sRGB mode / locking out calibration when you use it.
 

·
Registered
Joined
·
2,048 Posts
But did you read what I wrote?
We are not talking about contrast/brightness but about color management, which is a must-have on wide gamut monitors to avoid oversaturation problems.
Why not get a colorimeter then?

HDR =/= accurate colors, and your viewing room/distance/angle affects your calibration, so the 'default' HDR settings may not be correct for me even if they are for you.
 

·
Registered
Joined
·
2,556 Posts
Discussion Starter · #7 ·
Then HDR is not the solution. Bash Microsoft and developers for not making color managed applications, and monitor makers for not providing an sRGB mode / locking out calibration when you use it.
HDR is the only solution we have right now to color manage games and movies.
Unfortunately, developers should be doing color management even outside of HDR, but that's not what they do.
 

·
Registered
Joined
·
45 Posts
HDR doesn't provide any color management; if anything, it makes things even worse on any local dimming display by completely messing up the gamma curve in order to achieve an eye-blinding brightness level.

What's worse is that a lot of HDR monitors - like my 49CRG90 - disable the color and gamma options in the OSD once HDR is enabled, so calibration is out the window. Sure, you can calibrate on the Windows end with a colorimeter, but how many games respect ICC profiles, especially in HDR?

Some wide gamut displays - like my 49CRG90 - disable every setting other than brightness in sRGB emulation mode! sRGB emulation is supposed to correct oversaturated colors for sRGB content, but if the factory calibration is slightly off you are stuck calibrating the monitor yourself in "Custom" picture mode. At least in Custom mode you can change gamma and white point on the monitor end, so the situation is not as hopeless as it is in HDR. Mind you, a lot of wide gamut displays don't even allow changing brightness in sRGB mode (looking at you, Philips)!

Honestly, I am starting to believe that any HDR outside of OLED is just garbage, regardless of max brightness or number of dimming zones. Unfortunately, using an OLED as a desktop monitor for anything outside of gaming is not an option.
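For anyone wondering what "gamma" even means once HDR is on: HDR10 doesn't use a 2.2 power curve at all, it uses the PQ curve (SMPTE ST 2084). A quick sketch with the published constants (the tone mapping comment at the end is my own read on what these monitors do):

Code:
# Sketch of the PQ (SMPTE ST 2084) EOTF that HDR10 uses instead of gamma 2.2.
# A monitor that can't reach the encoded luminance has to tone map or clip.
m1 = 2610 / 16384          # 0.1593...
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal):
    """PQ signal (0..1) -> absolute luminance in cd/m2 (0..10000)."""
    e = signal ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

for s in (0.25, 0.5, 0.65, 0.75, 1.0):
    print(f"PQ {s:.2f} -> {pq_eotf(s):8.1f} cd/m2")
# PQ ~0.65 already corresponds to ~400 cd/m2, so everything above that point
# on an HDR400 panel is left to the monitor's internal tone mapping.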
 

·
Registered
Joined
·
2,556 Posts
Discussion Starter · #9 · (Edited)
HDR doesn't provide any color management; if anything, it makes things even worse on any local dimming display by completely messing up the gamma curve in order to achieve an eye-blinding brightness level.

What's worse is that a lot of HDR monitors - like my 49CRG90 - disable the color and gamma options in the OSD once HDR is enabled, so calibration is out the window. Sure, you can calibrate on the Windows end with a colorimeter, but how many games respect ICC profiles, especially in HDR?

Some wide gamut displays - like my 49CRG90 - disable every setting other than brightness in sRGB emulation mode! sRGB emulation is supposed to correct oversaturated colors for sRGB content, but if the factory calibration is slightly off you are stuck calibrating the monitor yourself in "Custom" picture mode. At least in Custom mode you can change gamma and white point on the monitor end, so the situation is not as hopeless as it is in HDR. Mind you, a lot of wide gamut displays don't even allow changing brightness in sRGB mode (looking at you, Philips)!

Honestly, I am starting to believe that any HDR outside of OLED is just garbage, regardless of max brightness or number of dimming zones. Unfortunately, using an OLED as a desktop monitor for anything outside of gaming is not an option.
I don't agree. HDR content targets the DCI-P3 color space, so you don't need to "correct" colors on your wide gamut monitor to keep them from looking oversaturated.
Pretty much every HDR monitor shows oversaturated colors in non color managed software, but not when HDR is enabled on both the content and the monitor (and, obviously, in the software that plays the content and in the OS).

For me HDR is a must-have on wide gamut monitors; I don't care if it's OLED, HDR400 or HDR1000.
 

·
Registered
Joined
·
45 Posts
I don't agree. HDR content targets the DCI-P3 color space, so you don't need to "correct" colors on your wide gamut monitor to keep them from looking oversaturated.
Pretty much every HDR monitor shows oversaturated colors in non color managed software, but not when HDR is enabled on both the content and the monitor (and, obviously, in the software that plays the content and in the OS).

For me HDR is a must-have on wide gamut monitors; I don't care if it's OLED, HDR400 or HDR1000.
I am not specifically trying to correct colors in HDR (I know HDR targets DCI-P3) but the gamma curve. Good luck trying to find a local dimming display (FALD or not) with a proper gamma curve in HDR. HDR doesn't have an oversaturated color problem; the problem is the opposite: it has a washed-out color problem. Washed-out colors are due to very high gamma values.

Oversaturated colors on wide gamut displays are a problem when NOT in HDR mode and trying to view sRGB content. Now, in an ideal world every app we use would be color managed, but in the real world only professional graphics and editing software is. Thus we have to hope for a properly calibrated sRGB emulation mode on the monitor itself. That is hit and miss: not every monitor is properly calibrated, and many wide gamut monitors don't even have an sRGB emulation mode (even high end models such as the LG 34gk950g).
 

·
Registered
Joined
·
2,556 Posts
Discussion Starter · #11 ·
I am not specifically trying to correct colors in HDR (I know HDR targets DCI-P3) but the gamma curve. Good luck trying to find a local dimming display (FALD or not) with a proper gamma curve in HDR. HDR doesn't have an oversaturated color problem; the problem is the opposite: it has a washed-out color problem. Washed-out colors are due to very high gamma values.

Oversaturated colors on wide gamut displays are a problem when NOT in HDR mode and trying to view sRGB content. Now, in an ideal world every app we use would be color managed, but in the real world only professional graphics and editing software is. Thus we have to hope for a properly calibrated sRGB emulation mode on the monitor itself. That is hit and miss: not every monitor is properly calibrated, and many wide gamut monitors don't even have an sRGB emulation mode (even high end models such as the LG 34gk950g).
Honestly, I'm not experiencing what you are talking about.
HDR content looks perfect on my HDR400 Acer Nitro XV273: games look perfect, movies look perfect.

If I don't enable HDR, that content shows very oversaturated images, like red faces on characters and so on.
With HDR on, colors look more natural and correct, just as if I were inside a color managed environment with color-aware content.
 

·
Overclocker
Joined
·
11,688 Posts
Some monitors can be truly oversaturated when the gamut is too wide, and having the game/application output HDR could help resolve that, provided its tonemapping is any better than its SDR tonemapping; for both of these they often offer no user adjustment. On top of that, you have to deal with the HDR tonemapping in the monitor itself, which is often quite lackluster and clips all highlights above its max nits.

sRGB mode on most monitors is a total joke, and only pro monitors that are calibrated at the factory or allow user calibration actually have any sensible quality for such color profiles.

People bash on it because they finally want 1M:1 contrast with 1000 nits and a move away from this single layer LCD junk we've been stuck on for decades. So marketing something as "HDR" when it has 1000:1 contrast and 350 nits is kind of a joke; all it means is that it accepts HDR input, while its output is often only wider gamut SDR.
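Rough idea of what that clipping means in numbers (just a toy comparison between a hard clip and a simple highlight roll-off; the 1000 nit master and the knee value are made-up assumptions, not any specific monitor's tonemapping):

Code:
# Toy comparison: hard clipping vs. a simple highlight roll-off when a scene
# encoded up to 1000 nits lands on a panel that can only do ~350 nits.
PEAK_SCENE = 1000.0   # nits the content was mastered for (assumption)
PEAK_PANEL = 350.0    # what the marketing-"HDR" panel can actually show

def hard_clip(nits):
    return min(nits, PEAK_PANEL)

def rolloff(nits, knee=0.75):
    """Keep everything below knee*peak linear, compress the rest smoothly."""
    start = knee * PEAK_PANEL
    if nits <= start:
        return nits
    # asymptotically approach the panel peak instead of flat-lining
    excess = nits - start
    room = PEAK_PANEL - start
    return start + room * excess / (excess + room)

for scene in (100, 300, 500, 800, 1000):
    print(f"{scene:5.0f} nits in -> clip {hard_clip(scene):5.1f} / "
          f"roll-off {rolloff(scene):5.1f} nits out")
# With the hard clip every highlight above 350 nits turns into the same flat
# white, which is what "clips all highlights above its max nits" looks like.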
 

·
Registered
Joined
·
106 Posts
Some monitors can be truly oversaturated when the gamut is too wide, and having the game/application output HDR could help resolve that, provided its tonemapping is any better than its SDR tonemapping; for both of these they often offer no user adjustment. On top of that, you have to deal with the HDR tonemapping in the monitor itself, which is often quite lackluster and clips all highlights above its max nits.

sRGB mode on most monitors is a total joke, and only pro monitors that are calibrated at the factory or allow user calibration actually have any sensible quality for such color profiles.

People bash on it because they finally want 1M:1 contrast with 1000 nits and a move away from this single layer LCD junk we've been stuck on for decades. So marketing something as "HDR" when it has 1000:1 contrast and 350 nits is kind of a joke; all it means is that it accepts HDR input, while its output is often only wider gamut SDR.
A solution would be to make PC monitors that don't suck donkey balls. But I guess that is a pipe dream, considering computer monitors have sucked for this past decade. By the time a good one comes out I'll be dead.
 