
2.2 or 2.4 Gamma Calibration

#1 ·
Hey, just wanted to get people's thoughts on whether it's better to calibrate gaming monitors to 2.2 gamma, which seems to be the industry standard, or to 2.4. Reason I ask is that I was doing my bi-monthly calibration and read through DisplayCal's readme, which states that for the sRGB colorspace it's better to calibrate closer to 2.4. Excerpt below:

Also note that many color spaces are encoded with, and labelled as having a gamma of approximately 2.2 (ie. sRGB, REC 709, SMPTE 240M, Macintosh OS X 10.6), but are actually intended to be displayed on a display with a typical CRT gamma of 2.4 viewed in a darkened environment.
This is because this 2.2 gamma is a source gamma encoding in bright viewing conditions such as a television studio, while typical display viewing conditions are quite dark by comparison, and a contrast expansion of (approx.) gamma 1.1 is desirable to make the images look as intended.
So if you are displaying images encoded to the sRGB standard, or displaying video through the calibration, just setting the gamma curve to sRGB or REC 709 (respectively) is probably not what you want! What you probably want to do, is to set the gamma curve to about gamma 2.4, so that the contrast range is expanded appropriately, or alternatively use sRGB or REC 709 or a gamma of 2.2 but also specify the actual ambient viewing conditions via a light level in Lux, so that an appropriate contrast enhancement can be made during calibration. If your instrument is capable of measuring ambient light levels, then you can do so.
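
As I read it, the arithmetic works out like this. A rough sketch in Python (the 0.5 test value is just an arbitrary mid-grey of my own choosing):

Code:

# If content is encoded for ~2.2 but shown on a ~2.4 display, the net
# ("system") gamma is 2.4/2.2 ~= 1.09 -- the ~1.1 contrast expansion
# the readme talks about.
encode_gamma = 2.2             # source encoding (sRGB/Rec.709 are roughly this)
display_gamma = 2.4            # dim-room, CRT-style display

scene = 0.5                    # arbitrary mid-grey, linear light
encoded = scene ** (1 / encode_gamma)   # camera/encoder side -> ~0.73
displayed = encoded ** display_gamma    # display side        -> ~0.47

# Same number both ways: (s^(1/eg))^dg == s^(dg/eg)
print(displayed, scene ** (display_gamma / encode_gamma))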


I tried 2.4 on my XB271hu but it crushed blacks at the lower end, whereas calibrating to sRGB gamma with ambient light set to 300 lux gave an average of about 2.4 with near-black gamma around 1.92-2.2, so no crush at all. I must say the colors do look a lot richer and more saturated, but I'm not sure if I'm interpreting the DisplayCal readme correctly, i.e. that it's advisable to calibrate closer to 2.4, despite all the monitor review sites calibrating to an average of 2.2.

Am I totally wrong in thinking my calibration should be aiming for closer to 2.4, not 2.2?
 
#2 · (Edited)
2.4 is the "best" all-round gamma for consuming content, but most LCDs haven't been able to display it well due to bad contrast, which has led to 2.2 being more commonly used.
On the PG27UQ I use 2.4. On the PG279Q I used 2.2 due to trash contrast. If you can stand 2.4 gamma on the XB271hu without being bothered by glow/grey blacks, go for it (most people would prefer 2.2 gamma on a low-contrast monitor, but it is subjective, so decide for yourself).
 
#4 ·
2.2

I use 2.2 on all displays, including my OLED C7. I go to the Lagom black level test and check that I can see all the blocks. On my OLED, block 1 is pretty much perfect black so I can't really see it, but I can see all of blocks 2-20 only if I'm on gamma 2.2. BT.1886 or gamma 2.4 is too dark, and gamma 1.9 is washed out.
 
#5 ·
This article may be worth perusing.

https://www.cambridgeincolour.com/tutorials/gamma-correction.htm

The idea is that gamma offset/correction is an attempt to compensate for the difference between how cameras capture light and how humans perceive it. In theory, if a file was created on a system corrected for 2.2, then that's probably how people were intended to view it. It's also mentioned that CRTs had a native gamma of around 2.5, and that at least some monitors calibrated to 2.2 out of the box. But in the end it's up to you and your eyes, since the point of gamma was to correct for a disparity between machine and human vision, and 2.2 is just the value the industry settled on.
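
To make the encode/decode idea concrete, here's a minimal Python sketch (the test values and the 8-bit quantization are my own illustration, not from the article):

Code:

# Gamma encoding spends more of the 8-bit code range on dark tones,
# where human vision is most sensitive; the display's decode gamma
# undoes it, so the round trip is nearly transparent.
def encode(linear, gamma=2.2):
    return round((linear ** (1 / gamma)) * 255)   # to an 8-bit code value

def decode(code, gamma=2.2):
    return (code / 255) ** gamma                  # back to linear light

for linear in (0.001, 0.01, 0.1, 0.5, 1.0):
    code = encode(linear)
    print(f"linear {linear:>5} -> code {code:>3} -> decoded {decode(code):.4f}")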
 
#6 ·
Not to necro a thread, but this comes up in searches:

The technically correct calibration is sRGB, which isn't exactly 2.2 or 2.4. It has a little "fix up" at the darkest levels, where it's linear, then uses a 2.4 power curve (with an offset) the rest of the way. Through most of the range the brightness lines up very closely with a 2.2 gamma (much closer than 2.4). The difference between an sRGB image and 2.2 is really only visible in the darkest, near-black parts of the image.
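
For reference, here's the sRGB decode curve next to pure 2.2 and 2.4 power laws, a quick Python sketch using the constants from the sRGB spec:

Code:

# sRGB decode vs pure power curves (constants from the sRGB spec)
def srgb_eotf(v):
    if v <= 0.04045:
        return v / 12.92                     # linear "fix up" near black
    return ((v + 0.055) / 1.055) ** 2.4      # offset 2.4 power curve

for v in (0.02, 0.10, 0.25, 0.50, 0.75):
    print(f"{v:.2f}  sRGB={srgb_eotf(v):.5f}  2.2={v**2.2:.5f}  2.4={v**2.4:.5f}")

At 0.5 the sRGB result (~0.214) sits right next to the 2.2 power law (~0.218), while at 0.02 it's several times brighter than either power curve, which is exactly the near-black difference described above.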

However, not all content was converted to proper sRGB. If it was, and you use 2.2 (or 2.4), you'll crush blacks. If it wasn't, and you use proper sRGB, the dark areas look washed out.

So you really can't win. But the best-quality games, which use sRGB buffers to render in linear light, will look best with a proper sRGB calibration. Most calibration software should have an sRGB option; use it to get the best from content that was intended to be correct. Use 2.2 for a better average experience with all kinds of content, since a little black crush is not as offensive as washout. Some people like 2.4 better, but it's more of a special effect that stretches the contrast, not a more accurate representation of the content.
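
To illustrate, a rough sketch (assuming the standard sRGB encode/decode formulas; the scene values are arbitrary test points) of what a linear-light render looks like through an sRGB-calibrated display versus a pure 2.2 one:

Code:

# A game renders in linear light; an sRGB framebuffer applies the
# sRGB encode on write. What you see depends on the display's decode.
def srgb_oetf(linear):
    if linear <= 0.0031308:
        return linear * 12.92
    return 1.055 * linear ** (1 / 2.4) - 0.055

def srgb_eotf(v):
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

for scene in (0.002, 0.01, 0.05, 0.2):
    v = srgb_oetf(scene)                  # value stored in the sRGB buffer
    print(f"scene {scene:<5}  sRGB display {srgb_eotf(v):.5f}  "
          f"2.2 display {v ** 2.2:.5f}")

The sRGB-calibrated display reproduces the scene values exactly, and the two displays agree closely by mid-grey, but the 2.2 display renders the darkest values several times darker, which is the black crush mentioned above.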