Hey, just wanted to get people's thoughts on whether it's better to calibrate gaming monitors to 2.2 gamma, which seems to be the industry standard, or 2.4. Reason I ask is that I was doing my bi-monthly calibration and read through DisplayCal's readme, which says that for sRGB-encoded content it's better to calibrate closer to 2.4. Excerpt below:
Also note that many color spaces are encoded with, and labelled as having a gamma of approximately 2.2 (ie. sRGB, REC 709, SMPTE 240M, Macintosh OS X 10.6), but are actually intended to be displayed on a display with a typical CRT gamma of 2.4 viewed in a darkened environment.
This is because this 2.2 gamma is a source gamma encoding in bright viewing conditions such as a television studio, while typical display viewing conditions are quite dark by comparison, and a contrast expansion of (approx.) gamma 1.1 is desirable to make the images look as intended.
So if you are displaying images encoded to the sRGB standard, or displaying video through the calibration, just setting the gamma curve to sRGB or REC 709 (respectively) is probably not what you want! What you probably want to do, is to set the gamma curve to about gamma 2.4, so that the contrast range is expanded appropriately, or alternatively use sRGB or REC 709 or a gamma of 2.2 but also specify the actual ambient viewing conditions via a light level in Lux, so that an appropriate contrast enhancement can be made during calibration. If your instrument is capable of measuring ambient light levels, then you can do so.
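If I'm reading that right, the arithmetic behind it is pretty simple. Here's a quick Python sketch I put together (my own numbers and function names, nothing from the readme itself) showing how a roughly 1.1 end-to-end contrast expansion on top of 2.2 encoding lands at about 2.4:

```python
# Rough sanity check (my own sketch, not from the DisplayCAL readme):
# how a ~1.1 end-to-end "contrast expansion" relates 2.2 encoding to a 2.4 display.

def system_gamma(encode_gamma: float, display_gamma: float) -> float:
    """Net end-to-end gamma when content encoded with 1/encode_gamma
    is shown on a display with the given display gamma."""
    return display_gamma / encode_gamma

print(system_gamma(2.2, 2.2))  # ~1.00 -> image reproduced exactly as encoded
print(system_gamma(2.2, 2.4))  # ~1.09 -> roughly the 1.1 expansion the readme mentions
print(2.2 * 1.1)               # 2.42  -> which is why the suggested target is about 2.4
```

So as far as I can tell, the readme's point is that displaying at 2.4 is what produces that ~1.1 expansion for content encoded at 2.2.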
I tried 2.4 on my XB271hu, but it crushed blacks at the lower end. Calibrating to the sRGB gamma curve with ambient light set to 300 lux gave an average of about 2.4, yet the low end measured around 1.92-2.2, so no crush at all. I must say the colors look a lot richer and more saturated, but I'm not sure I'm interpreting the DisplayCal readme correctly, i.e. that it's advisable to calibrate closer to 2.4, even though monitor review sites all calibrate to an average of 2.2.
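For what it's worth, my guess at why pure 2.4 crushed shadows while the sRGB target didn't is that the sRGB transfer function has a linear segment near black instead of following the power curve all the way down. Quick Python comparison below (my own sketch, using the standard sRGB decoding formula; the input values are just ones I picked):

```python
# Comparing shadow output of a pure 2.4 power curve vs. the piecewise sRGB curve,
# which switches to a linear segment near black. This is my own sketch of why
# a straight 2.4 target can look crushed in the shadows while an sRGB-based
# target measures closer to gamma ~1.9-2.2 at the low end.

def pure_power(v: float, gamma: float = 2.4) -> float:
    return v ** gamma

def srgb_eotf(v: float) -> float:
    # Standard sRGB decoding: linear below 0.04045, power segment above.
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

for v in (0.02, 0.05, 0.10, 0.20):
    print(f"input {v:.2f}: pure 2.4 -> {pure_power(v):.5f}, sRGB -> {srgb_eotf(v):.5f}")
```

The pure 2.4 curve pushes the darkest steps much closer to zero than the sRGB curve does, which would match the crush I saw.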
Am I totally wrong in thinking my calibration should aim closer to 2.4 rather than 2.2?