sRGB piece-wise EOTF vs pure gamma

Thanks @jack.holm

I wish the sRGB standard had been written with such clear wording.
The reality today is slightly different: in the wild, many displays in their sRGB mode are calibrated closer to a pure power law.
So the term sRGB has become quite ambiguous.

But at least the community is becoming aware of that.

What I don’t understand though:
sRGB was written with compatibility with “Rec. 709 video” in mind,
meaning you should be able to stream an unaltered video signal to an sRGB monitor and it should just work (the EOTF mismatch compensates for the change in surround, etc.).

To my eye the compound function is not suitable for that task, whereas a pure 2.2 power-law EOTF works quite well, assuming the video content was generated on a monitor with a pure power-law 2.4 EOTF.
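For reference, the two curves in question can be sketched numerically. This is a minimal sketch using the IEC 61966-2-1 piece-wise formula and a pure power law; function names are mine:

```python
# Compare the sRGB piece-wise EOTF with a pure 2.2 power law.

def srgb_eotf(v: float) -> float:
    """IEC 61966-2-1 piece-wise decoding (signal -> linear light)."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def power_eotf(v: float, gamma: float = 2.2) -> float:
    """Pure power-law decoding."""
    return v ** gamma

for v in (0.01, 0.05, 0.18, 0.5, 1.0):
    print(f"V={v}: piece-wise={srgb_eotf(v):.5f}, 2.2 gamma={power_eotf(v):.5f}")
```

The piece-wise curve renders the shadows lighter than the pure power law does, which is exactly where the visible mismatch lives.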

Any thoughts?

The interesting thing is (correct me if I’m wrong): a display with 0.1 nit black (typical for an IPS panel at 100 nit white) calibrated to BT.1886 gives almost an exact match to the sRGB curve :slight_smile:
So basically any colorist working on a BT.1886-calibrated IPS display sees the image with brighter shadows and an overall average “gamma” closer to 2.2 instead of 2.4. This pushes the colorist to add more contrast than is needed when the result is displayed on a 2.4 power-law EOTF display. And even on a 2000:1 contrast-ratio display (I’ve never seen an IPS panel with that high a contrast ratio without zone dimming), the shadows are still too bright and the overall contrast is much lower than with a 2.4 power-law EOTF. And a 2000:1 contrast ratio is still acceptable for a Grade A display.
Then consumers watch this on their 0 nit black OLEDs, where the BT.1886 EOTF reduces to essentially a pure 2.4 power law, and see an image that is too contrasty.
I’ve seen a lot of wars about BT.1886 vs a 2.4 power law for <2000:1 contrast-ratio displays, as well as sRGB vs a 2.2 power law. So I’m curious what people in this community think about it.
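That claim can be checked numerically. This is a sketch assuming the BT.1886 Annex 1 formula with Lw = 100 nit and Lb = 0.1 nit, with outputs normalized to peak white; the two curves land close, though not identical:

```python
# BT.1886 with a 1000:1 panel (Lw = 100, Lb = 0.1) vs the sRGB piece-wise EOTF.

def bt1886_eotf(v: float, lw: float = 100.0, lb: float = 0.1,
                gamma: float = 2.4) -> float:
    """ITU-R BT.1886 Annex 1 EOTF (signal -> nits)."""
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))
    return a * max(v + b, 0.0) ** gamma

def srgb_eotf(v: float) -> float:
    """IEC 61966-2-1 piece-wise decoding (signal -> linear light)."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

for v in (0.1, 0.25, 0.5, 0.75):
    print(f"V={v}: BT.1886={bt1886_eotf(v) / 100:.4f}, sRGB={srgb_eotf(v):.4f}")
```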

To clarify: as far as I know, BT.1886 is never used with the black offset, neither in the inverse EOTF nor in the forward EOTF.
So most pipelines I know in fact use a pure power-law 2.4 “encoding” on the wire when they say 1886.
The black offset in BT.1886 is not optimal.
I edited the post above to be clearer, thanks for this.

Thanks! Basically my question was about how it is viewed (not encoded). I mean, it looks like this standard makes everyone see the image differently, in a bad way, depending on their black level.

Which standard do you mean, sRGB or ITU-R BT.1886, or both? :slight_smile:

There is nothing you can do to make a 1000:1 display match a 5000:1 display.
You can choose to make the contrast appear the same, but only by sacrificing shadow detail.
Or you keep the shadow detail and lose a bit of contrast.
I prefer the latter, and it comes for free in a relative-black system. Flare is actually a nice roll-off function.

I was talking about BT.1886. So you prefer to see more shadow detail instead of the overall contrast. Interesting, thanks! I grade with a 2.4 power law, but at the end of a grading shift I switch to BT.1886 and make a trim pass to check that the shadows are consistent and aren’t crushed.

I am afraid I am having trouble keeping up with the new posts, much less going back to read the older ones, but I will provide a few more comments from my perspective.

I don’t think BT.1886 graded content will be “perfect” on an sRGB display in the sRGB viewing environment, just reasonable. The straight line part of the sRGB EOTF was to avoid extreme slope, which caused problems in some color management systems. It also partly addresses the difference between the expected black point of 0.1 nit for 1886 and 0.2 nit for sRGB (with more veiling glare because of the higher ambient).
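The “extreme slope” point is easy to see numerically. This is my own sketch (just calculus on L^(1/2.2), not from the post): the derivative of a pure-gamma encoding is unbounded at zero, while the sRGB linear segment caps the slope at 12.92, which is what the straight-line part buys color management systems.

```python
# Slope of the encoding (inverse EOTF) near black.

def pure_encode_slope(l: float, gamma: float = 2.2) -> float:
    """d/dL of L**(1/gamma); grows without bound as L -> 0."""
    return (1 / gamma) * l ** (1 / gamma - 1)

SRGB_LINEAR_SLOPE = 12.92  # fixed slope of the sRGB piece-wise linear segment

for l in (1e-2, 1e-4, 1e-6):
    print(f"L={l:g}: pure-gamma slope={pure_encode_slope(l):.1f}, "
          f"sRGB segment slope={SRGB_LINEAR_SLOPE}")
```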

I tried for 25 years to get agreement to clarify the sRGB standard, but was always met with strong resistance. At this point it is out of my hands.

While it is true that many people assume a zero display black with BT.1886, it is only recently that this can be approximately true. Otherwise this assumption causes problems with clipping or elevation of black values when going between black-relative colorimetry and regular colorimetry.

Yes, what happened is most people sort of de-facto agreed to use black relative colorimetry, i.e. the black signal level is mapped to the display black and the 2.4 gamma is used. However, this EOTF only matches the BT1886 EOTF when the display black point is zero. The 1886 EOTF more accurately fits the mastering display measurements on which it is based. One can argue that the black-relative fixed 2.4 gamma EOTF is better at mapping the black than the 1886 EOTF, but it did not match what displays at the time were actually doing.
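The difference between the two mappings can be sketched. Assumed formulas: BT.1886 Annex 1 versus a black-relative Lb + (Lw - Lb) * V^2.4; with Lb = 0 they coincide, and with a nonzero black BT.1886 lifts the shadows more:

```python
# BT.1886 vs the de-facto black-relative 2.4 gamma on a 1000:1 display.

def bt1886(v: float, lw: float = 100.0, lb: float = 0.1,
           g: float = 2.4) -> float:
    """ITU-R BT.1886 Annex 1 EOTF (signal -> nits)."""
    a = (lw ** (1 / g) - lb ** (1 / g)) ** g
    b = lb ** (1 / g) / (lw ** (1 / g) - lb ** (1 / g))
    return a * max(v + b, 0.0) ** g

def black_relative(v: float, lw: float = 100.0, lb: float = 0.1,
                   g: float = 2.4) -> float:
    """De-facto mapping: black signal pinned to display black."""
    return lb + (lw - lb) * v ** g

for v in (0.0, 0.1, 0.5, 1.0):
    print(f"V={v}: BT.1886={bt1886(v):.3f} nit, "
          f"black-relative={black_relative(v):.3f} nit")
```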

If you don’t like what a standard prescribes, you are free to do something different, maybe even something better, but in that case it is better to say what you are doing than to claim you are following a standard when you are not, even if most people are doing the same. Doing that causes us to have to spend time on these discussions, and in some cases inhibits progress.

I could not agree more. When we try to match SDR monitors with HDR monitors, we don’t put the former in sRGB standard mode, as this is not what the average user does. Instead, we leave them at the demo-floor maximum brightness setting, which gives us an average level between 200 and 300 nits for SDR graphics white when converting to PQ (depending on the monitor we’re comparing to). The maximum brightness setting also compensates for sunlight glare at certain times of day, so it has a practical use in addition to matching what the average user is doing :slight_smile:

(emphasis added)

On this point briefly, I’m not sure there are many, or any, monitors today set at 80 nits. Mobile devices have screens up to 1200 nits. Desktop and other monitors are commonly set at 160 to 240 nits or more. At least this part of the sRGB standard is obsolete.

And to add, when developed and adopted, sRGB applied to monitors connected to Windows machines. At that time, SGI was using a 1.4 gamma monitor, and Apple was using 1.8 and would not switch to 2.2 for another ten years, about 2008. Windows was not using color management, and math co-processors were just starting to be a standard item.

The piecewise sRGB TRC is kind of like having to support NTSC pulldown in HD for years even though NTSC was long gone. In other words, how relevant is it? Certainly to unwind images encoded with the piecewise, sure. And to re-encode the image data for compatibility.

My reading of the standard is that the piecewise to linear transform is there for image processing purposes, to invert the encoding transform. Otherwise it seems clear to me that the standard states the output is to a reference monitor with a nominal 2.2 pure gamma. It also seems clear the standard states how the image data should be encoded in a file or data stream, which is using the piecewise.
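The mismatch that reading implies can be sketched with a toy calculation of my own (not from the standard): encode linear light with the piece-wise function, then let a nominal pure-2.2 display decode it.

```python
# Piece-wise-encoded signal viewed on a pure 2.2 gamma display.

def srgb_encode(l: float) -> float:
    """IEC 61966-2-1 piece-wise encoding (linear light -> signal)."""
    if l <= 0.0031308:
        return 12.92 * l
    return 1.055 * l ** (1 / 2.4) - 0.055

for l in (0.001, 0.01, 0.1, 0.5):
    displayed = srgb_encode(l) ** 2.2  # pure-2.2 display decodes the signal
    print(f"intended {l:g} -> displayed {displayed:.5f}")
```

Shadows come out darker than intended while midtones are nearly unchanged, i.e. a mild contrast boost near black.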

Importing images is one of the few places I use sRGB. I work almost exclusively in a linearized space, or a pure gamma working space in some cases. The only time I’m exporting to sRGB is for images for web content.

Monitors in the Wild

Every monitor I’ve measured uses a pure gamma curve. The IEC standard specifies a reference monitor with a 2.2 gamma curve as shown above (and in the intro as well). Manufacturers use the pure gamma curve. The NEC PA271Ws we use come with their internal LUTs configured for a 2.2 gamma; the LUTs are programmable and can be set by the user to any curve, including the piecewise, but that is not the default. The other monitors here (Samsung wide gamut and Viewsonic, among others) use a pure gamma curve. All hardware calibrated and profiled.

In a research project I’m working on regarding contrast for web content, we’ve measured several displays and devices. Sending them a test image created with a simple 2.2 gamma, or sending CSS colors, without any color management, we haven’t come across any that seem to be using the piecewise, though we do have more to test.

The IDMS standard for monitors describes measuring the gamma of monitors, and makes no mention that I found regarding adherence to using the piecewise TRC in the monitor itself.
www.sid.org/Standards/ICDM#8271483-idms-download

Workflows

The only use we have for the sRGB piecewise TRC is when importing an sRGB image, or exporting for web. We don’t use it beyond that. For print/pre-press we’ll be working in 16bit ProPhoto, AdobeRGB, and/or CMYK. For film work, our interchange format is linear float EXR, & delivery as EXR, 10bit DPX, or 16 bit TIFF, the latter two with some form of a log curve.

While we don’t do much web video, video is not usually color managed; more often it just streams to the monitor, which, as mentioned, nominally has a pure gamma curve unless calibrated and profiled to something else. That case is quite variable, one of the variables being whether the user agent does anything for video playback. On most mobile devices there is no color management, and everything is just squirted to the display, which is assumed to be sRGB… but which curve?

As new color spaces are emerging, including for future web content, I personally had hoped to see an end to piecewise TRCs, which today are about as useful as NTSC pulldown. Then Apple made the odd move to use the sRGB TRC for DisplayP3. (?!?!)

TL;DR

Recent Apple products aside, display standards and displays we’ve tested are using a pure gamma, and this is echoed in standards for monitors not to mention the IEC standard itself. Color manglement may affect that.

Just did a factory reset on the Eizo CG247X I’ve got in front of me. By default it uses the piecewise sRGB EOTF (for sRGB modes). Same was true of the Dreamcolors at my previous employer.

I see both 2.2 and piecewise in the wild, sometimes from the same manufacturer. It’s super frustrating.

Yeah, wanted to mention Eizo, we calibrate ours using the sRGB piecewise.

Yes, I do know that EIZO comes with the piecewise… and their competitor NEC comes set with simple gamma, though the Spectraview software does offer the piecewise (both the NEC app and the one they licensed from BasICColor).

And any high end monitor with internal programmable LUTs of course you can do what you will…

I think my point though is here we set all our monitors to a simple curve because:

  1. We work in linear space almost exclusively.
  2. We deliver in either some flavor of log or exr linear.

The only exception is delivering stills for web deployment, but that’s when it does not really matter, because… macOS color management and/or A Dough Bee color manglement will end up displaying them the same, since everything gets “managed” based on the hardware-calibrated profile to come out quite literally the same no matter the TRC.

So… if everyone had functional color management, we could quite literally discard the piecewise TRC, and just make sure all images are tagged with whatever space you felt like exploring on the day.

But color management is a problem for mobile devices due to overhead and power drain… so that dream is still a few years off…

…But here near the end of 2021, at least we are finally making movies at 24 flat and not 23.976…

Interesting that when I read the LUT out of my EIZO set to sRGB, I get the same curve as when it is set to 2.2. Go figure! I know we don’t program the monitor to do anything LUT-wise, as it has the ones we need built in!

23.976 - get that all the time.

Kevin

[quote=“KevinJW, post:33, topic:4024, full:true”]
Interesting that when I read the LUT out of my EIZO set to sRGB, I get the same curve as when it is set to 2.2. Go figure! I know we don’t program the monitor to do anything LUT-wise, as it has the ones we need built in![/quote]

You’re reading the internal monitor LUT? Interesting… does the EIZO software also change the ICC profile when you change the monitor setting and are you on Mac or Win?

Yeah, one of the studios I work with just in the last year or so started doing everything 24 flat. I have no doubt 23.976 lives on, if only in that special hell reserved for endlessly repeating decimals…

Fascinating thread! Thought I’d mention that you can see what the color profiles are set to on a MacBook Pro M1 by going into the custom menu in the Display settings as described here:

Here you can see that the HDR P3 displays (Apple XDR Display (P3 1600 nit), Apple Display (P3 500 nit), and HDR Video (P3-ST2084)) use a pure 2.2 power function:

This is interesting, and I remember you mentioned it on Slack. Our CG2730, upon being set to sRGB, is certainly using the piece-wise EOTF. It is both measurable and verifiable with test patterns.

Nowadays, technology allows us to calibrate our monitors to sRGB gamut. But do we need to do that?

When you calibrate your monitor to sRGB you get fewer shadow tones than you would with gamma 2.2. As a result, in an image with an embedded gamma-2.2 profile viewed on an sRGB monitor in Adobe Photoshop, the shadows will be cut off: an 8-bit RGB value of 5 5 5 can turn into 8-bit sRGB 0 0 0. Now about video. Video player apps usually do not have color management, so videos with gamma 2.2-2.4 (YouTube) will look low in contrast on an sRGB monitor, with light shadows.
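A rough sketch of the re-encode being described (illustrative only; the exact rounded codes depend on the source gamma and the rounding mode):

```python
# Re-encode 8-bit shadow codes from a gamma-2.2 source into sRGB.

def srgb_encode(l: float) -> float:
    """IEC 61966-2-1 piece-wise encoding (linear light -> signal)."""
    if l <= 0.0031308:
        return 12.92 * l
    return 1.055 * l ** (1 / 2.4) - 0.055

for code in range(0, 9):
    linear = (code / 255) ** 2.2            # decode the gamma-2.2 source
    out = round(255 * srgb_encode(linear))  # re-encode as sRGB, quantize
    print(f"gamma-2.2 code {code} -> sRGB code {out}")
```

With a 2.2 source the lowest codes collapse to 0 or 1; with a 2.4 source the crushing is somewhat stronger.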

In my opinion, a good option is to calibrate the monitor to gamma 2.2 (or 2.4), and view sRGB images using color management :slightly_smiling_face:

My opinion: the sRGB standard was not originally intended for calibrating monitors. The sRGB standard was designed to work only in systems (apps) that support color management. In that case, on a dim monitor with gamma 2.2 and a brightness of 80 cd/m2 (in the sRGB standard such a monitor was called the reference display), in a graphics program with color management enabled, any image without a color profile, or with an embedded sRGB profile, became lighter in the shadows, and shadow detail became more visible. The monitor was not bright and the shadows were very black, so one wanted to make them lighter :slight_smile:

In short, the monitor was calibrated to gamma 2.2, and all sRGB images were viewed on it only with apps supporting color management, for example Adobe Photoshop.

This is directly contradicted by what is stated clearly in the document.

Damn, I thought my opinion would solve all the problems :slight_smile:

We need more lawyers of colour to clarify what is written in this document :slight_smile: