sRGB piece-wise EOTF vs pure gamma

And this happens to be precisely as the specification prescribes, when read closely.

I’d point your attention to Mr. Motta’s research and contributions if the document itself were not enough.

And for the record, if one reads the specification and historicizes the document relative to BT.709:

1: “two parts”.

  1. Difference between “computer display” and “RGB spaces”.
  2. Discussion of “overhead” with managed systems.
  3. Clear discussion of the display colourimetry.
  4. Direct reference to the absence of the display colourimetry / characteristics in reference to BT.709.
  5. Discussion of the “old school” compatibility of an encoding through to a display.

In fact, understanding how the ICC protocol is implemented as the “new school” or “managed” component referenced in the standard makes it pretty clear that the intention is direct code value to display, with a discrepancy between the encoding and the display.

In a modern system where the “overhead” is removed, the management would happen in the middle layer now with direct stimulus in and out being on either side. However, historically, it makes perfect sense, and in fact only makes sense, that the encoding differs from the EOTF to achieve the noise suppression.

2 Likes

To be fair, the Colour LCD ICC profile has two tone curves attached: an inward-facing one, which is the compound function, and a Display Tone Curve, which is a pure 2.2 power law.

And I always make my test patterns in linear light, with 0.0 in the checkerboard:
0.25 solid with a 0.5 / 0.0 checkerboard
0.5 solid with a 1.0 / 0.0 checkerboard
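
For reference, a minimal sketch of how such a pattern could be built (patch sizes and cell size are my own choices; values are linear-light floats with no encoding applied). If the chain is linear end to end, each checkerboard should visually average to its solid neighbour:

```python
import numpy as np

def checker(size, a, b, cell=1):
    """size x size checkerboard alternating linear-light values a and b."""
    ys, xs = np.indices((size, size))
    return np.where(((ys // cell) + (xs // cell)) % 2 == 0, a, b).astype(np.float32)

def patch_pair(solid, hi, lo, size=256):
    """A solid linear value next to the checkerboard that should average to it."""
    left = np.full((size, size), solid, dtype=np.float32)
    return np.hstack([left, checker(size, hi, lo)])

# 0.25 solid vs a 0.5 / 0.0 checker; 0.5 solid vs a 1.0 / 0.0 checker
pattern = np.vstack([patch_pair(0.25, 0.5, 0.0),
                     patch_pair(0.50, 1.0, 0.0)])
```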

1 Like

From a purely pragmatic VFX perspective, in a world where artists are working remotely and the display chain can’t realistically be managed end-to-end (without shipping out hardware), I find it safer to use pure power inverse EOTFs, due in no small part to the fact that we’re invariably delivering content intended for theatrical environments / broadcast. It’s far from ideal, but in a worst-case scenario, I’d rather have artists complain that their blacks are lifted while they’re working, than complain that their blacks are crushed after they deliver – it’s sort of like Type I vs Type II error (false positives vs false negatives).

(And if I’m being totally honest, when in doubt, I tend to mimic whatever FilmLight is doing, absent a reason not to…!)

A little bummed to discover Display P3 uses sRGB (~2.2). Out of curiosity, what do you guys tend to calibrate your wide-gamut desktop displays to?

These days, Display P3 at 100 and 200 nits; we also have sRGB at 100 and 200 nits.

1 Like

@jack.holm: Hey Jack, as I see that you are here, would you mind chiming in on the sRGB 2.2 vs piecewise thing, if you think there is anything else to add besides your quoted message above?

Cheers,

Thomas

Hi Thomas and all,

It might take me a while to get caught up on 178 posts!

I stand by what I said in my email that was posted, although I am no longer Technical Secretary of IEC/TC 100/TA2. That is one of the positions I gave up as I moved into semi-retirement.

I can add a few comments before reading the posts:

  1. sRGB is for computer displays that are calibrated using a LUT in the video card, or color managed using ICC color management, so there is no need for or benefit of a pure gamma EOTF.
  2. The sRGB ICC profiles have always used the two-part EOTF.
  3. During the time CRTs were used as computer displays, accuracy and uniformity were limited, but by current standards the difference between the sRGB EOTF and gamma 2.2 is large, reaching up to 9 counts in 8-bit (a quick check of this figure follows the list).
  4. A lot of color encodings are called sRGB that are not really sRGB. My suggestion would be if it is not exactly sRGB don’t call it sRGB, just say what it is, e.g. 200 nit peak white gamma 2.2 EOTF 709 primaries (or whatever).
  5. In most cases SDR video displays should be calibrated to the ITU-R BT.1886 EOTF.
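
A quick way to reproduce that figure (a sketch; the function name is mine, sampling the linear domain densely and comparing the two encodings in 8-bit counts):

```python
import numpy as np

def srgb_encode(L):
    """Piecewise sRGB inverse EOTF (IEC 61966-2-1)."""
    return np.where(L <= 0.0031308, 12.92 * L, 1.055 * L ** (1 / 2.4) - 0.055)

L = np.linspace(0.0, 1.0, 100001)                # dense linear-light samples
diff = 255.0 * np.abs(srgb_encode(L) - L ** (1 / 2.2))
print(diff.max(), L[diff.argmax()])              # ~8.5 counts, deep in the shadows
```
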
6 Likes

Thanks @jack.holm, great to hear it from you, and I sincerely appreciate that you took the time!

Do you think that the standard should be amended to provide explicit and definitive information about the expected EOTF of the display? As you saw, it is a topic that is all the rage, and people almost read it as a function of ambient temperature. If you think it should be amended, what is the process to get that rolling?

Cheers,

Thomas

Thanks @jack.holm

I wish the sRGB standard had been written with such clear wording.
Nowadays the reality is slightly different: in the wild, many displays are calibrated closer to a pure power law in their sRGB mode.
So the term sRGB has become quite ambiguous.

But at least the community becomes aware of that.

What I don’t understand though:
sRGB was written with compatibility with “rec709 video” in mind.
Meaning that you should be able to stream an unaltered video signal to an sRGB monitor and it should just work (the EOTF mismatch compensates for the surround change, etc.).

To my eye the compound function is not suitable for that task, whereas the pure 2.2 power law EOTF works quite well, assuming that the video content was generated on a monitor with a pure 2.4 power law EOTF.
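
To put rough numbers on that, a sketch (assuming scene-linear content encoded with the BT.709 camera OETF and then displayed through each candidate EOTF; the printed exponent is the effective end-to-end gamma at each sample):

```python
import numpy as np

def bt709_oetf(L):
    """ITU-R BT.709 camera OETF (scene-linear -> signal)."""
    return np.where(L < 0.018, 4.5 * L, 1.099 * L ** 0.45 - 0.099)

def srgb_eotf(V):
    """Piecewise sRGB EOTF (signal -> display-linear)."""
    return np.where(V <= 0.04045, V / 12.92, ((V + 0.055) / 1.055) ** 2.4)

scene = np.array([0.002, 0.01, 0.05, 0.18, 0.5, 0.9])  # scene-linear samples
V = bt709_oetf(scene)
# note how the piecewise diverges from 2.2 mainly in the darkest samples
for name, out in [("sRGB piecewise", srgb_eotf(V)),
                  ("pure 2.2      ", V ** 2.2),
                  ("pure 2.4      ", V ** 2.4)]:
    print(name, np.round(np.log(out) / np.log(scene), 3))
```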

Any thoughts?

2 Likes

Interesting thing is (correct me if I’m wrong): a display with 0.1 nit black (typical for IPS with 100 nit white) calibrated to 1886 will give an exact match with the sRGB curve :slight_smile:
So basically any colorist who is working on a 1886-calibrated IPS display sees the image with brighter shadows and an overall average “gamma” closer to 2.2 instead of 2.4. This pushes the colorist to add more contrast than is needed when the image is displayed on a 2.4 power law EOTF display. And even on a display with a 2000:1 contrast ratio (I’ve never seen an IPS display with that high a contrast ratio without zone dimming), the shadows are still too bright and the overall contrast is much lower than with a 2.4 power law EOTF. And a 2000:1 contrast ratio is still acceptable for a Grade A display.
Then consumers watch this on their 0 nit black OLEDs, where the 1886 EOTF gives basically a 2.4 power law EOTF, and see an image that is too contrasty.
I’ve seen a lot of wars about 1886 vs 2.4 power law for <2000:1 contrast ratio displays, as well as sRGB vs 2.2 power law. So I’m curious what people in this community think about it.
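
A sketch to check this (function names are mine; BT.1886 per its Annex 1 formula with 100 nit white and 0.1 nit black, compared against the sRGB EOTF and a pure 2.2):

```python
import numpy as np

def bt1886(V, Lw=100.0, Lb=0.1, g=2.4):
    """ITU-R BT.1886 reference EOTF, output in nits."""
    a = (Lw ** (1 / g) - Lb ** (1 / g)) ** g
    b = Lb ** (1 / g) / (Lw ** (1 / g) - Lb ** (1 / g))
    return a * np.maximum(V + b, 0.0) ** g

def srgb_eotf(V):
    """Piecewise sRGB EOTF (IEC 61966-2-1)."""
    return np.where(V <= 0.04045, V / 12.92, ((V + 0.055) / 1.055) ** 2.4)

V = np.linspace(0.0, 1.0, 21)
rel = bt1886(V) / 100.0                    # normalised to peak white
print(np.max(np.abs(rel - srgb_eotf(V))))  # ~0.002: close, but not exact
print(np.max(np.abs(rel - V ** 2.2)))      # ~0.004: also close in the mids
```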

To clarify: BT.1886 is never used with the black offset, neither in the inverse EOTF nor in the forward EOTF, AFAIK.
So most pipelines I know in fact use a pure 2.4 power law “encoding” on the wire when they say 1886.
The black offset in 1886 is not optimal.
I edited the post above to be clearer, thanks for this.

Thanks! Basically my question was about how it is viewed (not encoded). I mean, it looks like this standard makes everyone see the image differently, in a bad way, depending on their black level.

Which standard do you mean, sRGB or ITU-R BT.1886, or both? :slight_smile:

There is nothing you can do to make a 1000:1 display match a 5000:1 display.
You can choose to make the contrast appear the same, but only by sacrificing shadow detail.
Or you keep the shadow detail and lose a bit of contrast.
I prefer the latter, and it comes for free in a relative-black system. Flare is actually a nice roll-off function.

1 Like

I was talking about 1886. So you prefer to see more shadow detail instead of the overall contrast. Interesting, thanks! I grade with a 2.4 power law, but at the end of a grading shift I switch to 1886 and make a trim pass to check that the shadows are consistent and aren’t crushed.

I am afraid I am having trouble keeping up with the new posts, much less going back to read the older ones, but I will provide a few more comments from my perspective.

I don’t think BT.1886 graded content will be “perfect” on an sRGB display in the sRGB viewing environment, just reasonable. The straight line part of the sRGB EOTF was to avoid extreme slope, which caused problems in some color management systems. It also partly addresses the difference between the expected black point of 0.1 nit for 1886 and 0.2 nit for sRGB (with more veiling glare because of the higher ambient).

I tried for 25 years to get agreement to clarify the sRGB standard, but was always met with strong resistance. At this point it is out of my hands.

While it is true that many people assume a zero display black with BT.1886, it is only recently that this can be approximately true. Otherwise this assumption causes problems with clipping or elevation of black values when going between black-relative colorimetry and regular colorimetry.

Yes, what happened is most people sort of de-facto agreed to use black relative colorimetry, i.e. the black signal level is mapped to the display black and the 2.4 gamma is used. However, this EOTF only matches the BT1886 EOTF when the display black point is zero. The 1886 EOTF more accurately fits the mastering display measurements on which it is based. One can argue that the black-relative fixed 2.4 gamma EOTF is better at mapping the black than the 1886 EOTF, but it did not match what displays at the time were actually doing.
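
To make the distinction concrete, here is a sketch (illustrative display numbers) of the two mappings side by side:

```python
import numpy as np

def bt1886(V, Lw, Lb, g=2.4):
    """ITU-R BT.1886: the black offset sits inside the exponent."""
    a = (Lw ** (1 / g) - Lb ** (1 / g)) ** g
    b = Lb ** (1 / g) / (Lw ** (1 / g) - Lb ** (1 / g))
    return a * np.maximum(V + b, 0.0) ** g

def black_relative(V, Lw, Lb, g=2.4):
    """De-facto practice: a pure 2.4 power law scaled between display black and white."""
    return Lb + (Lw - Lb) * V ** g

V = np.linspace(0.0, 1.0, 11)
print(bt1886(V, 100.0, 0.05))          # nits
print(black_relative(V, 100.0, 0.05))  # identical only when Lb == 0
```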

If you don’t like what a standard prescribes, you are free to do something different, maybe even something better, but in that case it is better to say what you are doing than to claim you are following a standard when you are not, even if most people are doing the same. Doing that causes us to have to spend time on these discussions, and in some cases inhibits progress.

5 Likes

I could not agree more. When we try to match SDR monitors with HDR monitors, we don’t put the former in sRGB standard mode, as this is not what the average user does. Instead, we leave them in the demo-floor maximum brightness setting, which gives us an average level between 200 and 300 nits for SDR graphics white when converting to PQ (depending on the monitor we’re comparing to). The maximum brightness setting also compensates for sunlight glare at certain times of day, so it has a practical use in addition to matching what the average user is doing :slight_smile:
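
Conceptually, the mapping looks something like the sketch below (illustrative numbers: we assume the SDR panel behaves as a pure 2.2 power law, and `sdr_white_nits` stands in for whatever the demo-floor setting actually measures):

```python
import numpy as np

def pq_inverse_eotf(nits):
    """SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> PQ signal."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    Y = np.asarray(nits, dtype=np.float64) / 10000.0
    return ((c1 + c2 * Y ** m1) / (1 + c3 * Y ** m1)) ** m2

sdr_white_nits = 250.0                 # made-up demo-floor measurement
V_sdr = np.linspace(0.0, 1.0, 5)       # SDR signal values
nits = sdr_white_nits * V_sdr ** 2.2   # assumed pure 2.2 panel behaviour
print(pq_inverse_eotf(nits))           # corresponding PQ code values (~0.60 at white)
```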

On this point briefly, I’m not sure there are many, or any, monitors today set at 80 nits. Mobile devices have screens up to 1200 nits. Desktop and other monitors are commonly set at 160 to 240 nits or more. At least this part of the sRGB standard is obsolete.

And to add, when developed and adopted, sRGB applied to monitors connected to Windows machines. At that time, SGI was using a 1.4 gamma monitor, and Apple was using 1.8 and would not switch to 2.2 for another ten years, about 2008. Windows was not using color management, and math co-processors were just starting to be a standard item.

The piecewise sRGB TRC is kind of like having to support NTSC pulldown in HD for years even though NTSC was long gone. In other words, how relevant is it? Certainly to unwind images encoded with the piecewise, sure. And to re-encode the image data for compatibility.

My reading of the standard is that the piecewise to linear transform is there for image processing purposes, to invert the encoding transform. Otherwise it seems clear to me that the standard states the output is to a reference monitor with a nominal 2.2 pure gamma. It also seems clear the standard states how the image data should be encoded in a file or data stream, which is using the piecewise.

Importing images is one of the few places I use sRGB. I work almost exclusively in a linearized space, or a pure gamma working space in some cases. The only time I’m exporting to sRGB is for images for web content.

Monitors in the Wild

Every monitor I’ve measured uses a pure gamma curve. The IEC standard specifies a reference monitor with a 2.2 gamma curve as shown above (and in the intro as well). Manufacturers use the pure gamma curve. The NEC PA271Ws we use come with their internal LUTs configured with a 2.2 gamma; the LUTs are programmable and can be set by the user to any curve, including the piecewise, but that is not the default. The other monitors here (Samsung wide-gamut and ViewSonic, among others) are a pure gamma curve as well. All hardware calibrated and profiled.

In a research project I’m working on regarding contrast for web content, we’ve measured several displays and devices, sending them a test image created with a simple 2.2 gamma, or sending CSS colors, without any color management. We haven’t come across any that seem to be using the piecewise, though we do have more to test.

The IDMS standard for monitors describes measuring the gamma of monitors, and makes no mention that I could find of adherence to the piecewise TRC in the monitor itself.
Unlink: www sid org/Standards/ICDM#8271483-idms-download

Workflows

The only use we have for the sRGB piecewise TRC is when importing an sRGB image, or exporting for web. We don’t use it beyond that. For print/pre-press we’ll be working in 16-bit ProPhoto, AdobeRGB, and/or CMYK. For film work, our interchange format is linear float EXR, with delivery as EXR, 10-bit DPX, or 16-bit TIFF, the latter two with some form of log curve.

While we don’t do much web video, video is not usually color managed; more often it just streams to the monitor, which, as mentioned, is nominally a pure gamma curve unless calibrated and profiled to something else. That case is quite variable, one of the variables being whether the user agent does anything for video playback. On most mobile devices there is no color management, and everything is just squirted to the display, which is assumed to be sRGB… but the curve?

As new color spaces are emerging, including for future web content, I personally had hoped to see an end to piecewise TRCs, which today are about as useful as NTSC pulldown. Then Apple made the odd move to use the sRGB TRC for DisplayP3. (?!?!)

TL;DR

Recent Apple products aside, display standards and displays we’ve tested are using a pure gamma, and this is echoed in standards for monitors not to mention the IEC standard itself. Color manglement may affect that.

4 Likes

Just did a factory reset on the Eizo CG247X I’ve got in front of me. By default it uses the piecewise sRGB EOTF (for sRGB modes). Same was true of the Dreamcolors at my previous employer.

I see both 2.2 and piecewise in the wild, sometimes from the same manufacturer. It’s super frustrating.

3 Likes

Yeah, wanted to mention Eizo, we calibrate ours using the sRGB piecewise.

1 Like

Yes, I do know that EIZO comes with the piecewise… and their competitor NEC comes set with simple gamma, though the Spectraview software does offer the piecewise (both the NEC app and the one they licensed from BasICColor).

And any high end monitor with internal programmable LUTs of course you can do what you will…

I think my point though is here we set all our monitors to a simple curve because:

  1. We work in linear space almost exclusively.
  2. We deliver in either some flavor of log or linear EXR.

The only exception is delivering stills for web deployment, but that’s when it does not really matter, because… macOS color management and/or A Dough Bee color manglement will “manage” everything based on the hardware-calibrated profile, so it comes out quite literally the same no matter the TRC.

So… if everyone had functional color management, we could quite literally discard the piecewise TRC, and just make sure all images are tagged with whatever space you felt like exploring on the day.

But color management is a problem for mobile devices due to overhead and power drain… so that dream is still a few years off…

…But here near the end of 2021, at least we are finally making movies at 24 flat and not 23.976…

Interesting that when I read the LUT out of my EIZO set to sRGB, I get the same curve as when it is set to 2.2, go figure. I know we don’t program the monitor to do anything LUT-wise, as it has the ones we need built in!

23.976 - get that all the time.

Kevin

1 Like