Gamut Compression - Checking In!

Hi all,

During our meeting today, we ended up discussing a couple of key topics, and we thought we’d take the opportunity to do a bit of a summary post, separate from our meeting recaps, to chat about where we are, the workflow decisions made thus far, etc. We want to make sure we’re still aligned on implementation plans, as we are well into writing our User Guide and starting the Implementation Guide.

The discussion centered on three main points, in my opinion, though there was certainly a lot going on. Feel free to comment and add things if I’ve missed anything.

  1. Should VFX Pulls have the gamut compression baked in?
  2. Should EXRs or other scene-referred media with the gamut compression “baked in” be called something other than AP0?
  3. Is the default of “always on” in the viewing pipeline too heavy handed?

1 - VFX Pulls

This was definitely a topic of much discussion early on in the group. We talked about it as early as April 1st, and the notes state that the group agreed not to bake the compression into the EXR VFX pulls for a few main reasons:

  • VFX needs the flexibility of where to apply it in their comp
  • We want the inverse to be a last resort, not a norm
  • If you do need to invert the transform on a half-float EXR you receive, you might get quantization issues
  • It upholds the ACES principle of maintaining the largest data set possible for the longest amount of time
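The quantization concern above can be made concrete with a toy sketch. Assumptions: the threshold/limit/power numbers are the published red-channel values of the ACES 1.3 Reference Gamut Compression, used purely for illustration, and half-float rounding is modeled crudely for values in [1, 2):

```python
# Toy model: forward compression squeezes far out-of-gamut distances into a
# narrow band near 1.0, so half-float rounding of the stored value expands
# into a much larger error once the inverse is applied downstream.

# Illustrative per-channel parameters (published ACES 1.3 RGC red/cyan values)
THR, LIM, PWR = 0.815, 1.147, 1.2
S = (LIM - THR) / (((1 - THR) / (LIM - THR)) ** -PWR - 1) ** (1 / PWR)

def compress(dist):
    """Forward compression of a normalized gamut distance."""
    if dist < THR:
        return dist
    d = dist - THR
    return THR + d / (1 + (d / S) ** PWR) ** (1 / PWR)

def uncompress(cdist):
    """Analytic inverse of compress()."""
    if cdist < THR:
        return cdist
    c = cdist - THR
    return THR + c / (1 - (c / S) ** PWR) ** (1 / PWR)

def to_half(v):
    """Crude model of half-float rounding for values in [1, 2)."""
    step = 2.0 ** -10                # half-float mantissa step in [1, 2)
    return round((v - 1.0) / step) * step + 1.0

dist = 3.0                           # a distance far outside the gamut
cdist = compress(dist)               # compressed close to 1.0
stored = to_half(cdist)              # rounded on write to a half-float EXR
recovered = uncompress(stored)       # inverse applied downstream

print(cdist, stored, recovered)
```

In this sketch the stored value moves by less than 0.001 when rounded, but the recovered distance ends up off by roughly two orders of magnitude more, which is exactly the quantization risk the bullet describes.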

The main note here is that while we are suggesting that ACES shows almost always have the gamut compression ON in their VIEWING pipeline, we maintain the ACES philosophy of only altering the scene-referred files as late as possible (either on return from VFX, or in finishing for non-VFX shots). This gives VFX and finishing the flexibility to define their workflows as they see fit.

The downside to this, brought up in today’s meeting, is that there is added work and complexity for both VFX and finishing. It’s possible that this could be simplified by “baking” the gamut compression into VFX pulls. However, @matthias.scharfenber , @nick and I don’t believe this benefit outweighs the points made above. We are open to dissent here!

2 - Is it really AP0? Should we call it something else?

If the files returned to finishing from VFX have the gamut compression applied, they are no longer a true “round trip”, nor the full AP0 files as delivered to VFX. The architecture group discussed making the gamut compression part of a colorspace (AP2, anyone?) but decided against it for a couple of reasons:

  • We wanted to be as “light touch” as possible with the overall ACES system - to reduce implementation thrash in the short term, but also to allow for a wonderful, perfect future with improved input and output transforms, in which we no longer require the gamut compression.
  • Keeping it as an individual operator allows flexibility in VFX and finishing.

Also, though the files have been modified from their original AP1 values, this is common in VFX and finishing. We compared the gamut compression transform to, for instance, a despill operation: it definitely changes pixel values, but for the better, and you’d almost never want to undo that operation.

3 - Are we being too “heavy handed” in our “always on” recommendation?

This is definitely a possibility! We would love input here. However, when we started working through the production pipeline, we quickly realized that if used on set and in dailies, post production would have to match. And without AMF fully implemented to easily tell whether the compression was applied or not, we decided the easiest approach would be opt-out - so “on” by default.

We also recognized that if used on set, it was likely to always be on - even if you have a DIT on set, you are unlikely to have the time, or the desire, to quickly toggle the gamut compression on and off based on what you are shooting. The algorithm is designed to target out-of-gamut colors and only minimally affect those inside AP1 - but the visual consequences of NOT having the compression on are large and immediately apparent to everyone - and hard for the DP to work around. The goal is to instill confidence in ACES on set, so we have a greater chance of a solid color pipeline throughout production.
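The shape of the algorithm described above can be sketched in a few lines of Python. This is an unofficial simplification, not the normative implementation (that lives in the official CTL); the per-channel thresholds and limits are the published ACES 1.3 values, quoted here for illustration:

```python
# Per-channel thresholds and limits (cyan/red, magenta/green, yellow/blue)
# from the published ACES 1.3 Reference Gamut Compression.
THR = (0.815, 0.803, 0.880)
LIM = (1.147, 1.264, 1.312)
PWR = 1.2

def _compress(dist, thr, lim):
    """Compress one normalized distance; below the threshold it is untouched."""
    if dist < thr:
        return dist
    s = (lim - thr) / (((1 - thr) / (lim - thr)) ** -PWR - 1) ** (1 / PWR)
    d = dist - thr
    return thr + d / (1 + (d / s) ** PWR) ** (1 / PWR)

def gamut_compress(rgb):
    """Sketch of the RGC applied to an AP1-linear triplet."""
    ach = max(rgb)                    # achromatic axis = largest component
    if ach == 0.0:
        return rgb
    dist = [(ach - v) / abs(ach) for v in rgb]
    cdist = [_compress(dv, t, l) for dv, t, l in zip(dist, THR, LIM)]
    return tuple(ach - cd * abs(ach) for cd in cdist)
```

A triplet already inside the thresholds, e.g. (0.2, 0.3, 0.4), passes through untouched, while a negative component such as the red in (-0.1, 0.5, 0.6) is pulled close to the gamut boundary rather than being hard clipped.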

This turned out to be longer than I anticipated - but here we are. We appreciated the great discussion in the meeting today - tagging in @KevinJW , @joseph , @SeanCooper , @jzp and @ptr for feedback on my summary + thoughts.


Heresy! :wink:

What might be relevant to discuss here is who does the pulls. If the pulls come to the VFX facility with the mapping applied, then there is not much that can be done at that point on their end.

It would only serve to add confusion in a system that is already complex.

Probably a good recommendation to have, with an explanation as to why, i.e. the one you gave about AMF, and a note that this recommendation will be revised in the future. The future ODTs are likely to make the artefacts less offensive out of the box; at that point, if they occur at all, the compression might only be needed on a case-by-case basis.




The way I see it is that AP0 represents an encoding of the pixel values, not some ground truth of the scene. So gamut compressed AP0 is still AP0 to me, in the same way as graded AP0 is still AP0.


Even discounting dynamic range or linearity concerns, any notion of ground truth, given the dimensionality reduction imposed by motion-picture cameras, is to be taken with a pinch of salt. They are also not really designed to be measurement apparatus but to produce art, with all the biases that can be added to such an endeavour!

  1. Should VFX Pulls have the gamut compression baked in?

I don’t think so, not by convention. Agreed that not baking in the gamut compression by default would increase complexity for VFX and finishing, but… life is complex.

Baking in the reference gamut compression would also necessitate immediately inverting the compression if one wanted to use the parametric form instead (or not use compression at all, obviously). If the reference compression were baked into the view transform stack, it could be losslessly bypassed with systems like OCIO, where prepending an inverse reference gamut compression transform just prior to an “always on” forward reference gamut compression transform would optimize out to a no-op; whereas baking the reference gamut compression into a plate would “break concatenation”, so to speak.
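As a sketch of that bypass mechanism, an OCIO v2 config could pair an inverse fixed-function RGC with the “always on” forward one and let the optimizer collapse the pair to a no-op. Treat this as a hypothetical fragment: the `bypass_rgc` name is made up, and the exact transform and style spellings should be checked against the OCIO documentation:

```yaml
colorspaces:
  - !<ColorSpace>
    name: bypass_rgc            # hypothetical name, for illustration only
    from_scene_reference: !<GroupTransform>
      children:
        # An inverse immediately followed by the forward transform should
        # optimize out to a no-op when the processor is built.
        - !<FixedFunctionTransform> {style: ACES_GamutComp13, direction: inverse}
        - !<FixedFunctionTransform> {style: ACES_GamutComp13}
```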

  2. Should EXRs or other scene-referred media with the gamut compression “baked in” be called something other than AP0?

I think so. I was thinking AP0’ (“Aye-pee-oh prime”), maybe? (To me, AP2 implies an alternate set of primaries.)

I don’t really agree that having a canonical name for the gamut-compressed AP0 image state would increase conceptual complexity any more than potentially having both AP0 and gamut-compressed AP0 plates on a show. In fact, I feel not having a canonical, universally agreed-upon name to differentiate the image states would drastically increase complexity + confusion. Whether or not the name is used in practice is one thing, but I think it would be a mistake not to define shorthand that can be used verbally and visually in diagrams.

Also – and correct me if I’m wrong – I don’t think AMF has a means to indicate whether the gamut compression is baked into a plate, or if it’s part of the viewing transform stack. I could see studios using EXR metadata (internally) to indicate whether compression has been baked in to a plate, regardless of whether or not such metadata is allowed for ACES2065-1 deliverables.

  3. Is the default of “always on” in the viewing pipeline too heavy handed?

I think it depends on who’s doing the viewing, and when. “Always on” might be more appropriate for DITs and dailies grades, but I can imagine Finishing might feel differently.

AMF has an applied flag, so once AMF is universally adopted it will be possible to indicate whether the Reference Gamut Compression needs to be applied or is already baked in.


It does :slight_smile: There is an “applied” flag in the tag that is designed for this purpose.
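For illustration, such a flag might appear in an AMF roughly like this (element names paraphrased from the AMF specification, S-2019-001, and the transform ID shown is a placeholder; verify both against the actual schema):

```xml
<!-- applied="true" signals the transform is already baked into the image -->
<aces:lookTransform applied="true">
  <aces:description>ACES 1.3 Reference Gamut Compression</aces:description>
  <aces:transformId>urn:ampas:aces:transformId:v1.5:LMT.Academy.GamutCompress.a1.3.0</aces:transformId>
</aces:lookTransform>
```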


The operator does not change the image state, though; the image stays scene-referred. Unless we want to define a state for every single LMT out there, I don’t think it is feasible or practical. To push the reasoning further: an LMT that changes saturation performs some form of gamut mapping or gamut conversion, yet we would not define a new name for that “state”, nor for the infinity of “states” that exist along the saturation path.


Damn good point. Hmm. Alright, you’ve won me over.

I feel better about not having a name for “gamut-compressed AP0”, now that I realize AMF is, in fact, fully capable of describing both what’s already been applied to the plate, as well as the rest of the transforms in the viewing stack. Of course it does. That’s an aspect of AMF I had failed to appreciate, and I’m now thinking a bit differently about all this. Should’ve done my homework.

AMF is totally sufficient for communicating what needs communicating – naming the “states” between LMTs is redundant at best, and really only provides room for error.

I’ve seen the light. I’m a changed man. Carry on :slight_smile:


Ahahaha! Thank you for the questions and opinions, this is why we made this post - to check in! Appreciate you, @zachlewis!


I’ve read about AMF here on ACES Central, but I work in ACES as a colorist and have never encountered it in my work. I’ve never seen anything related to it in Resolve, either. Could you please explain a bit about how it is implemented from a user’s point of view?
I’ve never heard anything about AMF from any colorist I know, and of course there is nothing about it in the Resolve manual.
Or is it just not supported in Resolve?

AMF is not currently supported in Resolve. It is not yet supported in many places, but the major software developers are all working on it, so hopefully it will come soon.

We won’t see its full value until it is available in every ACES-supporting application.


I found that the RGC in Resolve 17.4 clips values below 0. Is this done on purpose, or is this just one more thing that is broken in Resolve?

Jed’s, Nick’s and Paul Dore’s versions all preserve negative values.

I believe Resolve clamps negative values on input, although I have no proof of this. IMO, Resolve is good at what it does, but there is still definitely a lot of room for improvement. Is there anybody on this forum who has some pull at Blackmagic to get the most annoying things fixed?

I’ve posted this bug in the thread I created on their forum, where I decided to post all the color management and color tool bugs I know of. Not all at once, of course, because that would take a day or two.


They didn’t fix this with the latest update, but they have finally fixed another bug, with a wrong white point for Canon Cinema Gamut, which I reported in the same thread in the very next message. So they have definitely seen the previous message showing the RGC bug.

I have verified that this still occurs in 17.4.1. I will check 17.4.2 as soon as I get a chance, and follow up with the Resolve team.


Hi all,

We have been working on updates to the User Guide over the last few weeks. We’d be grateful if the group could take a look and comment / provide feedback here.


It’s still there in 17.4.2 build 9 unfortunately.

The same goes for the bug where the calibration LUT is applied before the ODT (i.e. to AP0 linear) during playback when the “Hide UI overlays” option is turned on (the option that removes latency from the Decklink card). That bug has been there, I guess, since ACES was added to Resolve.

This has been fixed, and the fix will appear in the next release of Resolve - either 17.4.3 or 17.5, whichever comes first. Thanks, Anton, for pointing this out.