Notice of Meeting - ACES Gamut Mapping VWG - Meeting #26 - 8/27/2020

ACES Gamut Mapping VWG Meeting #26

Thursday, August 27, 2020
9:30am - 10:30am Pacific Time (4:30pm UTC)

Please join us for the next meeting of this virtual working group (VWG). Future meeting dates for this month include:

  • TBD

Dropbox Paper link for this group:

We will be using the same GoToMeeting url and phone numbers as in previous groups.
You may join via computer/smartphone (preferred) which will allow you to see any presentations or documents that are shared or you can join using a telephone which will be an audio only experience.

Please note that meetings are recorded and transcribed and open to the public. By participating you are agreeing to the ACESCentral Virtual Working Group Participation Guidelines.

Audio + Video
Please join my meeting from your computer, tablet or smartphone.

First GoToMeeting? Let’s do a quick system check:

Audio Only
You can also dial in using your phone.
Dial the closest number to your location and then follow the prompts to enter the access code.
United States: +1 (669) 224-3319
Access Code: 241-798-885

More phone numbers
Australia: +61 2 8355 1038
Austria: +43 7 2081 5337
Belgium: +32 28 93 7002
Canada: +1 (647) 497-9379
Denmark: +45 32 72 03 69
Finland: +358 923 17 0556
France: +33 170 950 590
Germany: +49 692 5736 7300
Ireland: +353 15 360 756
Italy: +39 0 230 57 81 80
Netherlands: +31 207 941 375
New Zealand: +64 9 913 2226
Norway: +47 21 93 37 37
Spain: +34 932 75 1230
Sweden: +46 853 527 818
Switzerland: +41 225 4599 60
United Kingdom: +44 330 221 0097

Meeting #25 - 8/20/2020

  • No objections to the “bare bones” version
  • @sdyer: some revisions to the 2065-1 standard are coming, and the CC values change at very small values (5th decimal place)
  • For default values: @matthias.scharfenber showed his Nuke implementation of deriving threshold and max distance values. We are going to merge these values into the repo as a starting place, with some noted caveats. Separate post incoming.
    • @daniele questioned 100% of camera (encoding) gamuts being pulled in as still somewhat arbitrary.
  • @Pablo_Garcia notes for Sony it should use full SGamut3 and not SGamut3.Cine
  • The clamp in the AP0 to ACEScg CTL reference implementation was brought up - @sdyer is going to look into the history of that and get back to us with possibilities around updating the reference
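For reference, the clamp in question can be illustrated with a minimal AP0 → ACEScg conversion. This is a sketch only; the matrix coefficients are the published AP0 → AP1 values quoted from memory, so verify them against the CTL before relying on them.

```python
# Sketch of the clamp discussed above: the reference AP0 -> ACEScg
# (AP1) transform clamps negative components after the matrix, which
# discards exactly the out-of-gamut values the gamut mapper wants to
# work with. Coefficients are the published AP0 -> AP1 matrix, quoted
# from memory - verify against the CTL.

AP0_TO_AP1 = (
    ( 1.4514393161, -0.2365107469, -0.2149285693),
    (-0.0765537734,  1.1762296998, -0.0996759264),
    ( 0.0083161484, -0.0060324498,  0.9977163014),
)

def ap0_to_acescg(rgb, clamp=True):
    out = tuple(sum(m * c for m, c in zip(row, rgb)) for row in AP0_TO_AP1)
    return tuple(max(c, 0.0) for c in out) if clamp else out

# A saturated AP0 value can land outside AP1; the clamp hides that:
print(ap0_to_acescg((0.1, 0.05, 0.8), clamp=False))  # negative components
print(ap0_to_acescg((0.1, 0.05, 0.8)))               # clamped to >= 0
```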

Recording and Transcript

Thanks for posting the recording and notes. That’s a brilliant demo from Matthias !

Apologies for not making the group yesterday, I had a couple of points to raise regarding Matthias’ demo:

  1. We should include some sample “stills” camera encoding primaries, as they do get used in production for both stills and moving footage.

  2. Regarding Daniele’s point about why we map 100% of the vendor gamuts: those primaries probably just fall wherever the mathematics of each vendor’s chosen test colours put them. Why not take the approach of assuming that, while we don’t know the specifics, all the different vendors used real lights and surfaces, and that the knowledge lies in the intersection of the encoding primaries? I.e. look at what happens to the polygon bounded by the encoding primary intersections.

  3. We should also throw some real display gamuts into the mix, especially P3 red.
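Point 2’s intersection idea can be sketched with plain polygon clipping in xy chromaticity space. This is illustrative only: the chromaticity values below are approximate published figures, and a real analysis would use the exact primaries from the repo.

```python
# Sketch of intersecting camera encoding gamut triangles in xy
# chromaticity to find the region all vendors' primaries enclose.
# Sutherland-Hodgman clipping of one convex polygon by another
# (vertices counter-clockwise). Chromaticities are approximate
# published values, included for illustration only.

def clip(subject, clipper):
    """Return the intersection of two convex CCW polygons."""
    def inside(p, a, b):
        # p is on or left of the directed edge a -> b.
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]) >= 0.0

    def intersect(p, q, a, b):
        # Intersection of segment pq with the infinite line through a, b.
        dx1, dy1 = q[0] - p[0], q[1] - p[1]
        dx2, dy2 = b[0] - a[0], b[1] - a[1]
        t = ((a[0] - p[0]) * dy2 - (a[1] - p[1]) * dx2) / (dx1 * dy2 - dy1 * dx2)
        return (p[0] + t * dx1, p[1] + t * dy1)

    output = list(subject)
    for i in range(len(clipper)):
        a, b = clipper[i], clipper[(i + 1) % len(clipper)]
        if not output:
            break
        polygon, output = output, []
        prev = polygon[-1]
        for cur in polygon:
            if inside(cur, a, b):
                if not inside(prev, a, b):
                    output.append(intersect(prev, cur, a, b))
                output.append(cur)
            elif inside(prev, a, b):
                output.append(intersect(prev, cur, a, b))
            prev = cur
    return output

# Approximate xy primaries (R, G, B order, which is CCW here):
alexa_wide_gamut = [(0.6840, 0.0953), (0.2281, 0.8431), (0.0861, -0.1020)]
s_gamut3 = [(0.7300, 0.2800), (0.1400, 0.8550), (0.1000, -0.0500)]
print(clip(alexa_wide_gamut, s_gamut3))  # polygon common to both gamuts
```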


Here is the Nuke script used for the demo.
(It uses several BlinkScript nodes that may not work with all GPU configurations, so there is also a version where the BlinkScript nodes are CPU-only):

model_default_values_visualisation_v03.nk (681.7 KB) model_default_values_visualisation_v03_blink_cpu_only.nk (693.4 KB)


And here is a Python notebook implementation of the same calculations:

The results match those from @matthias.scharfenber’s Nuke script to four decimal places.
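For readers following along, the core “distance” metric these implementations share can be sketched in a few lines. This is my paraphrase of the approach discussed in this group, with my own names; the Nuke script and notebook remain the reference implementations.

```python
# Sketch of the inverse-RGB-ratio distance the gamut compressor works on.

def distance(rgb):
    """Per-channel distance from the achromatic axis.

    ach is the achromatic value (the maximum of R, G, B); each
    channel's distance is (ach - c) / |ach|: 0 on the axis, 1 at the
    gamut boundary (c == 0), and > 1 for negative, out-of-gamut values.
    """
    ach = max(rgb)
    if ach == 0.0:
        return (0.0, 0.0, 0.0)
    return tuple((ach - c) / abs(ach) for c in rgb)

print(distance((0.5, 0.4, 0.1)))   # in gamut: all distances <= 1
print(distance((0.5, 0.4, -0.1)))  # negative blue -> distance > 1
```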


I’m sure y’all think I’m obsessed with spheres by now, but I just uploaded into the folder (which Scott has to review?) these lurid items from some sporting goods store, courtesy of Bill Hogan. I think somewhere the color management went awry in the preview that this page is showing me since they are way, way more saturated than what it’s showing. Half of them are illuminated by normal light; half by (as I recall) a small UV flashlight, like the ones they use to check ID at an airport. Oh, and I’ll try to branch out in terms of shape.



I’ve made some more changes to the demo Nuke script and Nick’s Python notebook implementation.

The differences are:

Nuke Script:

  • Sampling of max values for both thresholds and distance limits is now done via an expression and is updated dynamically if inputs change (no more messing about with Nuke’s dreadful pixel analyzer).
  • Distance limit calculation now uses a different method that does not rely on sampling many values any more.

model_default_values_visualisation_v04.nk (1.8 MB)

Python Notebook:

  • Threshold values are calculated based on the ColorChecker24 ACES values as described in the ACES TB-2014-004 Annex B Table to match the values used in the Nuke script.
  • Distance limit calculation uses the same method as the Nuke script.

Results from the Notebook and the Nuke script now align to 7 decimal places.
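The threshold derivation described above can be sketched as follows. The sample values here are hypothetical stand-ins; the notebook uses the actual ColorChecker24 ACES values from TB-2014-004 Annex B.

```python
# Sketch of deriving per-channel thresholds as the maximum distance
# over a reference sample set (hypothetical samples, not the real
# ColorChecker patches).

def distances(rgb):
    ach = max(rgb)
    if ach == 0.0:
        return (0.0, 0.0, 0.0)
    return tuple((ach - c) / abs(ach) for c in rgb)

def thresholds(samples):
    # Per-channel maximum distance over all reference samples.
    return tuple(max(ch) for ch in zip(*(distances(s) for s in samples)))

samples = [(0.45, 0.32, 0.27), (0.15, 0.20, 0.31), (0.39, 0.45, 0.11)]
print(thresholds(samples))
```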


The defaults in the repo are now changed to those calculated in @matthias.scharfenber’s Colab (rounded to three decimal places). This is the version that should now be used for testing, and those defaults can be used as a straw man for discussion.


As mentioned in Meeting #27, we should put them in the repo - just noting that here so that we don’t forget 🙂

Following up on Meeting #27, and extending what @matthias.scharfenber was intending to do, I converted all the Adobe DCP profiles I have to XML (over 800). After filtering out the ones that are bogus or unsuitable, I computed the limits for the Daylight Illuminant Matrices (usually ColorMatrix2) they ship with:

Distance Limits:
0.340550084103 0.803397983528 0.392630884727
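For context, limit values like these feed the compression curve: distances below a threshold pass through unchanged, and distances above it are squeezed toward the limit. The sketch below uses a simple Reinhard-style curve; the repo offers several curve choices and its parameterisation may differ, so treat this as a shape demo only, with made-up threshold/limit numbers.

```python
# Illustrative Reinhard-style compression of a distance value:
# identity up to the threshold, asymptotically approaching the limit.

def compress(d, threshold, limit):
    if d <= threshold:
        return d  # inside the protected zone: unchanged
    s = limit - threshold
    # Maps (threshold, inf) onto (threshold, limit).
    return threshold + s * (d - threshold) / (s + (d - threshold))

print(compress(0.5, 0.8, 1.4))  # below threshold: unchanged
print(compress(1.2, 0.8, 1.4))  # compressed toward the limit
print(compress(9.0, 0.8, 1.4))  # far out-of-gamut: close to the limit
```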



Hi all

I’ve been following this from a distance in admiration, and have been experimenting myself with the Resolve DCTL, finding it very useful, especially with things like neon lights, and even for manual gamut mapping from large gamuts to smaller ones like Rec.709. For me as a colourist it’s great to have flexibility in how this is done and what method is used. So thank you for the work already done! It’s great!

Looking at the work Matthias has done, is there scope for image analysis to be performed by the DCTL on a frame to define the values being used as a starting point? Forgive me if this is well outside the remit of this project or capability of what a DCTL can do!

In the meantime, I remember seeing a table showing different camera spaces and suggested values for the thresholds. Can someone remind me where that is?

It would be quite a handy resource as a starting point for the settings before customising them if needed, and it would be great to use these as default starting settings for known gamut mapping scenarios.

I have been using the shadow falloff parameter or a manual luma key to remove the effect in the shadows and from my point of view the invertibility of my gamut mapping is not needed, so I would be happy for values that ignore the out of gamut extremes found in shadow noise if that is of concern.

I’d also love to experiment with the different gamut mapping math when going from working gamuts to 709, for example, but as a lowly colourist I don’t know which parameters to start with. Although I use my eye to define most things, it would be nice to know if there is any consensus on recommended settings to objectively preserve creative intent when going from a large gamut to a smaller one using this tool for known conversions like this… sorry if I’ve missed this anywhere!

Thank you and keep up the great work!

Hello @Toby_Tomkins

This answer from Jed Smith? Is that what you are looking for?
Thanks !

Yes! That’s it! Thank you!

I see these are in RGB. Is there anything here that can be used to inform good ‘starting point’ gamut compression for these individual cameras with the existing DCTL implementation? How can these values be used to inform the CMY parameters for example?

Also I noticed in the latest version the ‘show mask’ (sp) option is gone for the shadow falloff. Can I ask why that’s been removed? Great to see separate control for each channel threshold though!

Thanks again everyone! Great work!

Hi @Toby_Tomkins.

The Google Colab linked above shows the calculations for the current default parameter values. If you want to calculate the values for a particular colour space, or set of colour spaces, just comment out (add a # at the start of the line) the unwanted colour spaces in the spaces list in the Distance Limits section and then Run All from the Runtime menu.
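The pattern Nick describes looks roughly like this. The names below are illustrative examples only; the exact list and identifiers live in the notebook itself.

```python
# Illustrative sketch of the Colab's "Distance Limits" pattern: the
# calculation iterates over a list of colour space names, so limiting
# it to particular spaces is just a matter of commenting entries out.
spaces = [
    "ARRI Wide Gamut 3",
    # "REDWideGamutRGB",  # commented out: excluded from the calculation
    "S-Gamut3",
    # "V-Gamut",
]
for name in spaces:
    print(name)  # the notebook computes per-space distance limits here
```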

The latest version (“bare bones”) of the gamut mapper removes the shadow roll-off entirely. The thinking is that once we settle on the default values for parameters, the majority of users will be able to use it as a “black box” that is just on or off. Anybody who is concerned with the requirements of invertibility or has other reasons not to apply it identically to every pixel will be a high level user, and will probably not use the stock version, but will rather use the parameterised DCTL (or other) version from the GitHub repo and tweak it to their requirements, using external qualifiers as required in their grading / compositing system.

Thanks @nick! So we can use any spaces mentioned under ‘RGB Colourspaces’ to generate the Distance Limits?

After the distance limits are calculated, what’s the consensus on the best fit for the threshold, etc.? Is that still up in the air or being left to subjective taste, or is there now consensus for the ‘bare bones’ / black-box version?

With regard to the shadow roll-off mask, I’m not referring to the bare bones version; it’s gone in the latest DCTL on the jedypod/gamut-compress master branch (updated 8 days ago).

Yes. As long as you add them to the import section above.

I see!

I think it might be useful if this could be translated to something outside of Python for easier use by colourists and assistants, especially independent/freelance owner-operators who might not be savvy with Python.