Notice of Meeting - ACES Gamut Mapping VWG - Meeting #18 - 6/25/2020

ACES Gamut Mapping VWG Meeting #18

Thursday, June 25, 2020
9:30am - 10:30am Pacific Time (4:30pm - 5:30pm UTC)

Please join us for the next meeting of this virtual working group (VWG). Future meeting dates for this month include:

  • TBD

Dropbox Paper link for this group:

We will be using the same GoToMeeting URL and phone numbers as in previous groups.
You may join via computer/smartphone (preferred), which will allow you to see any presentations or documents that are shared, or you may join using a telephone, which will be an audio-only experience.

Please note that meetings are recorded, transcribed, and open to the public. By participating you are agreeing to the ACESCentral Virtual Working Group Participation Guidelines.

Audio + Video
Please join my meeting from your computer, tablet or smartphone.
[https://global.gotomeeting.com/join/241798885]

First GoToMeeting? Let’s do a quick system check: [https://link.gotomeeting.com/system-check]

Audio Only
You can also dial in using your phone.
Dial the closest number to your location and then follow the prompts to enter the access code.
United States: +1 (669) 224-3319
Access Code: 241-798-885

More phone numbers
Australia: +61 2 8355 1038
Austria: +43 7 2081 5337
Belgium: +32 28 93 7002
Canada: +1 (647) 497-9379
Denmark: +45 32 72 03 69
Finland: +358 923 17 0556
France: +33 170 950 590
Germany: +49 692 5736 7300
Ireland: +353 15 360 756
Italy: +39 0 230 57 81 80
Netherlands: +31 207 941 375
New Zealand: +64 9 913 2226
Norway: +47 21 93 37 37
Spain: +34 932 75 1230
Sweden: +46 853 527 818
Switzerland: +41 225 4599 60
United Kingdom: +44 330 221 0097

For those who asked:

OutOfGamut_ACES_Gamut_Mapping_VWG

This is what I showed. It's very basic, and I'm not sure it is "current" in terms of the method for identifying the inside vs. outside segmentation, but it is easy enough to add to or modify.

Kevin
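
For reference, here is a minimal sketch of an out-of-gamut visualizer in the same vein as what Kevin describes: pixels that leave AP1 after the AP0-to-AP1 matrix keep their full color, while everything in-gamut is desaturated. This is an illustration of the idea, not Kevin's actual tool; the matrix and luminance weights below are the commonly published ACES values and should be checked against the official CTL/OCIO configs.

```python
import numpy as np

# AP0 -> AP1 (ACES2065-1 -> ACEScg primaries), rounded published values.
AP0_TO_AP1 = np.array([
    [ 1.4514393161, -0.2365107469, -0.2149285693],
    [-0.0765537734,  1.1762296998, -0.0996759264],
    [ 0.0083161484, -0.0060324498,  0.9977163014],
])

# Luminance weights of the AP1 primaries (Y row of the AP1 -> XYZ matrix).
AP1_LUMA = np.array([0.2722287168, 0.6740817658, 0.0536895174])

def out_of_gamut_overlay(rgb_ap0):
    """Desaturate in-gamut pixels; keep out-of-AP1 pixels in full color.

    rgb_ap0: float array of shape (H, W, 3), linear AP0 values.
    """
    ap1 = rgb_ap0 @ AP0_TO_AP1.T
    # A pixel is "outside" AP1 if any channel went negative after the matrix.
    outside = (ap1 < 0.0).any(axis=-1, keepdims=True)
    luma = (ap1 * AP1_LUMA).sum(axis=-1, keepdims=True)
    return np.where(outside, ap1, np.repeat(luma, 3, axis=-1))
```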


Summary here, sorry for the delay:

  • We started the meeting by revisiting testing procedures. Some test operations were defined but still need fleshing out (a tiny example of why the spatial operations matter follows this list):
    • Filter, key, degrain, alpha blending, legacy blending modes (display-referred), blurs/spatial operations
  • @joseph suggested a visualizer to show what is outside the safety zone in an image, as an aid for testing
    • @KevinJW showed his work masking and desaturating colors to show what lies outside the safety gamut / AP1
  • It was brought up that special attention needs to be paid to workflows for animation and motion graphics, where super-saturated colors are common and desirable. Kevin pointed out that motion graphics are not always sRGB these days; they are often P3. We don't want to make it harder than it already is to bring these elements into a workable ACES space.
  • Most of the discussion (there was a lot of back and forth; please watch the recording for full detail) revolved around implementation of the algorithm - where and how it should fit into the pipeline. There are basically two options (it's a bit more nuanced, but for the sake of discussion):
    • Include the gamut mapping algorithm as a part of every AP0 to AP1 colorspace transform
    • Create a separate additional operator for the gamut mapping algorithm, and define pipeline constraints (such as usage via AMF) to aid users in production application
  • We acknowledge that there is a lot to unpack and test here, and therefore suggest starting a written pro/con list to facilitate further discussion. We will create another ACESCentral post in that vein, and a slot on the Dropbox Paper, so we're all working on the same thing.
  • @joseph shared the ARRI workflow diagrams as a visual aid to discuss the position of the transform: https://www.arri.com/en/learn-help/learn-help-camera-system/camera-workflow/image-science/aces/post-production
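
To make the first bullet concrete, here is a tiny made-up demonstration of why blurs/spatial operations are on that test list: a box blur averages neighbors, so a single out-of-gamut (negative) component leaks into adjacent pixels, and gamut mapping applied before the blur gives a different result than gamut mapping applied after it.

```python
import numpy as np

# One scanline of one channel, with a single out-of-gamut (negative) sample.
row = np.array([0.5, 0.5, -0.4, 0.5, 0.5])
box = np.ones(3) / 3.0

print(np.convolve(row, box, mode="same"))
# -> [0.333 0.2 0.2 0.2 0.333]: the negative value leaked into its neighbors.

print(np.convolve(np.maximum(row, 0.0), box, mode="same"))
# -> [0.333 0.333 0.333 0.333 0.333]: mapping (here a crude clamp) before
#    the blur yields a different image than blurring first, which is why
#    operation order is part of the testing discussion.
```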

Hi,

I was listening to the last meeting and slowly becoming aware of the gamut problems with ACES,
thanks also to "troy_s".

I wanted to give some feedback from a user perspective, as a Nuke compositor and an online artist on Flame.
I have been working with ACES in commercials for more than three years, and a lot of them were car commercials.
The main problem areas are the car paint and the head- and taillights, as they mostly have LED lamps nowadays,
which produce the highest values. Add anamorphic lenses into the mix, and this often creates some nasty values around the lamps.

Some years back, before we used ACES on regular jobs and were still testing out the possibilities, I experimented with a red taillight shot from a car. See the PDF from 2017. I used the HueCorrect node, a soft clip in HLS (but only for saturation), and a Color Correction node.
ACES in Steps_v04_gamut_only.pdf (6.2 MB)
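
For readers who can't open the PDF, here is a rough sketch of the "soft clip in HLS, but only for saturation" idea: saturation above a knee is rolled off smoothly toward a limit while the luma-like component is left alone. The Rec.709 weights and the knee/limit values are illustrative placeholders, not the values from Daniel's 2017 setup.

```python
import numpy as np

def soft_clip(x, knee, limit):
    """Identity below `knee`, then roll off asymptotically toward `limit`."""
    span = limit - knee
    return np.where(x <= knee,
                    x,
                    knee + span * (1.0 - np.exp(-(x - knee) / span)))

def soft_clip_saturation(rgb, knee=0.9, limit=1.2):
    """Compress only the saturation of linear RGB, leaving luma untouched.

    An analogue of a saturation-only HLS soft clip; the weights and
    thresholds here are assumptions for the example.
    """
    luma = (rgb @ np.array([0.2126, 0.7152, 0.0722]))[..., None]
    chroma = rgb - luma
    sat = np.abs(chroma).max(axis=-1, keepdims=True) / np.maximum(luma, 1e-6)
    scale = soft_clip(sat, knee, limit) / np.maximum(sat, 1e-6)
    return luma + chroma * scale
```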

ACES didn't fix the problem of the high values then, but at least I was able to "massage" the image into a better look, and that was already a big advantage for me at the time. The result is far from perfect when I see it today, but I was happy with it.

Later, in several projects, we regularly had to use the neon suppression fix in Nuke, which we reversed back out at the end of the comp, provided no one forgot to do it. Often the neon suppression fix was then applied again in Resolve for the grading.

This means that at some point the correction has to be applied anyway, because no one wants to see these ugly artifacts.
So I can't see any reason why I would want to get them back at any point.

At the end of the discussion I was looking at the ARRI schematic, and I saw two places where these "extreme" values should be fixed: in my opinion, the IDT and the RRT/ODT.

As a user, I don't want to be presented with a bad-looking image by any camera in the first place. I don't want to be forced to use a fix somewhere in the pipeline. This is not user friendly. For me, it is a bug in the whole system if I run into it often.

Here is an off-topic example: if I take my DSLR with its simple sRGB view transform, and my girlfriend next to me takes "better looking" pictures with an iPhone X, at first sight on the display I know which photo I would rather start working with in "post".
Sure, I grade my DSLR photos with Capture One later on; the quality is better altogether and I have more possibilities to tweak my images. But the temptation is there: why not start working with an image that looks better? Is it professional that I have to know about a "neon highlight fix" matrix?

If only the IDT takes care of the problem, then no camera can create these extreme values anymore. But in comp I could easily create even higher values by adding or correcting elements, and in grading, the moment I use gamma operations I might also increase values dramatically. That's why I think the RRT/ODT must also take care of these extreme values and map them back properly, so that they cannot escape the spectral locus.
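
As a quick numeric illustration of that point (values invented for the example): even if the IDT delivers tame values, an additive merge doubles them, and a grade-style gamma below 1.0, applied as x^(1/gamma) in the usual grading convention, squares them.

```python
import numpy as np

plate = np.array([0.18, 0.8, 4.0])   # a highlight already above 1.0

merged = plate + plate               # additive merge of two elements
print(merged)                        # [0.36  1.6   8.  ]

gamma = 0.5
graded = merged ** (1.0 / gamma)     # gamma 0.5 -> squaring the values
print(graded)                        # [0.1296  2.56  64.  ]
```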

At last I have a question:

If I understand it right, no color visible to humans can be outside the horseshoe anyway?
And I would need 31 primaries, one for each 10 nm step, to describe the spectral locus more precisely?
I think I understand why the AWG green and blue primaries are outside the spectral locus: this way they can cover
more "space" for these colors inside the spectral locus. But shouldn't all my possible colors be constrained by the spectral locus at some point?

I hope this is the right place to give this kind of feedback, and I am sorry if I misused a color-related term here or there.

Best regards

Daniel

Hi Daniel! Thanks for the comment and info, much appreciated! I’ll try and respond to a few things here, though there’s a lot to it!

First, we're working on putting together some test images/operations for users like yourself to try out. We'd love your feedback at that point; it should address the issues you raise.

Thanks for the note about invertibility - definitely something we're still debating. On the user side, people likely don't need the inverse back, but there's also the archive-and-posterity side, which in the ACES standard is AP0 - so changing that is a big deal.

Yes! If you check out the doc @daniele wrote towards the beginning of the VWG, he points out four places in the pipeline where gamut mapping is possibly needed: https://www.dropbox.com/s/eazgy6lg7l4bofg/20200201_GamutMapping_some_inital_thoughts.pdf?dl=0

We acknowledge this group is only one part of a whole, and hopefully more work will follow in the IDT and ODT space (working groups starting soon!)

To your questions:

Correct. The spectral locus is the boundary traced by the visible spectrum on the chromaticity diagram, which makes that horseshoe shape; no color a human can see plots outside it. That's a pretty basic response, but you are essentially correct.

Also true at a basic level. Camera sensors are not colorimeters, and none (yet) fulfills the Luther-Ives condition; they therefore have to make "allowances" in the design and implementation of the mapping from "native" sensor colorimetry to scene-referred colorimetry (the ACES IDT). Camera manufacturers can speak more to this, but note it's not just ALEXA Wide Gamut that has primaries outside the spectral locus - all major cinema cameras are the same in this way.
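
To make that "primaries outside the spectral locus" point concrete, here is a small sketch (chromaticities quoted from the published ARRI and ACES specifications; double-check them there). The ALEXA Wide Gamut blue primary has a negative y, which no physical stimulus can have, but that is exactly what lets the AWG triangle enclose deep blue chromaticities that the AP1 triangle misses.

```python
# Published xy chromaticities (quoted values; verify against the specs).
AWG = {"R": (0.6840, 0.3130), "G": (0.2210, 0.8480), "B": (0.0861, -0.1020)}
AP1 = {"R": (0.7130, 0.2930), "G": (0.1650, 0.8300), "B": (0.1280, 0.0440)}

def inside_triangle(p, a, b, c):
    """Same-side-of-every-edge test for a 2-D point in triangle abc."""
    def cross(o, u, v):
        return (u[0] - o[0]) * (v[1] - o[1]) - (u[1] - o[1]) * (v[0] - o[0])
    signs = [cross(a, b, p) >= 0, cross(b, c, p) >= 0, cross(c, a, p) >= 0]
    return all(signs) or not any(signs)

print(AWG["B"][1] < 0)                            # True: non-physical primary

deep_blue = (0.11, 0.0)                           # made-up chromaticity
print(inside_triangle(deep_blue, *AWG.values()))  # True: inside AWG
print(inside_triangle(deep_blue, *AP1.values()))  # False: outside AP1
```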

There’s a lot more to expand on here but wanted to get these quick responses to you. Looking forward to having you on future calls!


Hi Carol,

thanks for your detailed replies.

I am looking forward to trying out some new possibilities soon for handling extreme colors like those in LED car lights.

I studied the document from Daniele. I had to read it a few times though, but I think I get it now. And I think I tried to express something similar, just in very simple words and without Daniele's expertise 🙂

I will try to join one of the next calls again. The one this week is in the middle of the night here in Germany, so I think I will pass.
But I will follow the conversation on the forum, and I look forward to trying out some solutions soon.

Best regards and many thanks again

Daniel
