ACES Gamut Mapping VWG Meeting #12
Thursday, May 14, 2020
9:30am - 10:30am PT (Los Angeles time / 4:30pm UTC)
Please join us for the next meeting of this virtual working group (VWG). Future meeting dates for this month include:
- 5/21/20 5pm
- 5/28/20 9:30am
Dropbox Paper link for this group:
We will be using the same GoToMeeting url and phone numbers as in previous groups.
You may join via computer/smartphone (preferred), which will allow you to see any presentations or documents that are shared, or you can join by telephone for an audio-only experience.
Please note that meetings are recorded and transcribed and open to the public. By participating you are agreeing to the ACESCentral Virtual Working Group Participation Guidelines
Audio + Video
Please join my meeting from your computer, tablet or smartphone.
First GoToMeeting? Let’s do a quick system check: [https://link.gotomeeting.com/system-check]
You can also dial in using your phone.
Dial the closest number to your location and then follow the prompts to enter the access code.
United States: +1 (669) 224-3319
Access Code: 241-798-885
More phone numbers
Australia: +61 2 8355 1038
Austria: +43 7 2081 5337
Belgium: +32 28 93 7002
Canada: +1 (647) 497-9379
Denmark: +45 32 72 03 69
Finland: +358 923 17 0556
France: +33 170 950 590
Germany: +49 692 5736 7300
Ireland: +353 15 360 756
Italy: +39 0 230 57 81 80
Netherlands: +31 207 941 375
New Zealand: +64 9 913 2226
Norway: +47 21 93 37 37
Spain: +34 932 75 1230
Sweden: +46 853 527 818
Switzerland: +41 225 4599 60
United Kingdom: +44 330 221 0097
Looking forward to seeing everyone Thursday - please take a gander at the Progress Report before then if you haven’t yet had a chance!
@matthias.scharfenber, @carolalynn: Something worth discussing in the next meeting would be performing the gamut mapping in BT.2020 space instead of ACEScg/AP1. This would ensure that no chromaticities end up outside the spectral locus, and BT.2020 is the space whose basis is closest to ACEScg.
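A minimal numpy sketch of how one might quantify this, i.e. measure how much of an AP1-referred image falls outside BT.2020. The chromaticities are the published AP1/ACES-white and BT.2020/D65 coordinates; note the two white points differ, and the chromatic adaptation transform a production conversion would include is deliberately omitted here to keep the sketch short. The helper names are ours, not from any ACES reference implementation.

```python
import numpy as np

def rgb_to_xyz_matrix(primaries, white):
    # Standard derivation: columns of P are the XYZ of each primary
    # at Y = 1; each column is then scaled so that RGB = (1, 1, 1)
    # reproduces the white point.
    P = np.array([[x / y, 1.0, (1 - x - y) / y] for x, y in primaries]).T
    xw, yw = white
    W = np.array([xw / yw, 1.0, (1 - xw - yw) / yw])
    S = np.linalg.solve(P, W)  # per-primary scale factors
    return P * S               # scales column j by S[j]

# Published chromaticities: AP1 with the ACES (~D60) white, BT.2020 with D65.
AP1    = [(0.713, 0.293), (0.165, 0.830), (0.128, 0.044)]
BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
ACES_WHITE, D65 = (0.32168, 0.33767), (0.3127, 0.3290)

# NOTE: no chromatic adaptation between the two whites is applied here.
AP1_TO_2020 = (np.linalg.inv(rgb_to_xyz_matrix(BT2020, D65))
               @ rgb_to_xyz_matrix(AP1, ACES_WHITE))

def bt2020_oog_fraction(rgb):
    """Fraction of pixels with any negative BT.2020 component."""
    mapped = rgb.reshape(-1, 3) @ AP1_TO_2020.T
    return float(np.mean(np.any(mapped < 0.0, axis=1)))
```

Swapping in a Bradford chromatic adaptation between the ACES white and D65 would make the matrix match what common colour management tools produce.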
Please take a look at the compiled resources around the current state of algorithm development here:
For the next meetings, we’d love for everyone to have a chance to become familiar with the state of things - we’ll do our best to keep this area of the Dropbox Paper updated with relevant resources, as I know ACES Central is sometimes hard to keep up with. We’d really appreciate everyone coming to the next few meetings having tried out the work on the images in the Dropbox, as well as others you may have access to internally at your respective companies.
- No significant callouts on the Progress Report, going to finalize this week
@joachim.zell proposed the creation of a set of synthetic test images for convenient visual evaluation of gamut mapping
- Filmlight E-Gamut is based on a collection of real-world camera data and might be a good candidate for generating a synthetic test image with plausible “out-of-gamut” samples
- The IDT buzzer was used a number of times again.
@jedsmith’s well-received algorithm proposal was discussed at length and sparked a discussion of hue change vs. saturation change
- A side conversation focused on whether the default mapping target should be constrained by the spectral locus (e.g. BT.2020) instead of AP1. It was pointed out that in the AP1 RGB domain the spectral locus is a “fuzzy”, less significant boundary, and that real-world chromaticities only become important in the display-referred domain.
@KevinJW raised a final point that the group should focus not only on the effect of the algorithm on high-energy, saturated values but also on how it affects the dark values in the noise floor, and that tests should also be done on moving images, not just stills. @joseph offered to generate some sample footage for this.
@Alexander_Forsythe offered to look for some original ACES test imagery to share with the group.
This is not true: the claim that real-world chromaticities only become important in the display-referred domain overlooks physical rendering in both realtime and offline renderers. One simply cannot feed a physically based renderer non-physically-realisable values and expect it to produce physically correct imagery.
Not only that, but negative values are obviously a no-go as input to any renderer. For example, if a plate is used directly for reflection/refraction purposes in a shot, the last thing anyone wants is negative values there. The renderer will most likely clip them just to avoid chaos down the line, but you are still starting off on the wrong foot, with erroneous energy levels.
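As an aside, that "clip them just to avoid chaos" fallback is easy to sketch (a hypothetical helper, not any particular renderer's actual behaviour), and the amount of energy it discards is one crude way to quantify how destructive a hard clip is compared to a proper gamut mapping:

```python
import numpy as np

def clip_negatives(rgb):
    """Hard-clip negative components to zero and report how much
    energy was discarded in the process (a crude damage metric)."""
    clipped = np.maximum(rgb, 0.0)
    discarded = float(np.sum(clipped - rgb))  # total negative energy removed
    return clipped, discarded
```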
But if the rendering space is ACEScg, there won’t be negative values, will there?
Not if we do our job correctly, but some processes, e.g. spectral upsampling, can break or produce incorrect values when fed non-physically-realisable input, which is to say that the spectral locus is certainly not fuzzy in some applications.
I just submitted a DSLR image (Canon 5D Mark II) as we don’t have any; probably half the image is out of AP1 gamut and close to 70% is out of sRGB: