Thursday, May 14, 2020
9:30am - 10:30am PT (Los Angeles time / 4:30pm UTC)
Please join us for the next meeting of this virtual working group (VWG). Future meeting dates for this month include:
5/21/20 5pm
5/28/20 9:30am
Dropbox Paper link for this group:
We will be using the same GoToMeeting url and phone numbers as in previous groups.
You may join via computer/smartphone (preferred) which will allow you to see any presentations or documents that are shared or you can join using a telephone which will be an audio only experience.
Please note that meetings are recorded, transcribed, and open to the public. By participating, you are agreeing to the ACESCentral Virtual Working Group Participation Guidelines.
Audio Only
You can also dial in using your phone.
Dial the closest number to your location and then follow the prompts to enter the access code. United States: +1 (669) 224-3319 Access Code: 241-798-885
@matthias.scharfenber, @carolalynn: Something worth discussing in the next meeting would be performing the gamut mapping in BT.2020 space instead of ACEScg/AP1. This would ensure that no chromaticities end up outside the spectral locus, and BT.2020 is the space whose basis is closest to ACEScg.
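To make the comparison concrete, here is a minimal sketch of converting ACEScg (AP1) values into BT.2020 to see which values fall outside it. Nothing here is quoted from either spec: the matrices are derived on the fly from the commonly published xy primaries and white points (AP1 with a D60 white, BT.2020 with D65), and chromatic adaptation between the two whites is deliberately omitted to keep the sketch short, so a production implementation would differ.

```python
def inverse_3x3(m):
    """Invert a 3x3 matrix via the adjugate."""
    (a, b, c), (d, e, f), (g, h, i) = m
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    return [
        [(e * i - f * h) / det, (c * h - b * i) / det, (b * f - c * e) / det],
        [(f * g - d * i) / det, (a * i - c * g) / det, (c * d - a * f) / det],
        [(d * h - e * g) / det, (b * g - a * h) / det, (a * e - b * d) / det],
    ]

def mat_vec(m, v):
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rgb_to_xyz_matrix(primaries, white):
    """RGB->XYZ matrix from xy primaries, scaled so (1,1,1) hits the white."""
    P = [[x / y for x, y in primaries],
         [1.0, 1.0, 1.0],
         [(1.0 - x - y) / y for x, y in primaries]]
    wx, wy = white
    W = (wx / wy, 1.0, (1.0 - wx - wy) / wy)
    S = mat_vec(inverse_3x3(P), W)  # per-primary scale factors
    return [[P[i][j] * S[j] for j in range(3)] for i in range(3)]

AP1_PRIMARIES = [(0.713, 0.293), (0.165, 0.830), (0.128, 0.044)]
D60_WHITE = (0.32168, 0.33767)
BT2020_PRIMARIES = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
D65_WHITE = (0.3127, 0.3290)

# No chromatic adaptation between D60 and D65 (simplification).
AP1_TO_2020 = mat_mul(
    inverse_3x3(rgb_to_xyz_matrix(BT2020_PRIMARIES, D65_WHITE)),
    rgb_to_xyz_matrix(AP1_PRIMARIES, D60_WHITE))

def ap1_to_bt2020(rgb):
    return mat_vec(AP1_TO_2020, rgb)

# A pure AP1 green lies outside BT.2020, so one channel goes negative:
print(ap1_to_bt2020((0.0, 1.0, 0.0)))
```

Checking for negative components after this conversion is then a cheap "is it plausibly inside the locus-bounded working gamut" test, which is the property the comment above is after.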
Please take a look at the compiled resources around the current state of algorithm development here:
For the next meetings, we’d love it if everyone had a chance to become familiar with the state of things - we’ll do our best to keep this area of the Dropbox Paper updated with relevant resources, as I know ACES Central is sometimes hard to keep up with. We’d really appreciate everyone coming to the next few meetings having tried out the work on the images in the Dropbox, as well as others you may have access to internally at your respective companies.
Call highlights:
No significant callouts on the Progress Report, going to finalize this week
@joachim.zell proposed creating a set of synthetic test images for convenient visual evaluation of gamut mapping
Filmlight E-Gamut is based on a collection of real-world camera data and might be a good candidate for generating a synthetic test image with plausible “out-of-gamut” samples
The IDT buzzer was used a number of times again.
@jedsmith’s well-received algorithm proposal was discussed at length and sparked a discussion of hue change vs. saturation change
A side conversation focused on whether the default mapping target should be constrained by the spectral locus (e.g. BT.2020) instead of AP1. It was pointed out that in the AP1 RGB domain the spectral locus is a “fuzzy”, less significant boundary, and that real-world chromaticities only become important in the display-referred domain.
@KevinJW raised a final point that the group should not only focus on the effect of the algorithm on high energy, saturated values but also on how it affects the dark values in the noise floor and that tests should also be done using moving images instead of stills only. @joseph offered to generate some sample footage for this.
@Alexander_Forsythe offered to look for some original ACES test imagery to share with the group.
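For readers who missed the call, the hue-vs-saturation discussion around @jedsmith’s proposal centres on algorithms that pull out-of-gamut values back toward the achromatic axis rather than clipping channels. The sketch below is emphatically not @jedsmith’s actual algorithm - it is a generic toy illustration of the desaturation-style idea, with an arbitrary threshold and an arbitrary tanh compression curve chosen purely for demonstration.

```python
import math

def soft_compress(d, threshold=0.8):
    """Identity below the threshold; asymptotically map everything
    above it into [threshold, 1.0). Curve and threshold are arbitrary."""
    if d <= threshold:
        return d
    return threshold + (1.0 - threshold) * math.tanh(
        (d - threshold) / (1.0 - threshold))

def compress_rgb(rgb):
    """Compress each channel's distance from the achromatic axis,
    leaving the largest channel (and thus the RGB ratios' 'hue'
    direction in this crude sense) untouched."""
    mx = max(rgb)
    if mx <= 0.0:
        return rgb
    # Normalised distance: 0 on the achromatic axis, 1 at channel == 0,
    # and > 1 for negative (out-of-gamut) channels.
    dist = [(mx - c) / mx for c in rgb]
    return tuple(mx - mx * soft_compress(d) for d in dist)

# Out-of-gamut sample: the negative blue is pulled back above zero,
# in-threshold channels and the maximum channel are left alone.
print(compress_rgb((1.0, 0.2, -0.3)))
```

The point the call kept returning to is visible even in this toy: compressing distance per channel changes saturation by construction, and whether (and how much) hue is allowed to shift is a separate design decision.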
This is not true: the claim that real-world chromaticities only become important in the display-referred domain overlooks physical rendering in both realtime and offline renderers. One simply cannot feed a physically based renderer non-physically-realisable values and expect it to produce physically correct imagery.
Not only that, but negative values are obviously a no-go as input to any renderer. For example, if a plate is used directly for reflection/refraction purposes in a shot, the last thing anyone wants is negative values there. The renderer will most likely clip them just to avoid chaos down the line, but you are still starting off on the wrong foot, with erroneous energy levels.
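To illustrate the point (this is not any specific renderer’s code, just toy arithmetic): a reflection lookup that multiplies a plate sample by an albedo will happily propagate a negative channel as negative outgoing energy, which is physically meaningless.

```python
def reflect(plate_rgb, albedo):
    """Toy Lambertian-style reflection: incoming plate energy times albedo."""
    return tuple(p * a for p, a in zip(plate_rgb, albedo))

plate = (2.5, 0.8, -0.4)          # out-of-gamut sample with negative blue
albedo = (0.18, 0.18, 0.18)
print(reflect(plate, albedo))     # blue channel carries negative energy

# Defensive clamp before the renderer sees it: avoids chaos downstream,
# but the energy is now simply wrong rather than gamut-mapped.
safe = tuple(max(0.0, c) for c in plate)
```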
Not if we do our job correctly, but some processes, e.g. spectral upsampling, can break or produce incorrect values if fed non-physically-realisable values, which is to say that the spectral locus is certainly not fuzzy in some applications.