ACES Gamut Mapping VWG Meeting #1
Thursday, January 30, 2020
11:00am - 12:00pm PST (Los Angeles time / UTC-08:00)
Please join us for the first meeting of this virtual working group (VWG).
Dropbox Paper link for this group:
We will be using the same GoToMeeting url and phone numbers as in previous groups.
You may join via computer/smartphone (preferred), which will allow you to see any presentations or documents that are shared, or you can join by telephone for an audio-only experience.
Please note that meetings are recorded, transcribed, and open to the public. By participating you are agreeing to the ACESCentral Virtual Working Group Participation Guidelines.
Audio + Video
Please join my meeting from your computer, tablet or smartphone.
First GoToMeeting? Let’s do a quick system check: https://link.gotomeeting.com/system-check
You can also dial in using your phone.
Dial the closest number to your location and then follow the prompts to enter the access code.
United States: +1 (669) 224-3319
Access Code: 241-798-885
More phone numbers
Australia: +61 2 8355 1038
Austria: +43 7 2081 5337
Belgium: +32 28 93 7002
Canada: +1 (647) 497-9379
Denmark: +45 32 72 03 69
Finland: +358 923 17 0556
France: +33 170 950 590
Germany: +49 692 5736 7300
Ireland: +353 15 360 756
Italy: +39 0 230 57 81 80
Netherlands: +31 207 941 375
New Zealand: +64 9 913 2226
Norway: +47 21 93 37 37
Spain: +34 932 75 1230
Sweden: +46 853 527 818
Switzerland: +41 225 4599 60
United Kingdom: +44 330 221 0097
Carol, Matthias, any prep you’d like us to do for this first meeting? Do you have an informal agenda?
Hi Joseph! This first meeting will be pretty informal. We’ll be going over the scope and goals/expectations for the group, expanding on the proposal a bit, and making sure everyone knows each other.
We’ll likely open the floor towards the end for anything participants would like to share, and plan a bit of structure for our meetings going forward.
Hi guys! Nice to meet you all, and great first meeting… I’d like to put my hand up for helping set up and maintain our assets and versioning tools, so that everyone can reproduce the results from any version of the tests as we progress. I’m probably not the most experienced at this, but I have been grappling with the same problem in the development work I do with the image scientists I work with, and I’m keen to come up with a good system. Who else would like to handle this with me? Scott, Remi, Thomas? Sorry if I’ve left out names that may be obvious for this challenge; this is a new community for me.
Apologies for not making the meeting; I had a previous engagement.
Great to see some lively discussion, the phrase that jumps to mind is “Herding CATs: …”
Anyway, I’d like to add my support for a clear distinction between scene-referred and display-referred mapping approaches, as well as for being clear on what the metric should be to judge good vs. bad, which I feel can be quite context-specific. Conceptually, I see at least four different ‘observer’ classifications:
- Camera sensitivities (Capture)
- Rendering sensitivities/primary encoding (Simulation)
- Editing/Modification working spaces (Editing)
- Display related (Reproduction)
These probably live on a spectrum from scene to screen, but those are the set that I’ve considered important.
The closer to the screen, the more we could lean on existing research by the CIE/colour science community.
I suspect that number 2 is more similar to cameras (number 1) than it is to number 3, but it may benefit from usability factors that come from an appropriate working-space choice. I see number 2 as different from light capture with cameras because it is generally not physically constrained, and the non-spectral rendering case needs different considerations.
It is likely that number 4 needs breaking into parts related to appearance modelling and reproduction across different environments: theater vs. TV vs. mobile, or SDR vs. HDR.
Something I think is sensible to include, though I know it adds some complexity, would be the effective film encodings coming out of the ADX encoding primaries, as well as those from stills cameras.
I think that film adds cases that are not necessarily balanced/optimised for near-D65, and may not lie as clearly on the line given by the green primary as outlined in the proposal document.
Thanks, Kevin! I’m definitely interested in hearing your experiences with ADX thus far; it’s a great point about the assumptions made in the proposal. If you have any recent ADX scans that are shareable, I think they would be a really important data point to have in our test set.
Hi team!! I’m really eager to meet again; it was really interesting from a colorist’s point of view. I asked some of my clients, and I can provide some out-of-gamut footage to test. I’d like to know what you need (format) and how to share it with you.
Hi Fabian! Thanks for your interest. The notice for the next meeting is going up today; it will be this Thursday, February 20th @ 9:30am PST. We’ll be alternating mornings and evenings on Thursdays from here on out to accommodate time zones.
As far as footage format goes, great question. I have some ideas, but I’ll put it on the agenda for Thursday to make sure we all agree.