Thursday, November 5, 2020
9:30am - 10:30am Pacific Time (4:30pm UTC)
Please join us for the next meeting of this virtual working group (VWG). Future meeting dates for this month include:
TBD
Dropbox Paper link for this group:
We will be using the same GoToMeeting url and phone numbers as in previous groups.
You may join via computer/smartphone (preferred), which will allow you to see any presentations or documents that are shared, or you can join by telephone for an audio-only experience.
Please note that meetings are recorded, transcribed, and open to the public. By participating you are agreeing to the ACESCentral Virtual Working Group Participation Guidelines.
Audio Only
You can also dial in using your phone.
Dial the closest number to your location and then follow the prompts to enter the access code. United States: +1 (669) 224-3319 Access Code: 241-798-885
The meeting was spent going through the feedback from the first round of external testing by compositors and colourists. For details view the recording.
Remaining feedback is expected over the next few days, and a summary of the results will be posted on ACES Central.
I just watched the recording and noticed @Thomas_Mansencal's comment about very saturated Full CG footage.
For the record, we do intend to use the Gamut Mapper algorithm on our ACEScg footage IF needed. We are still not sure exactly how we might implement it in our workflow. I agree it is a side effect of the algorithm, but it is fixing some issues we have seen with very saturated laser-like lighting.
I have observed an increase in noise after applying the GM. I am not sure why it happens, or whether it would be an issue on animated footage, as I have so far only tested still frames.
Another question: do you need feedback from more compositors? I could ask a few friends or even post on LinkedIn.
Okay, I just finished watching the recording. It looks like you may not need saturated Full CG footage after all. Great job, guys! It was a great demo with some very interesting feedback!
Chris
Yes, it was just me being out of phase after a very long month of long hours on set, not having caught up with the group for a while and having forgotten that the test package was strictly restricted to Camera —> ACEScg.
Thinking about it more, we could have done some spectral renders, e.g. rendering the CornellBox, to put emphasis on defects.
The problems are that no modern motion picture camera's spectral sensitivities are publicly available, and pretty much no studio uses spectral rendering.
The main reason to raise this point is that when I did some spectral renders earlier this year, the decrease in saturation was not acceptable and would have required a larger power value. That is probably against the trend you see with real-life footage, except for people with HDR monitors.
I’m trying to plan for the future here. As I raised the point in the meeting, I think we will need two versions, one for SDR and one for HDR; and beyond that, we should obviously reserve the right to change the parameterisation in the future.
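For anyone following along, the "power value" being discussed controls the knee of the compression curve that rolls out-of-gamut distances back toward the gamut boundary. A minimal sketch of that kind of power-curve compression follows; the function name and default parameters are illustrative assumptions, not the group's actual parameterisation:

```python
def compress_distance(d, threshold=0.8, limit=1.2, power=1.2):
    """Compress a distance `d` from the achromatic axis.

    Values below `threshold` pass through untouched; values up to
    `limit` are smoothly rolled off so that `limit` lands exactly on
    the gamut boundary (1.0). All defaults here are hypothetical.
    """
    if d < threshold:
        return d  # protected zone: leave in-gamut values alone
    # Scale factor chosen so the curve passes exactly through (limit, 1.0).
    s = (limit - threshold) / (
        (((1 - threshold) / (limit - threshold)) ** -power - 1) ** (1 / power)
    )
    x = (d - threshold) / s
    return threshold + s * x / (1 + x ** power) ** (1 / power)
```

A larger `power` makes the knee harder, preserving more saturation near the boundary, which is exactly the SDR-vs-HDR trade-off raised above.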