Background
I have been following the conversation about gamut mapping with great interest since it started. I come at the topic from a feature film VFX perspective, which might be useful here. My background is compositing.
In Comp…
In comp the plate is king. All the work we do strives to preserve the integrity of the original camera photography: integration of CG renders, matte paintings, reconstruction, cleanup, etc. In the last few years it has become more common to adopt ACEScg as the working gamut. This approach has advantages: it helps with consistency of imagery from the CG pipeline, and with consistency of the color pipeline between shows. But it also comes with some problems.
On quite a few shows over the years I have seen the same problem resurface: highly saturated light sources and flares causing out-of-gamut artifacts. These negative pixels cause issues for comp. We are familiar with negative pixels in grain and shadow areas, but it becomes difficult to do our work when the areas of the image we actually need to work on have this artifact.
I have seen the fallout of a few different approaches to solutions. Leaving the artifacts for comp to deal with can be dangerous and can require a lot of effort to QC comps and support artists. The “Blue light fix” LMT 3x3 matrix essentially shifts the working gamut of the show and can cause problems with CG not matching plates. It can also cause problems when the fix is reversed on delivery, with certain colors getting more saturated than intended. And if you’re shifting the working gamut of the show anyway, why not just keep the original camera gamut?
Requirements for VFX
In VFX we are required to deliver comps back that exactly match the plates we received, except for the work we did. For this reason, reversing the gamut compression perfectly is critical. However, I am acutely aware of the danger of “gamut expansion”, as @daniele pointed out. Great care would need to be taken here to ensure things don’t go off the rails.
So to summarize the biggest things comp cares about:
1). Fix the issue so we can do our work.
2). Reverse perfectly so we can deliver our work back to the client.
3). Preserve linearity and hue of the plate as much as possible.
What would it look like?
How a gamut compression tool might be used in VFX is an interesting question. The process might look something like this:
- A 3D LUT (applied in the color pipeline / OCIO), possibly created with a Nuke node
- Precisely tuned and optimized for the camera gamut and problematic images of the show
- Applied on plate ingest
- Reversed for view transform
- Reversed on plate delivery
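To make the requirement of a perfect reverse concrete, here is a minimal Python sketch. The function names and the straight linear remap are my own illustration, not an existing tool: a compression of a normalized “distance from the achromatic axis” with an exact analytic inverse, so ingest and delivery round-trip to within float precision.

```python
def compress_dist(d, threshold=0.8, max_dist=1.2):
    """Compress distances in [threshold, max_dist] down to [threshold, 1.0].

    Values inside the confidence core (d <= threshold) pass through
    untouched, so only out-of-gamut pixels are ever modified.
    """
    if d <= threshold:
        return d
    return threshold + (d - threshold) * (1.0 - threshold) / (max_dist - threshold)


def uncompress_dist(d, threshold=0.8, max_dist=1.2):
    """Exact analytic inverse of compress_dist, for plate delivery."""
    if d <= threshold:
        return d
    return threshold + (d - threshold) * (max_dist - threshold) / (1.0 - threshold)
```

A more gradual curve could be swapped in for the linear remap, as long as it stays monotonic and analytically invertible; that invertibility is what makes baking the forward and inverse transforms into the show’s ingest and delivery LUTs safe.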
A Sketch of a Simpler Approach
All of that said, I have been thinking a lot about @daniele’s comment from meeting #8 that the Baselight gamut compression algorithm works purely in RGB, without the difficulty and complexity of defining the neutral axis.
Given that
1). Cameras are commonly not colorimetric devices (the source of the problem in the first place)
2). Color accuracy is not critical as long as the gamut compression is reversed accurately, linearity is mostly preserved, and hue is roughly correct
I thought it would be interesting to play around and see what I could come up with using a simple approach based on RGB saturation.
Attached is a Nuke script with what I was able to build with my novice color science brain.
- A saturation algorithm is used to desaturate, similar to Nuke’s “Maximum” luminance math but weighted per color channel.
- A core or “confidence” gamut limits the area that the desaturation affects. The size of the core gamut is set with a threshold adjustment, and a maximum saturation must also be set: how far outside the gamut pixels exist, which translates into how far above 1.0 the saturation key goes.
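To make the idea concrete, here is a rough Python sketch of the approach as described above. The function name, default values, and the straight linear remap are my own illustration, not the attached Nuke node:

```python
def compress_pixel(r, g, b, threshold=0.8, max_sat=1.2,
                   weights=(1.0, 1.0, 1.0)):
    """Desaturate out-of-gamut pixels toward a weighted-max achromatic axis.

    threshold sets the size of the confidence core (the untouched region);
    max_sat sets how far above 1.0 the saturation key is expected to go.
    """
    # Achromatic axis: weighted maximum of the channels, similar in
    # spirit to Nuke's "Maximum" luminance math.
    ach = max(r * weights[0], g * weights[1], b * weights[2])
    if ach <= 0.0:
        return (r, g, b)
    out = []
    for c in (r, g, b):
        # Normalized distance from the achromatic axis:
        # 0 = on the axis, 1 = at the gamut boundary, >1 = out of gamut.
        d = (ach - c) / ach
        if d <= threshold:
            out.append(c)  # inside the confidence core: untouched
        else:
            # Linearly remap [threshold, max_sat] onto [threshold, 1.0],
            # so the worst expected pixel lands exactly on the boundary.
            d = threshold + (d - threshold) * (1.0 - threshold) / (max_sat - threshold)
            out.append(ach - d * ach)
    return tuple(out)
```

In practice this would run per pixel in an Expression or Blinkscript node; the linear remap could be swapped for any monotonic, invertible curve.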
The attached nuke script has a node with a sketch of the idea set up, and keyframes to show the behavior on the example images.
For how simple the approach is, I think the results are quite good. It would be great to be able to adjust the weighting for the cyan, magenta, and yellow directions as well, but I have not yet figured out how to do that technically. (Help or ideas would be appreciated here!) Maybe there are other methods of adjusting saturation that would allow more precise control over “color direction”? Or other methods of controlling the direction of the hue vector?
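One possible angle on the CMY weighting question, offered purely as a sketch I have not validated: the per-channel distance from the achromatic axis already corresponds to a direction — distance on the R channel is the cyan direction, G is magenta, B is yellow — so giving each channel its own threshold and limit would provide independent control over those three directions.

```python
def compress_pixel_cmy(r, g, b,
                       thresholds=(0.8, 0.8, 0.8),  # per-direction core size (C, M, Y)
                       limits=(1.2, 1.2, 1.2)):     # per-direction max distance
    """Like a single-threshold RGB desat, except each channel's distance
    below the max is compressed with its own threshold/limit pair, giving
    separate control over the cyan, magenta, and yellow directions."""
    ach = max(r, g, b)
    if ach <= 0.0:
        return (r, g, b)
    out = []
    for c, th, lim in zip((r, g, b), thresholds, limits):
        d = (ach - c) / ach  # distance from the achromatic axis along this channel
        if d <= th:
            out.append(c)  # inside this direction's confidence core
        else:
            d = th + (d - th) * (1.0 - th) / (lim - th)
            out.append(ach - d * ach)
    return tuple(out)
```

Whether the achromatic axis should still be a weighted max here, and whether hue skews as the three directions are adjusted independently, is exactly the part I’d want more eyes on.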
Bonus
I’ve also included a version of @matthias.scharfenber’s “Norm. HSV Sat. Softclip in LMS + Purple Suppression” method, with a softclip implementing the function that @nick shared in the “Simplistic Gamut Mapping Approaches in Nuke” thread. Based on all the testing I’ve done, it seems critical to be able to precisely place the start and end of the softclip, and the end may very well not be at 1.0 if there are out-of-gamut values.
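I don’t want to misquote @nick’s exact function, so as a stand-in illustration of why placing both ends matters, here is a generic cubic-Hermite softclip sketch (the name and the specific curve are illustrative only) where both the start and the end of the roll-off are explicit parameters, and the end can sit well above 1.0:

```python
def softclip(x, start=0.8, end=1.4, clip=1.0):
    """Smoothly compress inputs in [start, end] into [start, clip].

    Below `start`, values pass through untouched; input `end` maps
    exactly to `clip` with zero slope; and the curve leaves `start`
    with slope 1 so there is no visible break.
    Monotonic as long as (end - start) <= 3 * (clip - start).
    """
    if x <= start:
        return x
    if x >= end:
        return clip
    t = (x - start) / (end - start)
    # Cubic Hermite: g(0)=0, g(1)=1, g'(0)=k, g'(1)=0, with k chosen
    # so the overall curve has slope 1 at `start`.
    k = (end - start) / (clip - start)
    g = k * t + (3.0 - 2.0 * k) * t * t + (k - 2.0) * t ** 3
    return start + (clip - start) * g
```

Being able to push `end` out to wherever the largest out-of-gamut values actually live, rather than pinning it at 1.0, is the knob that mattered most in my testing.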
For those of us with a full Nuke license, there’s also a Blinkscript Gamut Plot node, which is less prone to frustrating unresponsiveness than the PositionToPoints + ScanlineRender plotting method.
The Nuke script is available here:
Curious to hear what you all think and sorry for the wall of text!