This is what’s happening with the current transform by default with the cusp smoothing enabled. The actual shape for the gamut mapping intersection is a rounded shape like the one in your example.
I tested using P3 as the limiting gamut for Rec.709, clipping the end result to Rec.709. As expected, the image mostly looks the same, and fine, but there are more “traditional”-type skews, especially visible in green. Here are a few comparison images: the first is v035 Rec.709, the second is v035 Rec.709 with P3 as the mapping gamut, and the third is ARRI Reveal Rec.709.
I usually take the 1-1-1 tagged QT in FCPX and use a colorspace override to see the result in HDR. The transform in the OCIO config does not even specify if it is HLG or PQ. I am confused.
With the OCIOv2 studio config from Nuke 14 this workflow works without any issue. Same for ARRI Reveal via a LUT or Truelight T-Cam.
I’m probably not going to be able to help much with this unfortunately. I’m not writing out to HDR, but simply viewing the HDR on my Mac Pro M1 from inside of Nuke.
In case it’s helpful, in Nuke I’m using the Nuke node for the CAM_DRTv35 and have it set to 1000 nits peak luminance, output encoding sRGB/P3-D65 (which is what a Mac Pro uses, but would be different in your case), and clamp output off.
The blue sphere shows a magenta highlight, and with a lifted gamma before the LUT, the whole sphere looks magenta.
The rendering was done in linear sRGB, and after a while I think I realised why this happens. The base color in the principled shader is set to 0/0/0.800. By converting the primaries in Nuke from linear sRGB to ACEScg, the pure tristimulus values in the shader and in the render take on a new meaning in the EXR values in ACEScg.
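To illustrate the point (a sketch only, using the commonly published Bradford-adapted BT.709 → AP1 matrix, with coefficients rounded for readability):

```python
# Sketch: convert the shader's "pure" blue (0, 0, 0.8) from linear
# sRGB/Rec.709 primaries to ACEScg (AP1).  Matrix values are the
# commonly published Bradford-adapted BT.709 -> AP1 coefficients,
# rounded to 4 decimals for illustration.
REC709_TO_ACESCG = [
    [0.6131, 0.3395, 0.0474],
    [0.0702, 0.9164, 0.0134],
    [0.0206, 0.1096, 0.8698],
]

def rec709_to_acescg(rgb):
    return [sum(m * c for m, c in zip(row, rgb)) for row in REC709_TO_ACESCG]

blue_709 = [0.0, 0.0, 0.8]
blue_acescg = rec709_to_acescg(blue_709)
# Roughly [0.038, 0.011, 0.696]: the converted value is no longer a
# pure single-channel blue -- all three channels are now non-zero, so
# the "pure" tristimulus value really does take on a new meaning.
```

This is why a value that sat on a gamut edge in the render space no longer sits on an edge after the primaries conversion.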
I imagined the cusp smoothing as “effectively” the same effect, but can it easily be constrained to the gamut boundary of a particular display?
I ask this because I thought it was implemented (or a version of it) long before Nick’s gamut boundary target/algorithm was implemented, or has it been re-configured to operate in/with that step?
How exactly does the cusp smoothing interact with the boundary finding algorithm?
My idea of trying some in-between abstract or “virtual display” primaries was mostly to serve as aid or diagnostic to help modify the settings/behavior of the compression since it is responsive to (and quite sensitive to) the position of the cusp.
We have more or less expected behavior in Rec.709, but less than expected (not necessarily “bad”) with P3. May not be the biggest problem, but I am still curious what is required to get less clipping in the P3 output, and of course having more control or understanding of the boundary finding/gamut compression couldn’t hurt.
After I figured out how to interpret the ProRes from Nuke in the right way in FCPX, here is an HLG playout of the same rendering in four flavours:
This clip plays fine for me as HDR in Safari on an iMac display and on Apple XDR (iPhone/iPad Pro/MBP) displays.
I wonder, since the primaries in the render scene are no wider than sRGB, how the sRGB/Rec.709 image can still turn so magenta, while the blue sphere in the HLG playout just appears blue.
If one squints while looking at the small balls, we can see that the v35 red and especially blue are less bright than Reveal’s; their cores are more muted:
It’s only taking the cusp of the limiting gamut and then moving the cusp outward before rounding it. It’s moved outward to avoid reducing the gamut. If the limiting gamut was always larger than the display gamut, then it could perhaps be just rounded without moving the cusp.
The current rounding was introduced at the same time as Nick’s gamut mapper. The original gamut mapper had smoothing as well, but that was effectively rounding the resulting display cube corners, reducing the gamut. The current one was my attempt to avoid reducing the gamut.
The rounding is done after calculating the intersection to the boundary with a sharp cusp (triangle effectively). In Blink code it’s done in the find_gamut_intersection() function. The boundary intersection itself is an approximation.
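A deliberately simplified 2D sketch of the idea described above (not the actual find_gamut_intersection() code; all names and numbers are illustrative): in a constant-hue J/M slice, the sharp boundary is a triangle from black through the cusp to white, the cusp is first displaced outward, and a smooth minimum of the two edges then rounds the corner without pulling the boundary inside the real gamut.

```python
def rounded_boundary_M(J, J_cusp, M_cusp, J_max=100.0, smooth=0.1):
    # Illustrative sketch only -- not the actual kernel implementation.
    # 1) Displace the cusp outward so the subsequent rounding does not
    #    reduce the gamut below the true triangular boundary.
    M_c = M_cusp * (1.0 + smooth)
    # 2) The two straight edges of the sharp (triangular) boundary:
    lower = M_c * J / J_cusp                      # black -> cusp edge
    upper = M_c * (J_max - J) / (J_max - J_cusp)  # cusp -> white edge
    # 3) A polynomial smooth minimum of the two edges rounds the cusp.
    k = smooth * M_c
    h = max(k - abs(lower - upper), 0.0) / k
    return min(lower, upper) - h * h * k * 0.25
```

At the cusp itself the rounded boundary still sits outside the original (undisplaced) cusp, which is the whole point of moving the cusp outward before rounding.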
I do remember trying to use chroma compression space primaries (P3-like) as the limiting gamut. The end result was similar to using P3 as the limiting gamut for Rec.709, if memory serves.
@ChrisBrejon’s post above got me itching, so I “forcibly” found some cycles to create those images I had been meaning to do for a long while (I would also like to shoot some skin references against the LED Wall). They are spectrally rendered with Mitsuba 3 using the Standard Human Observer. The first six columns are the primaries and secondaries of BT.2020 using gaussian SPDs (10nm FWHM), i.e. 630nm, 571nm, 532nm, 493nm, 467nm, and 630 + 467nm; the last two columns are 400nm and 700nm, bordering the visible spectrum. The SPDs have been normalised against the luminous flux of a similar gaussian SPD at 555nm, but that is obviously not enough for them to be equally bright. The next step is to render them with a virtual camera to generate physically non-realisable values and maybe address the brightness.
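The normalisation step can be sketched like this (illustration only: V(lambda) here is a crude single-gaussian stand-in for the CIE photopic curve, not the tabulated data a real renderer would use):

```python
import math

def photopic_V(nm):
    # Crude gaussian approximation of the CIE photopic V(lambda) curve,
    # peaking at 555 nm.  Real code would use the tabulated CIE data.
    return math.exp(-0.5 * ((nm - 555.0) / 45.0) ** 2)

def gaussian_spd(center_nm, fwhm_nm=10.0):
    # Gaussian SPD with the given full width at half maximum.
    sigma = fwhm_nm / (2.0 * math.sqrt(2.0 * math.log(2.0)))
    return lambda nm: math.exp(-0.5 * ((nm - center_nm) / sigma) ** 2)

def luminous_flux(spd, lo=380.0, hi=780.0, step=1.0):
    # Riemann sum of SPD weighted by V(lambda); scaling constants are
    # omitted because only the ratio matters for normalisation.
    total, nm = 0.0, lo
    while nm <= hi:
        total += spd(nm) * photopic_V(nm) * step
        nm += step
    return total

ref = luminous_flux(gaussian_spd(555.0))
scale_630 = ref / luminous_flux(gaussian_spd(630.0))
# scale_630 is well above 1: the 630 nm primary needs a large boost to
# match the luminous flux of the 555 nm reference.
```

Even with matched luminous flux, saturated narrow-band stimuli do not appear equally bright, which is consistent with the observation above.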
I am already using the new prebaked LUTs of the ACES 2.0 DRT for my YouTube HDR videos. I didn’t like the desaturation in the bright sky. So I started playing around with the Nuke script and the chroma compression settings. In doing so, I realized that chroma compression increases the saturation of certain red colors.
Is this behavior desired? I don’t think so. So I would like to suggest this little fix for the kernel (line 1438):
M *= Mcusp * sat_fact * scaleM;
// Clamp so chroma compression can only reduce colourfulness,
// never increase it beyond the input M (JMh.y).
if (M > JMh.y)
{
    M = JMh.y;
}
This was discussed in the past two meetings. It’s not desirable and it’s the opposite of what chroma compression is supposed to do. It’s supposed to protect purer colors from being overly compressed, by either not compressing them or compressing them less than less-saturated colors, but it is not supposed to expand the colorfulness.
For smoothness reasons we would prefer not to have ‘if’-style adjustments, as these can lead to problems with gradients and other transitions from one colour to the next. Ideally we would reformulate the compression so that it does not need this.
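One hedged sketch of what a smoother cap could look like (purely illustrative, not a proposed kernel change): replace the hard clamp with a polynomial smooth minimum, so the limit engages gradually instead of introducing a kink in gradients.

```python
def smooth_min(a, b, k):
    # Polynomial smooth minimum (after Inigo Quilez): behaves like
    # min(a, b) but blends over a width of k, avoiding the derivative
    # discontinuity a hard clamp introduces.
    h = max(k - abs(a - b), 0.0) / k
    return min(a, b) - h * h * k * 0.25

def limit_colorfulness(M_compressed, M_in, k=0.05):
    # Instead of the hard clamp `if (M > JMh.y) M = JMh.y;`, smoothly
    # cap the compressed colourfulness at the input value.  The blend
    # width scales with M_in; the 0.05 factor is an arbitrary example.
    return smooth_min(M_compressed, M_in, k * max(M_in, 1e-6))
```

Far below the limit the value passes through untouched; near the limit it rolls off smoothly, so neighbouring colours on either side of the threshold stay continuous.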