Rec709 Nuke setup calibration

Noticed that ACES rec709 viewer in Nuke is quite different from the default Nuke rec709 viewer, so I’m looking for the correct setup for calibrating the monitor for ACES 709? Using DisplayCAL calibration software.

Thanks for any help!

The default Rec.709 VLUT in Nuke is just a 1D LUT, which clips any values above 1. It is the Rec.709 camera encoding curve, so if your linear working space in Nuke uses Rec.709/sRGB primaries, you are creating the equivalent of pointing a simple Rec.709 video camera, with no highlight roll-off, at a scene containing the (relative) scene-referred linear values at the end of your comp. This is not an approach I would recommend, and I would go so far as to say that in my opinion the inclusion of that curve in Nuke was a mistake, which they now keep only for historical compatibility reasons.
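For reference, the camera encoding curve being described can be sketched like this (a minimal Python sketch of the BT.709 OETF as published; the exact implementation inside Nuke's VLUT is an assumption here):

```python
def bt709_oetf(v: float) -> float:
    """Encode a scene-linear value with the BT.709 OETF
    (linear segment near black, power segment above).
    The hard clip mirrors the behaviour described above:
    anything above 1.0 is simply lost, no highlight roll-off."""
    v = max(0.0, min(1.0, v))  # hard clip at 0..1
    if v < 0.018:
        return 4.5 * v
    return 1.099 * v ** 0.45 - 0.099
```

Note there is no S-curve here at all: it is a straight camera encoding, which is exactly why it looks so different from the ACES Output Transform.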

The ACES Rec.709 Output Transform is a much more sophisticated display transform, which includes a colour space mapping from the ACEScg working space to Rec.709, and tone mapping to expand mid-tone contrast and compress the shadows and highlights. The aim of this is to produce an image on a Rec.709/BT.1886 display which is a good perceptual match to the original scene.

The Rec.709 Output Transform is designed for a screen which conforms to BT.1886, so you should calibrate your screen to that (or to a pure gamma curve if you are of that school of thought.) You calibrate to the display standard, and the Output Transform targets that. You do not calibrate to the Output Transform.


Hi @Marty_Blumen,

It is absolutely expected and normal! The BT.709 OETF does not really do any tonemapping, it just encodes imagery so that it can be presented on an HDTV (or BT.1886 compliant display device).

Without diving into details, the ACES BT.709 ODT does more: its intent is to display scene-referred imagery in a faithful and pleasant (filmic) way on an HDTV, so it will tonemap your content with an S-curve to increase contrast and roll off the highlights, among other things.




I posted, only to see @nick had answered too! :slight_smile:

thanks so much! Calibrating to BT.1886 is very different from calibrating to rec709.

Whilst we are here - RV also has linear to rec709 built-in the Color menu. Is this the ‘bad’ rec709 like in Nuke?

Both standards are doing different things! The former (BT.1886) is dedicated to image formation for HDTV, while the latter (BT.709) is specific to image capture with the intent of being displayed on an HDTV. Put another way: a linear input is encoded (non-linearly) with BT.709 and decoded with BT.1886.
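The encode/decode asymmetry above is easy to demonstrate numerically. A minimal sketch, using the pure power-law forms only (ignoring BT.709's linear toe segment and BT.1886's black-level terms):

```python
def bt709_encode(v: float) -> float:
    return v ** 0.45   # camera-side OETF, power segment only

def bt1886_decode(v: float) -> float:
    return v ** 2.4    # display-side EOTF, zero black level assumed

# Chaining them is deliberately NOT a round trip: the end-to-end
# (system) gamma is 0.45 * 2.4 = 1.08, not 1.0.
mid_grey = 0.18
print(bt1886_decode(bt709_encode(mid_grey)))  # ≈ 0.157, darker than 0.18
```

That residual system gamma is intentional in the broadcast chain: it compensates for dim viewing environments.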

It is not the first time I have heard this question: why does Rec.709 (ACES) look different from Rec.709 in Nuke? It seems to confuse many artists… I guess we just need to spread the word: Rec.709 (ACES) includes an S-curve tone mapping that looks much better! :wink:

And because the Nuke default Rec.709 VLUT is 1D only, it requires the image data to already use Rec.709/sRGB primaries.
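The reason a 1D VLUT can't handle primaries is that it acts on each channel independently, so there is no crosstalk between R, G and B; a primaries conversion needs a 3x3 matrix (or a 3D LUT). A small sketch of the limitation:

```python
from typing import Callable, List

def apply_1d(lut: Callable[[float], float], rgb: List[float]) -> List[float]:
    """Apply a 1D LUT the way a per-channel VLUT does:
    each channel is processed independently, with no crosstalk."""
    return [lut(c) for c in rgb]

# A pure red stays pure red through any 1D curve, so the primaries
# (the chromaticity of "red") cannot be changed this way:
print(apply_1d(lambda c: c ** 0.45, [1.0, 0.0, 0.0]))  # [1.0, 0.0, 0.0]
```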


I did not know that! Thanks for the tip! @nick

This also applies to sRGB, and why it looks different from ACES sRGB. I assume there is tone mapping and different default primaries there too.

Correct! Any of the ACES ODTs takes OCES values as input, and those are generated with the ACES RRT.

True. But important to remember that the RRT and ODT appear as a single combined Output Transform in OCIO and many other ACES implementations. So in the default Nuke ACES 1.0.3 OCIO config, the expected input to any of the ACES VLUTs is ACEScg, i.e. AP1 linear, not OCES.

Yes, the takeaway being that the effective ACES implementations do not have to reproduce each single discrete step of the ACES block diagram or the reference implementation: the diagram/reference implementation can be sliced wherever it is convenient from an implementation standpoint.

I fully agree with Nick and Thomas.

The problem you’re having with Rec.709 in Nuke is the same for many application software offering their “domestic”, non-ACES colour science into a Rec.709 colour space in addition to the ACES one (via the Output Transform).
It all comes down to the fact that the Recommendation ITU-R BT.709 document does not specify a lot of things that should be in current colorimetry specs, therefore different vendors implemented their own flavours of the Rec.709 colour space. And, by the way, “Rec.” stands here for “recommendation”.
Nick and Thomas, for example, mentioned different highlight roll-off curves, but ambiguities exist around black-point compensation as well.
Most importantly, some confusion in how the original ITU-R spec specified the EOTF curve generated ill-posed transfer characteristics in modern digital implementations of Rec.709. Some use a pure gamma (i.e. power-law) curve, but even among those, a few products have traditionally applied a straight gamma of 2.4, whereas others have used gamma values anywhere in the range 2.2 through 2.6, and I’m also referring to calibrated-monitor manufacturers.

The above is one of the reasons why ITU-R later published a new recommendation to resolve this: BT.1886, which finally pins down the reference display EOTF. Personally, I like to consider BT.1886 just Rec.709 done right, with the ambiguities removed.
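For the curious, the reference EOTF that BT.1886 pins down can be sketched as follows (a minimal Python sketch of the Annex 1 formula; the default Lw/Lb values here are illustrative, not mandated):

```python
def bt1886_eotf(v: float, lw: float = 100.0, lb: float = 0.0) -> float:
    """BT.1886 reference EOTF: screen luminance in cd/m^2 for a
    normalized video signal v, given calibrated white (lw) and
    black (lb) luminances. With lb = 0 this collapses to a pure
    gamma-2.4 curve scaled by lw."""
    gamma = 2.4
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))
    return a * max(v + b, 0.0) ** gamma
```

The black-level terms (a, b) are what remove the old ambiguity: they define exactly how a real display with non-zero black should behave, instead of leaving each vendor to guess.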

All in all, ACES or non-ACES, interoperability is a key factor in colour science. So my suggestion is to stick with one consistent colour-management system throughout your workflow. You will get more consistent behaviour if your products have a BT.1886 setting, but as long as you live in an ACES-colour-managed world (without any tiny “escape”), things should be self-consistent by design.

Thanks. So this then leads to the next question: if we have a monitor calibrated to BT.1886, what would be the best way in Nuke to preview how an image will look to a normal computer Vimeo/YouTube audience on an sRGB display? i.e. the viewer LUT should always match the monitor calibration, AFAIK.

Apologies for bumping this thread but I would like to add my interpretation!

I agree the Nuke rec709 view LUT is an inappropriate way to display content intended for HDTV. The Nuke rec709 LUT applies a ~1.95 camera transfer function where a display transfer function belongs. That is wrong if you are working to the broadcast HDTV standard, which specifies a 2.4 display gamma.

This confused me for a while, until I read the original ITU BT.709 standard (2002) and the newer BT.1886 standard (2011). Great bedtime reading, btw :wink:. I can only assume the 1.95 LUT dates from CRT monitors, which natively have a display gamma of ~2.35.
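To put numbers on the mismatch described above (the 1.95 figure is the poster's estimate, used here purely for illustration):

```python
# Viewing through a ~1.95 encoding curve (the Nuke rec709 VLUT,
# roughly the inverse camera transfer function) on a monitor
# calibrated to the broadcast display gamma of 2.4 leaves a
# residual end-to-end gamma, i.e. the image is not displayed
# the way the broadcast chain intends.
encode_gamma = 1.0 / 1.95   # approximate power applied by the VLUT
display_gamma = 2.4         # BT.1886-calibrated monitor
system_gamma = encode_gamma * display_gamma
print(round(system_gamma, 2))  # → 1.23
```

So the same LUT viewed on a 2.4 display produces a noticeably darker, more contrasty result than on the ~2.35 CRTs it presumably targeted.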
