Path: ACEScg to regular sRGB? (Nuke 10.5v1, ACES 1.0.3)

First post, hopefully in the right category.

I’m a bit lost at the moment. I’m done with my first ACES project (a personal one to learn ACES) and now it’s time to get it into “real world” space. The end result is destined for iPad: sRGB 8-bit, as a PNG sequence.

Now when I just set the Write node’s file out to Output - sRGB, it looks nothing like what I see in Nuke when I view the ACEScg through the viewer set to sRGB (ACES).

When I do an OCIOColorSpace(Output - sRGB to ACEScg) > OCIODisplay (ACEScg to sRGB) the yellows desaturate quite drastically.

Neither option creates a usable output.
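For clarity, here’s a minimal Nuke Python sketch of the two chains I’m describing (colourspace and knob names are as I remember them from my ACES 1.0.3 setup, so double-check against your own nodes):

```python
# Sketch of the two chains above (Nuke Python). Names are from the
# ACES 1.0.3 OCIO config; adjust if your config lists them differently.
import nuke

read = nuke.nodes.Read(file='render.####.exr')
read['colorspace'].setValue('ACES - ACEScg')

# Chain 1: rely on the Write node's own output transform.
w1 = nuke.nodes.Write(file='out_chain1.####.png')
w1.setInput(0, read)
w1['colorspace'].setValue('Output - sRGB')

# Chain 2: OCIOColorSpace (Output - sRGB -> ACEScg) into an OCIODisplay
# set to the ACES display / sRGB view, written out after that.
cs = nuke.nodes.OCIOColorSpace()
cs.setInput(0, read)
cs['in_colorspace'].setValue('Output - sRGB')
cs['out_colorspace'].setValue('ACES - ACEScg')

disp = nuke.nodes.OCIODisplay()   # display: ACES, view: sRGB (set in the UI)
disp.setInput(0, cs)
```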

Using the COLOUR RENDITION CHART v1.7 I wrote a 16-bit TIFF for Resolve, which always produces wonky results, as there is no sRGB destination there, only Rec 709 (sRGB, sRGB, Rec 709). So those tests had to be done entirely within Rec 709. Still wonky, as it pumps up the luma like a lift operation.

I tried the same with 3DLutCreator, which turned out quite terrible. No idea what it matches against, but probably not the actual values. I tried it with my own chart (built for matching the output of different rendering engines), which has the X-Rite values in sRGB, and it even shifted that one.

Now my question is: can somebody help me figure out a proper sRGB output that can be used for the file out? Something that I, and others who use Nuke, can add to the config.ocio so we have a way to ship a product looking the same way it did in the viewer. Something similar to ICC workflows, where we have a few options for how to transform between profiles, would be great.

Thanks,
Frank

Nuke 9 or 10?
If 9, make sure you have “raw” selected on the Write node, otherwise you will be getting a second hit from Nuke’s 1D transforms
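If you’re setting things up from Python, that’s just the Write node’s raw knob (assuming the standard knob name, which I believe hasn’t changed between versions):

```python
# Tick "raw data" on a Write node so Nuke's built-in 1D output transform is
# bypassed and only the upstream OCIO transform touches the pixels.
import nuke

w = nuke.toNode('Write1')   # 'Write1' is a placeholder node name
w['raw'].setValue(True)
```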

Nuke 10.5v1, thanks. I tried it nonetheless to see what happens.
Turns out it’s a more direct way to get the same wrong gamma and desaturated yellow you get when using the sRGB ODT as an IDT. The chain is now OCIOColorSpace(ACEScg > Output - sRGB), with raw data checked on the Write node.

The same path within Nuke (no file out) looks like this then.

How is everyone else doing it? ACEScc(t) to Resolve and ACEScc to sRGB there? I need to keep the alpha intact though. I’d prefer a straight end-to-end ACES workflow, especially since I’m doing all my 32-bit still image work in Affinity Photo now, where I’ve also set up ACES 1.0.3.

I don’t understand why that would be. It is certainly not the case for me. If you select sRGB (ACES) as the VLUT, that is just applying an OCIODisplay node with display device set to ACES and view transform to sRGB. A write node with colorspace set to Output - sRGB is doing exactly the same thing. So the pixel values sent to the screen and the file should be exactly the same. And for me they are.
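If you want to convince yourself of that outside Nuke, you can compare the two transforms directly with the OCIO Python bindings (this uses the OCIO 1.x API, and the config path is obviously a placeholder):

```python
# Compare the viewer transform (display=ACES, view=sRGB) with the
# "Output - sRGB" colourspace conversion for an ACEScg test value.
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile('/path/to/aces_1.0.3/config.ocio')  # adjust path

# What the viewer applies when set to sRGB (ACES):
disp = OCIO.DisplayTransform()
disp.setInputColorSpaceName('ACES - ACEScg')   # Nuke's working space here
disp.setDisplay('ACES')
disp.setView('sRGB')
viewer_proc = config.getProcessor(disp)

# What a Write node with colorspace "Output - sRGB" applies:
write_proc = config.getProcessor('ACES - ACEScg', 'Output - sRGB')

pixel = [0.18, 0.18, 0.18]
print(viewer_proc.applyRGB(pixel))
print(write_proc.applyRGB(pixel))
# The two results should match to within LUT precision.
```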

That should give exactly the same result (and I have verified that it does, within the tolerances expected due to OCIO’s LUT-based implementation and Resolve’s shader-based one).

That is one of the issues you describe, and I can’t replicate it. The second one is a known limitation in ACES.

The yellow desaturation is inherent in the “look” of the RRT. There is no ACES RGB triple which will result in an sRGB triple of {1.0, 1.0, 0.0} after the RRT and sRGB ODT. An ACES 2065-1 triple of {2.31, 3.0, 0.0}, or ACEScg of {2.64, 3.35, 0.0}, results in sRGB {0.95, 0.95, 0.0}, which is a perfectly reasonable bright saturated yellow. So if you are creating sRGB display-referred graphics and limit your yellows to {0.95, 0.95, 0.0} (basically limit all your sRGB colours to the inside of the distorted unit cube shown above), they will transform through the reverse/forward RRT intact.
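Those numbers are easy to check against your own config with the OCIO 1.x Python bindings (placeholder path, and expect small differences from the LUT-based implementation):

```python
# Push bright saturated yellows through the RRT + sRGB ODT and see what
# comes out the other side.
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile('/path/to/aces_1.0.3/config.ocio')  # adjust path
proc = config.getProcessor('ACES - ACES2065-1', 'Output - sRGB')

print(proc.applyRGB([2.31, 3.0, 0.0]))   # ~[0.95, 0.95, 0.0] display-referred sRGB
print(proc.applyRGB([16.0, 16.0, 0.0]))  # per the point above, nothing lands on [1, 1, 0]
```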

So thinking about it, what would be useful would be a gamut-mapping sRGB-to-sRGB LUT, which limited its output to the bounds of the output of the sRGB ODT while preserving visual appearance as much as possible. If such a LUT were applied to the sRGB image before ODT/RRT inversion, the resulting image could be used in ACES without suffering desaturation.
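A rough sketch of the test behind that idea (again OCIO 1.x Python, placeholder config path): round-trip sRGB values through the inverse and then forward Output - sRGB transform and see how far each one moves. Colours that survive the round trip are “safe”; a gamut mapping LUT would pull the others inside that boundary before inversion rather than letting the RRT desaturate them.

```python
# Round-trip sRGB values through the inverse and forward Output - sRGB
# transform to see which ones survive the reverse/forward RRT intact.
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile('/path/to/aces_1.0.3/config.ocio')  # adjust path
to_aces = config.getProcessor('Output - sRGB', 'ACES - ACES2065-1')  # inverse ODT/RRT
to_srgb = config.getProcessor('ACES - ACES2065-1', 'Output - sRGB')  # forward again

for rgb in ([1.0, 1.0, 0.0], [0.95, 0.95, 0.0], [1.0, 0.0, 0.0], [0.5, 0.5, 0.5]):
    out = to_srgb.applyRGB(to_aces.applyRGB(list(rgb)))
    print(rgb, '->', [round(v, 3) for v in out])
# {1.0, 1.0, 0.0} comes back visibly desaturated; {0.95, 0.95, 0.0} survives.
```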

Of course if working on a TVC, you may get into debates with agency/client types who have brand guidelines defining what colour the logo should be. But that’s no different to the situation that has always existed where you have to say “sorry that’s not a broadcast safe colour”. In fact, if you limit your sRGB palette to colours which would have passed a traditional “broadcast safe” check, graphics using those colours are likely to pass unchanged through the reverse/forward RRT process.

I tried it in Resolve and get the desaturated yellow/orange range too. It also pulls the turquoise towards the blues, just like Nuke does. Affinity Photo set up with the same OCIO config does the same. So, not Nuke specific. As for the viewer, I get the same result as in the viewer if I set it to RAW and use an OCIODisplay node with sRGB out. But once I write out to sRGB, that’s where it breaks down. Same when I load the EXR sequence as ACES into Resolve (ACEScc config) and set the viewer to sRGB: it desaturates there too.

What I was hoping for is a workflow/workaround that does what perceptual rendering intent achieves with ICC profiles. This is often done through dithering when the image is changed to a colour space that doesn’t have a close enough match.

If I got you right it’s: create a regular LUT that goes from ACEScg to sRGB (or just take the sRGB ICC profile and convert it to a LUT), apply that as a Vectorfield and then go back into ACES? Or put it at the end, before the Write node?
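If that’s the idea, I guess the mechanical part would be something like this: sample the transform with the OCIO Python bindings and write a .cube I can load into a Vectorfield or OCIOFileTransform node (OCIO 1.x API, placeholder config path, and I realise a proper version would need a log/shaper space first, since ACEScg goes well above 1.0):

```python
# Rough sketch: sample the ACEScg -> Output - sRGB transform on a 33x33x33
# grid and write a Resolve/Iridas style .cube file. Only the 0-1 range is
# sampled here, so scene-linear values above 1.0 would clip without a shaper.
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile('/path/to/aces_1.0.3/config.ocio')  # adjust path
proc = config.getProcessor('ACES - ACEScg', 'Output - sRGB')

N = 33
with open('acescg_to_output_srgb.cube', 'w') as f:
    f.write('LUT_3D_SIZE %d\n' % N)
    for b in range(N):            # .cube data order: red varies fastest
        for g in range(N):
            for r in range(N):
                rgb = [r / (N - 1.0), g / (N - 1.0), b / (N - 1.0)]
                out = proc.applyRGB(rgb)
                f.write('%.6f %.6f %.6f\n' % (out[0], out[1], out[2]))
```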

BTW I’m getting a little bit closer by converting the ACEScg to ACES, then applying a Grade node and a Saturation node before file out. I have to eyeball it for every shot but it’s visually close.

As for the TVC, brand guidelines should have that in their style guide / brand bible, just as they have Pantone-to-CMYK and Pantone-to-RGB. I haven’t yet seen a Pantone-to-broadcast-safe in any of them, though.

I’m giving the LUT a shot soon. I reverted my Nuke setup to nuke_default for now as I have to get some work done. Will experiment with ACES more when I’ve delivered.

Thanks Nick.

No. I mean a LUT you apply to the original sRGB which creates a modified “broadcast safe” sRGB which doesn’t contain colours which will get distorted by the ACES sRGB IDT.

Hmm, that’s the thing: There is no original sRGB image per se in this case. Or any cases in my foreseeable future.

My Workflow:

  • Base image comes out of Modo as plain EXR sequence
  • Sequence goes into Nuke, interpreted as ACEScg
  • Elements created in Nuke get composited in
  • A couple of days later it comes out of there to be used in an sRGB environment (app, iBook, web, game, still print). Long sequences as MP4 (4:2:0 via Resolve so it still looks OK), short ones as sRGB animated PNG or JPEG sequence / sprite sheets.

My apologies then. I thought you were describing the known issue where you have display referred sRGB graphics and you need them to appear in your final output as they did in e.g. Photoshop. Using Output - sRGB as an input transform accomplishes this for everything except certain highly saturated colours, particularly yellow. Because you referred to applying an OCIOColorSpace transform from Output - sRGB, I assumed that was what you were doing.

So you are saying that rendering to a TIFF in Nuke with colorspace set to Output - sRGB does not give the same result as rendering to an ACES 2065-1 EXR and then rendering that to an sRGB TIFF in Resolve in ACEScc(t) mode? I can’t replicate that. They match for me.

You refer to rendering to MP4 in Resolve. That does add an extra layer of complexity. You may find that you should select the Rec.709 ODT for MP4, not sRGB, even if the result is intended for viewing on a tablet/computer. Some (but not all) players assume that MP4s are BT.1886, and therefore apply a gamma adjustment on playback to make the image appear on an sRGB display as it would have done on a broadcast monitor. Compare e.g. DPXs from Nuke and Resolve to remove this factor.
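You can get a feel for the size of that effect with nothing but the transfer functions (plain Python; the 2.4 power is a simplification of BT.1886, and the player behaviour itself is the assumption here):

```python
# Compare how an sRGB-encoded code value is interpreted by an sRGB decode
# versus a player that assumes BT.1886 (approximated as a pure 2.4 power).
def srgb_decode(v):
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def bt1886_decode(v):
    return v ** 2.4   # simplified BT.1886 EOTF with black at zero

for code in (0.25, 0.5, 0.75):
    print(code, round(srgb_decode(code), 4), round(bt1886_decode(code), 4))
# Mid-tones come out darker under the 2.4 power assumption, which is the
# gamma shift you sometimes see on MP4 playback.
```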

No worries.

No no… when I set everything up in ACES the behaviour is always the same. In Resolve’s ACEScc mode it’s even more directly apparent. As soon as I set the VLUT to sRGB I see the desaturation right away and can fix it manually by eyeballing it. Rendering it to ProRes444 then keeps the manually corrected look.

What I was referring to about rendering MP4s in Resolve was just that saturated colours sometimes come out better going to MP4 (MPEG-4 .mov, repackaged with FFmpeg) from Resolve’s own RGB space than from Apple’s Compressor. Same colour space, yet Compressor sometimes washes the colours out where Resolve doesn’t. That’s what I meant by “(4:2:0 via Resolve so it still looks OK)”.

As this all behaves exactly the same in all three apps (Nuke, Resolve, Affinity) I think the right way for me would be to learn how to create either a custom RRT for good sRGB output or another sRGB transform with a corrective matrix added. Maybe fork the GitHub project if that ends up working well enough.

I looked some more into BT.1886, but that’s really marginal; even Compressor does the colour conversions reasonably well. Ideally the conversion to what I call “audience space” should happen there, if only it supported OCIO. Right now the errors I see between JPEG/PNG sequences and the MP4 clips are mostly related to 4:4:4 to 4:2:0 differences. The desaturation and its inherent colour shifts are equally present in both.

Then I am afraid I am not following what it is that is happening for you, and what it is you expect to happen, so I am not able to help.

Are you just talking about rendered files not playing back in some other app in a way which matches the image data you put into them? In that case it could be a player colour management issue, rather than an ACES one.

[quote=“frankjonen, post:1, topic:858”]When I do an OCIOColorSpace(Output - sRGB to ACEScg) > OCIODisplay (ACEScg to sRGB) the yellows desaturate quite drastically.[/quote]

Desaturate relative to what? What is your reference for what you’re expecting it to look like? You said earlier that “there is no original sRGB”, but what you are doing here is treating the input as sRGB.

Basically what I’m expecting is that I get what I see, which kinda is the point of colour management to begin with.

In the viewer I see the colours in ACES (Nuke, Resolve, Affinity), and once I go ACES > sRGB I lose significant saturation in the yellows. A rendered frame that I then open elsewhere (Preview, Affinity in ICC mode, browsers, any colour-managed app context, etc.) has very little to do with how the image looks in the viewer (Nuke, Resolve, Affinity).

I can directly reproduce it every single time, without fail. I go from ACES to sRGB using the given path with the sRGB ODT and it’s not even remotely close. When I transform from one ICC profile to another with a perceptual rendering intent I’m at least somewhat in the ballpark. But this? It’s like an inkjet printer with a dried-out yellow chamber.

When I said “there is no original sRGB” and yet treat it like there is one, it’s simply because I can stay in ACES during the work, but eventually it has to go out into the real world, onto real-world devices. That’s where ACES still has a brick wall without a door. There is no path from high end to consumer. You can create the content, but you can’t get it out if you value how your product looks. That’s the problem I need to solve at the moment.

When I use the nuke_default OCIO config for example, this isn’t an issue. What I see in the player is what I get in the sRGB deliverables. It just works. This is kinda what I’m expecting from ACES, just better.

See this is why I’m still confused. You don’t “see the colours in ACES”. You only ever look at ACES colours through a view transform, i.e. the RRT/ODT combination. And if you render through the same view transform the result should be the same thing you were viewing. I repeat, if things are set up correctly, the transform applied when you set the viewer to “sRGB (ACES)” and the one applied when you set the write node to “Output - sRGB” are exactly the same transform.

I don’t know if I understand enough yet to be helpful, but I wanted to mention that by default my install of Nuke 10.5v1 with the OCIO aces_1.0.1 config does not have any viewer LUT called sRGB (ACES), only sRGB D60 SIM (ACES).
Could it be that this specific viewer LUT is custom and not matching the output one? (Like in my custom config last week, where we had a rec709 (ACES) which was actually custom and not doing the proper thing.)

I believe the non-“D60 sim” version of the sRGB ODT was added to the OCIO config in ACES 1.0.3, which Frank says he is using.

Never mind. I was reading through the thread trying to find his version, didn’t see it anywhere, posted, and then realized it was in the title.

This is how you should set up your Nuke in order to work in ACES properly. The image is read in as ACES2065-1, the viewer is set to sRGB, and so is the Write node. The output of the Write node will give the same result as the viewer.
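In script form that setup is roughly the following (colourspace names from the ACES 1.0.3 OCIO config shipped with Nuke, and the viewer process name as it appears on my install, so verify both):

```python
# Minimal ACES setup sketch in Nuke: read as ACES2065-1, view through
# sRGB (ACES), write through Output - sRGB.
import nuke

read = nuke.nodes.Read(file='plate.####.exr')
read['colorspace'].setValue('ACES - ACES2065-1')

write = nuke.nodes.Write(file='out.####.png')
write.setInput(0, read)
write['colorspace'].setValue('Output - sRGB')

# Viewer: set the viewer process to the matching view transform
# (assumes a Viewer window is open).
viewer = nuke.activeViewer().node()
viewer['viewerProcess'].setValue('sRGB (ACES)')
```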

Edit: I can see the yellow patch lost some saturation; I guess that is consistent with what Frank sees.


(Left: sRGB created by Nuke’s color management from DPX, Middle: Nuke Viewer in sRGB (ACES), Right: Write out using Output - sRGB)

That is the effect of the reverse/forward RRT I mentioned previously in this thread. It only affects particular saturated colours near the boundary of the sRGB unit cube, most noticeably yellows near {1.0, 1.0, 0.0}.

Frank has said that this is not the effect he is talking about. Notice the yellow patch is already desaturated in the Nuke viewer compared to the original, and the rendered image matches the Nuke viewer.

Indeed that’s what I’m seeing in the output, just not in the viewer.

To recap: my input is an EXR sequence from Modo at 32-bit float, read as ACEScg since it’s CG, which, if I’ve got this right, is correct usage.

My viewer is set to sRGB (ACES 1.0.3 config)

Throughout the comp I’m staying in ACEScg as all further inputs are read into this from the Modo renders. I even made the lettering in 32f and saved it as ACEScg (Affinity). So everything is the same.

When I set Nuke’s viewer to RAW and replicate the setup with an OCIODisplay node set to Output - sRGB, I get exactly the same result as when I set the viewer to sRGB itself. The yellows aren’t desaturated there. But when I put a Write node right after that, the output is desaturated. This output then looks like what I see when I set a Resolve session to ACEScc, load the exported ACES sequence (exported as ACES2065-1) and set it to view as sRGB.

To verify it’s not a Nuke issue I replicated the same in Affinity Photo. It looks normal with the file set to ACEScg and viewed as sRGB. Once I convert to sRGB (same as export to file), I get the desaturation.

The odd thing, however, is: when I fix this in Resolve and export from there, the colours look as they did in the viewer, even after I fix the saturation issue. So I don’t think it’s that sRGB can’t represent it, but that there are different approaches to going about it.

I’ve even considered making a couple of screenshots of the CMS Test Pattern at 100%, stitching them together and generating a LUT from the difference between the write output and the screenshots. But that’s a bit of a last-resort effort.