A fully red star on a fully cyan background... in ACES



I came across the blog from @Troy_James_Sobotka and wanted to try out question #9 for myself. (https://hg2dc.com/2019/07/21/question-9/) “Why the F*ck is Linear Light Important for a Digital Artist?”

As I mostly work in ACES, I tried out this example in ACES as well, and I got strange results.

But first, I started out in Affinity Photo 16-bit sRGB. I can replicate the result from the blog as expected.

The moment I convert the document to 32-bit float (sRGB linear), the problems go away.
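A minimal Python sketch of the arithmetic behind this (my own illustration, not Affinity’s actual code): blending red [1, 0, 0] and cyan [0, 1, 1] at 50%, which is what a blur does at the edge, gives a much darker result when done on sRGB-encoded values than when done in linear light.

```python
def srgb_decode(v):
    """sRGB display value -> linear light (piecewise curve from IEC 61966-2-1)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def srgb_encode(v):
    """Linear light -> sRGB display value."""
    return v * 12.92 if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

# Blending the encoded values directly (the 16-bit sRGB case).
# Per channel, each pixel mixes a 1.0 with a 0.0:
encoded_mix = 0.5 * 1.0 + 0.5 * 0.0       # = 0.5 in display values
linear_of_mix = srgb_decode(encoded_mix)  # ~0.214 in linear light: the dark fringe

# Blending in linear light first (the 32-bit float linear case):
linear_mix = 0.5 * 1.0 + 0.5 * 0.0        # = 0.5 in linear light
display_mix = srgb_encode(linear_mix)     # ~0.735 on the display: no dark dip

print(linear_of_mix, display_mix)
```

The encoded-value blend lands at roughly 0.214 linear instead of 0.5, which is exactly the dark band in the transition.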

But by assigning the ACEScg ICC profile to the document (i.e. changing the working space to ACEScg), with the red star at RGB 1/0/0, the cyan background at RGB 0/1/1, and the sRGB view transform, this happens:

I am using the official OCIO/ACES 1.2 config.
Note: I am aware that Affinity Photo has a bug, where it cannot save an image properly with an ODT selected. I normally use it only with EXR files.

I am sure that I can “trust” Nuke more, so next, I moved over to Nuke 12.2v4 non-commercial with ACES 1.1.

I replicated the steps with only Nuke nodes and compared the Nuke-Default pipeline vs. the Nuke-OCIO pipeline side by side. For this I set the view transform to ACES (RAW), so that I could apply the view transform myself with nodes. The tree looks like this:

And the results look like this (left: Nuke-Default, right: ACES (sRGB)):

I can fix the problem by using linear-sRGB primaries and converting these to ACEScg, hence lowering the values. The tree now looks like this (I just added two OCIO color transforms from Lin_sRGB to ACEScg, one after each Constant node):
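For reference, a small Python sketch of what those OCIO transforms amount to (matrix values rounded, derived with a Bradford D65→D60 chromatic adaptation; my own illustration, not the config’s actual code):

```python
# Rounded 3x3 matrix taking linear sRGB/Rec.709 values to ACEScg (AP1).
LIN_SRGB_TO_ACESCG = [
    [0.6131, 0.3395, 0.0474],
    [0.0702, 0.9164, 0.0134],
    [0.0206, 0.1096, 0.8698],
]

def mat_mul(m, rgb):
    return [sum(m[i][j] * rgb[j] for j in range(3)) for i in range(3)]

red  = mat_mul(LIN_SRGB_TO_ACESCG, [1.0, 0.0, 0.0])  # ~[0.613, 0.070, 0.021]
cyan = mat_mul(LIN_SRGB_TO_ACESCG, [0.0, 1.0, 1.0])  # ~[0.387, 0.930, 0.979]
print(red, cyan)
```

These are the “lowered” values: the sRGB primaries are less saturated than the ACEScg ones, so they land well inside the AP1 gamut, and red plus cyan still sums to white.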

Left is again Nuke-Default vs. right Nuke-ACES 1.1

I must say the first part of the tests in Affinity Photo I can kind of understand. Kind of.
But the Nuke tests show something different, I think.
I guess it has to do with the ODT, but I don’t really get why.

At last, I tried two more things. First, I replicated this test in Resolve. The result looks even uglier with maximum ACEScg RGB values, because the blur of the image is applied in ACEScct instead of ACEScg. I checked this again in Nuke; that is now the left image of the side-by-side. For the right side, I forced the ACEScg comp to linear-sRGB values first (which doesn’t make any sense, of course, because I start with values over 1 and some negative ones) and then simulated the Nuke-Default pipeline. Now I get on the right side the same result as above with Affinity Photo in ACEScg (which does not seem to work properly?).

I am confused.

I found out that the moment I choose an ODT with a bigger gamut, like P3 or Rec.2020, the problems seem to get “smaller”.

And I know that I am using a usually “display-referred” motion-graphics example in a scene-referred Nuke environment, which I assume does not happen often. But I guess in rare cases some of these problems could occur in a 3D rendering as well. I am using extreme values for sure, but they are “normal” ACEScg values.

I hope this is easy to follow. I can upload the files if needed.

Best regards


It’s an interesting effect, but not unexpected when you think about it. The mid point between [1, 0, 0] and [0, 1, 1] is [0.5, 0.5, 0.5] so is achromatic. Depending on the different curves you are applying before and after the blur, the achromatic region will end up larger or smaller, and dimmer or brighter.
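A tiny sketch of that mid-point arithmetic (illustration only): every pixel in the blurred transition is a weighted mix of the two colors, and at the 50/50 point the mix is exactly achromatic.

```python
red, cyan = [1.0, 0.0, 0.0], [0.0, 1.0, 1.0]

# A blur produces convex combinations t*red + (1-t)*cyan across the edge:
def mix(t):
    return [t * r + (1 - t) * c for r, c in zip(red, cyan)]

print(mix(0.5))   # [0.5, 0.5, 0.5]: achromatic grey at the centre of the edge
print(mix(0.25))  # [0.25, 0.75, 0.75]: desaturated cyan on the way there
```

Whatever non-linear curves sit before and after the blur only move that grey region around and make it brighter or dimmer; they cannot remove it.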

The version where you convert sRGB linear to ACEScg is just another example of the way a legacy sRGB VLUT maps linear 1.0 to display 1.0, whereas ACES makes room for highlights, so maps 1.0 input to a lower display value. Hence the cyan background ending up dimmer compared to legacy sRGB’s [0, 1, 1].

And blurring in ACEScct is not energy preserving. Hence the dark band. That was the point of @Troy_James_Sobotka’s article – if you want to simulate the behaviour of light, you need to work in linear.
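To put a number on that (a sketch of my own, using the ACEScct encoding from S-2016-001): averaging values in log space gives the geometric rather than the arithmetic mean, so the blurred transition dips darker than an energy-preserving blur would.

```python
import math

def acescct_encode(x):
    """Linear AP1 -> ACEScct (linear segment below 0.0078125, per S-2016-001)."""
    if x <= 0.0078125:
        return 10.5402377416545 * x + 0.0729055341958355
    return (math.log2(x) + 9.72) / 17.52

def acescct_decode(y):
    """ACEScct -> linear AP1 (inverse of the above)."""
    if y <= 0.155251141552511:
        return (y - 0.0729055341958355) / 10.5402377416545
    return 2.0 ** (y * 17.52 - 9.72)

a, b = 1.0, 0.01
linear_mean = (a + b) / 2  # 0.505: what a blur in linear (ACEScg) gives

# Averaging the ACEScct-encoded values instead, then decoding:
log_mean = acescct_decode((acescct_encode(a) + acescct_encode(b)) / 2)
print(linear_mean, log_mean)  # 0.505 vs 0.1: the log-space blur is much darker
```

The log-space result is the geometric mean (sqrt(1.0 × 0.01) = 0.1), which is where the dark band comes from.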

As for the version where you don’t convert sRGB to ACEScg, you are then using a fully saturated ACEScg red primary and cyan secondary. Putting fully saturated ACEScg primaries through the current Output Transform has known issues, and is something the Output Transform VWG is looking into (see @ChrisBrejon’s posts on the subject). Because the ACEScg primaries are out of gamut for sRGB, one channel hits clipping before the end of the blurred transition, producing that ugly hard edge within the blur. That is why using a wider gamut Output Transform mitigates the issue – the Rec.2020 primaries are almost the same as those of ACEScg.
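For illustration (rounded matrix values, my own sketch, not the actual Output Transform code): converting the ACEScg red primary to linear Rec.709/sRGB shows the out-of-gamut channels directly.

```python
# Rounded inverse of the Bradford-adapted linear sRGB -> ACEScg matrix.
ACESCG_TO_LIN_SRGB = [
    [ 1.7050, -0.6218, -0.0833],
    [-0.1303,  1.1408, -0.0105],
    [-0.0240, -0.1290,  1.1530],
]

def mat_mul(m, rgb):
    return [sum(m[i][j] * rgb[j] for j in range(3)) for i in range(3)]

red_709 = mat_mul(ACESCG_TO_LIN_SRGB, [1.0, 0.0, 0.0])
print(red_709)  # ~[1.71, -0.13, -0.02]: red over 1.0, green and blue negative
```

The red channel is above 1.0 and the other two are negative, so somewhere in the blurred transition one channel clamps before the others, producing the hard edge inside the blur.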

Here is my Nuke script if anybody wants to look (VLUT “None”, OCIO ACES 1.1):
star_blur.nk (3.7 KB)


Thanks Nick for your good explanations as always.

I am slowly understanding why this happens, but it simply shouldn’t happen. I am using the “system” as intended. Maybe not in a way a camera could produce such values, but in 3D, as @ChrisBrejon is setting up his tests now, eventually yes. For a very colorful cartoon, for example :slight_smile:

I am looking forward to the work in the Output Transform VWG.

Best regards

Hi, @nick @ChrisBrejon,

I have two more questions after giving the topic some more thoughts.

1.) Wouldn’t an “ideal” gamut mapping operation on this strange image, from ACEScg (red 1/0/0 and cyan 0/1/1) to linear sRGB, also result in red 1/0/0 and cyan 0/1/1?

2.) Does an example like this one even matter? If I understand it correctly, the red star is not plausible, because the red primary could not even be made with a red laser.
But at least the cyan color is real.

Hey @TooDee ,

sorry for the late reply !

  1. About question 1, I asked myself the exact same question, like three days ago ! Would a gamut mapping operator in an “ideal world” map the ACEScg primaries onto the display primaries ? I am afraid I don’t have an answer for that one…

  2. I also sometimes ask myself the same thing about my renders. Does it even matter to be able to light with ACEScg primaries ? Some supervisors seem to think so… About the star test itself, may I ask you : what are you trying to achieve ? Would you need ACES for such cases ? It seems to me that someone who’d want to put a red star on top of a cyan background would do it in Photoshop. Then of course, the blur issue would appear with the dark outline… And we’re back to square one.

I did try this test with the nuke-default color management setup (linear/sRGB) :

It looks okay to me. What do you think ? It’s just that I have a hard time wrapping my head around ACES and 2d medium/pipelines to be honest.

Sorry I cannot shine any more light on this,


Hi @ChrisBrejon,

About 1. (Your answer makes me think again :slight_smile: )
From a “numbers” standpoint it might make sense that the values before and after the gamut mapping are the same. But the “positions” of the two colors are at different places on the CIE diagram, so at least this means that they are two different colors. It is hard to separate numbers and colors! :slight_smile:
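A rough sketch of what I mean by “position” (rounded AP1→XYZ matrix values, my own illustration): the xy chromaticity of the ACEScg red primary is not the same point as the Rec.709 red primary, even though both are encoded as 1/0/0 in their own space.

```python
# Rounded AP1 (ACEScg) -> CIE XYZ matrix (D60 white).
AP1_TO_XYZ = [
    [ 0.6625, 0.1340, 0.1562],
    [ 0.2722, 0.6741, 0.0537],
    [-0.0056, 0.0041, 1.0103],
]

# XYZ of the AP1 red primary [1, 0, 0] is just the first matrix column:
X, Y, Z = (row[0] for row in AP1_TO_XYZ)
s = X + Y + Z
print(round(X / s, 3), round(Y / s, 3))  # ~(0.713, 0.293): ACEScg red
print(0.640, 0.330)                      # Rec.709 red primary, per BT.709
```

Same 1/0/0 code values, two different chromaticities: which is exactly why mapping one onto the other is a real decision and not a no-op.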

about 2.
Here I can give two answers.
First, I was reading the blog from @Troy_James_Sobotka, found his example of the red star on cyan, and had to try it out myself. I started with Procreate/Affinity Photo on the iPad; they failed as described, but I somehow thought “come on, these are far more modern apps, they should have fixed that by now”.
Next, I knew it works in Nuke the “right way”, but as my Nuke is set to ACES by default, I was amazed that it also failed, although the comp math is right. Therefore this issue raised my interest again in the ODT VWG, as it had before in the Gamut Mapping VWG efforts last year.

And secondly, sure, you are right. This is motion graphics, so I would do that anyway in Affinity/PS or After Effects. But the same way you are testing “pure laser” primaries for renderings, imagine there were a new version of AE with ACES support. Or someone has an ACES project in Resolve and wants to add an ugly red star on a cyan background in a title or something.
I bet that I, too, would choose 100% red, 0% green and 0% blue for my star, and 0% red, 100% green and blue for my cyan background, because I have always done it like this. And then I would run into problems again.

What have I learned so far from this whole example? There is more than one place in the image-processing pipeline that has to work properly to give us a good result. And I was not aware that one of them could be the view transform.

Just a funny side note: (which is not ACES related)
In AE, in a standard 16-bit comp, the “star” fails too until you check “Blend Colors Using 1.0 Gamma” in the project’s working-space settings.
But Motion from Apple, which is now almost 10 years old, does not fail this test in a standard project type. It works by default, same as FCPX, as both tools linearize the assets on input.
As soon as I use an HDR project type, the overall result looks darker (different tone mapping) and the star breaks too (I found out that Motion/FCPX use a linear Rec.2020 working space when a project is set to HDR). But less so, as @nick suggested. I am viewing this example in Motion on a 2020 iMac 27", which is P3 and supports EDR.

Hi @ChrisBrejon and @jedsmith,

I could not help it and had to try out what happens if I use the “NaiveDisplayTransform7” on the red star too. I am very thankful for the gizmo and for being able to peek inside. It helps me understand and follow along with the ODT discussions.

Nuke-Default: sRGB

Nuke-ACES (Rec.709)

Nuke-ACES naiveDRT (default)

Nuke-ACES (Rec.709) - but with reduced sRGB primaries instead of ACEScg primaries


I would say not, as the ACEScg primaries don’t have the same perceptual hue as Rec.709 primaries. And certainly Rec.709 primaries have a different perceptual hue to P3 primaries. So a gamut mapper which mapped the ACEScg red primary to Rec.709 red (slightly orange) or P3 red (deep “blood red”) for those two display standards would not produce images which looked similar.


thanks ! great answer Nick ! makes sense indeed ! I guess it all comes back to the display limits.
If we overcompress we will get gamut clipping. If we undercompress, we may not maximize the image density coming out of the display. Interesting questions !


Interesting that Lars Borg just said the exact same thing during the meeting :

And where do I put for example, BT. 2020 green ? Do I make it slightly cyan in 709 space, or people want to map it to the green in the 709 space, which I think is wrong, right ?

Great minds think alike. :wink:



I hope this is my last question about this topic. But I really just don’t understand what I am seeing in the following screenshots. I think I understand why the first screenshot has a fringe that is actually different from the fringe I would see in Photoshop, for example.


Just comped, no view transform.

Just comped, blurred, no view transform.

Comped, blurred, Nuke-Default sRGB view transform.

Comped, Nuke-Default sRGB view transform.

Nuke-ACES (Rec.709)

Just comped, no view transform. (Same as image 1)

Just comped, blurred, no view transform. (Same as image 2)

Comped, blurred, Nuke ACES (Rec.709) view transform.

Comped, Nuke ACES (Rec.709) view transform.

My question is: to me, images 4 and 8 look similar at the edges, coming from the same comped result,
but what is causing the artifacts in image 7? I know it is the ODT, but at which stage?

Thanks for the help.