I have a recurring problem with graphics supplied by clients in an ACES workflow.
For example, I might get a packshot with a 100% white sRGB background and the product in front of it.
If I bring that in with an sRGB IDT and view it through the RRT/ODT, everything looks flat and wrong, and white is of course no longer white. I would need the original display-referred look.
You can grade it a little towards its original look, but I never really get there.
The same happens with logos and any other non-ACES graphics that have to hit a certain, company-defined look.
Is there a standard method for handling that?
I was wondering whether an inverse RRT/ODT would lead to the correct look?
I saw that there are indeed inverses of the RRT and ODT on GitHub, but I can't find an implementation of them in OpenColorIO or in Resolve.
I'm not in front of my Resolve system at the moment, but I believe Resolve's Rec.709 IDT is the inverse of the RRT plus the Rec.709 ODT, so an image should pass through unchanged.
For a short moment I thought that could even help with my Cinema 4D problem discussed in another thread. But when passing through the inverses, everything above 1 gets clipped, of course, since we are now coming from a display-referred file. Even if that EXR file has values above 1, they get clipped this way, which is correct.
So it is a perfect solution for the problem I asked about. It would really be great if that were included in OpenColorIO, so I could avoid the detour through Resolve when compositing with display-referred footage in After Effects.
While the transform is not completely invertible, the pipeline passes the image back through the forward transform after inversion, so the total operation ends up very close to transparent. If you pass a set of colour bars through this process, they will not hit the targets perfectly, and the PLUGE will be clamped off. But for normal images, the error is not visible.
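The same check can be scripted outside of Resolve with the OCIO Python bindings. This is only a sketch: it assumes the OCIO v2 API and an ACES 1.x config with the "Output - Rec.709" / "ACES - ACES2065-1" colour space names.

```python
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile("aces_1.0.3/config.ocio")  # example path

# Display-referred -> scene-referred (inverse ODT + inverse RRT), then back again.
to_aces = config.getProcessor("Output - Rec.709", "ACES - ACES2065-1").getDefaultCPUProcessor()
to_rec709 = config.getProcessor("ACES - ACES2065-1", "Output - Rec.709").getDefaultCPUProcessor()

# A 75% colour-bar value and a sub-black PLUGE value.
for rgb in ([0.75, 0.75, 0.0], [-0.02, -0.02, -0.02]):
    print(rgb, "->", to_rec709.applyRGB(to_aces.applyRGB(list(rgb))))

# Expected from the description above: the bar value does not land exactly back
# on target but comes out close, while the sub-black PLUGE value is clamped off.
```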
Yeah! That is why I tagged myself as pedantic. I thought it would be important for the record to make that note on invertibility. Speaking of which, here is the difference between the AMPAS golden images syntheticChart.01.exr and syntheticChart.01_from_InvRRT.exr:
You can do it with OCIO. Just use an OCIOColorSpace node, and choose Output - sRGB as the input colour space.
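Outside of Nuke the node's effect can be reproduced with a processor; a minimal sketch, assuming the ACES 1.x colour space names and an $OCIO environment variable pointing at the config:

```python
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromEnv()  # assumes $OCIO points at an ACES 1.x config

# Equivalent of an OCIOColorSpace node with "Output - sRGB" as the input space:
# the display-referred sRGB image goes backwards through the ODT and RRT into
# the scene-referred working space.
cpu = config.getProcessor("Output - sRGB", "ACES - ACEScg").getDefaultCPUProcessor()
print(cpu.applyRGB([1.0, 1.0, 1.0]))  # display white maps to a value well above 1.0
```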
This screen grab shows how a small amount of the extremities of the RGB cube is clipped by the backwards/forwards transform, but frankly you are unlikely to want to have such extremely saturated colours in your original sRGB image.
I checked with OCIO in After Effects and that works, too. Strange, though: why do the output transforms work as input transforms? Probably I haven't understood how OCIO works yet.
For my custom-made test scene that I checked yesterday with Resolve, this works well. But since Nick mentioned possible problems with yellows, I took a packshot with a lot of bright yellow text in it, and that comes out totally wrong: the yellows are much too dark, and even in the white parts some dark pixels from the render show up that are not visible in sRGB. So, like Ben said: it can be a big help, but obviously it cannot really be inverted without loss.
Some of the transforms can be inverted automatically by OCIO (MatrixTransform, 1D FileTransforms, etc.); for others you can specify the inverse explicitly, which is the case for some of the OCIO ACES colourspaces, so they work bi-directionally.
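Roughly, with the OCIO v2 Python bindings (only meant to illustrate the two cases):

```python
import PyOpenColorIO as OCIO

# Analytically invertible ops such as matrices can simply be flagged for the
# inverse direction and OCIO derives the inverse itself.
mtx = OCIO.MatrixTransform()
mtx.setDirection(OCIO.TransformDirection.TRANSFORM_DIR_INVERSE)

# A baked 3D LUT cannot be inverted like that, which is why the ACES configs
# ship dedicated inverse LUTs (the InvRRT ones) inside the "Output - ..."
# colour spaces so those spaces can be used in both directions.
```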
I just wanted to let you know that I had a mistake in my chain of transforms in After Effects. The inversion seems to work better than I thought.
I can even preserve values over 1 if I put the inverse RRT/ODT as the last step in the chain.
Great help on the way to an eventually pure ACES workflow.
Just wondering if you found anything for this yellow desaturation issue? Is there a dedicated discussion somewhere about that topic?
I did some tests myself to understand more precisely how and where this happens, using InverseODTRec709 -> InverseRRT -> RRT -> ODTRec709 on images with saturated yellows, e.g. from logos/titles (from what I recall, the “Global desaturation (inverted)” step in the InverseRRT can produce out-of-AP1 yellows that get lost by the clamp right after), but I didn't have the time to go any further.
From what I understand, the RRT/Rec.709 ODT can't produce yellow at maximum saturation in the first place, so we cannot expect the inverse process to preserve all of that, but maybe there is a way to have a more controlled desaturation? Also, I find it interesting that this is mostly visible only for yellows.
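To reproduce the test numerically, a round trip on a pure logo yellow is enough. A sketch assuming the ACES 1.x colour space names and the OCIO v2 bindings:

```python
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromEnv()
to_aces = config.getProcessor("Output - sRGB", "ACES - ACES2065-1").getDefaultCPUProcessor()
to_srgb = config.getProcessor("ACES - ACES2065-1", "Output - sRGB").getDefaultCPUProcessor()

yellow = [1.0, 1.0, 0.0]  # fully saturated sRGB yellow, typical of logos and titles
print(to_srgb.applyRGB(to_aces.applyRGB(yellow)))
# The value does not come back to [1, 1, 0]: part of what the inverse produces
# falls outside what the forward RRT/ODT can reproduce and is clipped/darkened.
```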
I'll upload a kit for Affinity in a bit that solves this issue to some extent. I've successfully used it to convert a few sequences to something usable, then put them into Compressor to get ProRes conversions that matched the colours in Nuke. It's not a 100% solution for everything, but what I've used it on worked for me.
Doing a bit of documentation writing right now.
EDIT: It’ll have to wait. Somehow I managed to break the Library and thanks to Sandboxing I can’t find what to delete so it can start over again.
I've recently done tests with a variety of 'target device gamuts' such as Rec.709/1886, P3 DCI, P3 D60, Rec.2020, etc. I've plotted out the boundaries, and there are certainly limits imposed by the RRT/ODT combinations which restrict which colours can be reproduced when starting from the ACES AP1 (or AP0) "gamut" (not really a gamut per se). As one would expect, the wider the target gamut, the greater the range of colours that cannot be created (the more the RRT/ODT limits).
This is kind of a natural outcome of the ACES image rendering and is something I’m hoping will be looked at in the next revision of ACES.
My desire was to facilitate device-dependent image manipulation, such as graphics insertion, and to maximise the available output gamut for creative freedom.
I would like to ask a question related to this. When rendering textured CG elements I can obviously convert the textures to ACES with “Utility - sRGB - Texture”; however, they come out a bit 'darker', so I thought we could use the inverse display transform, be it Output - sRGB or Rec.709. This, however, produces overbrights for sRGB values above about 0.81: great for mattes, but not so great for albedo textures.
So, does anyone have any pointers on texturing workflows that preserve the appearance of the source texture better than “Utility - sRGB - Texture”? I'm also open to accepting that sRGB textures will just be darker.
There should be no surprises here: if you pass a texture in the domain [0, 1] through the RRT + sRGB ODT, you will end up approximately in the range [0, 0.8].
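The two texture routes can be compared directly; a sketch using the ACES 1.x colour space names:

```python
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromEnv()
texture = config.getProcessor("Utility - sRGB - Texture", "ACES - ACEScg").getDefaultCPUProcessor()
inv_view = config.getProcessor("Output - sRGB", "ACES - ACEScg").getDefaultCPUProcessor()

white = [1.0, 1.0, 1.0]
print(texture.applyRGB(list(white)))   # stays around 1.0: plain sRGB decode plus gamut conversion
print(inv_view.applyRGB(list(white)))  # goes well above 1.0: the inverse RRT/ODT creates overbrights
```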
The ACES system is meant to work with scene-referred values as input, which means you need plausible reflectance values to feed your shaders. There are only very few things in the world that have close to 100% reflectance, e.g. magnesium oxide at approximately 97.5% or Spectralon at over 99%. Snow can reach 90% and is around 80% in Antarctica.
So the question you need to ask yourself is what that texture represents, if it is meant to be lit and represent reflectance values, it simply can’t go over 100% and a practical limit is 90%.
Here is a chart with some example values (RGB are the linear ones):
If you are doing physically based rendering, I think it is part of the game.
It is also possible to apply a rolloff after the inverse display transform, but you will never get a perfect result. I often use a simple function for that kind of task.
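A minimal sketch of such a rolloff, assuming NumPy and a simple exponential shoulder above a threshold (the curve and the default threshold/limit values are illustrative assumptions, not a canonical ACES tool):

```python
import numpy as np

def rolloff(x, threshold=0.8, limit=1.0):
    """Soft clip: values below `threshold` pass through unchanged, values above
    are compressed with an exponential shoulder that approaches `limit`
    asymptotically. Continuous in value and slope at the knee."""
    x = np.asarray(x, dtype=float)
    head = limit - threshold
    over = np.maximum(x - threshold, 0.0)
    shoulder = threshold + head * (1.0 - np.exp(-over / head))
    return np.where(x > threshold, shoulder, x)

# Applied per channel after the inverse display transform, e.g.
# albedo = rolloff(inverse_display_transform(srgb_texture))
# where inverse_display_transform is a placeholder for however the inverse
# Output - sRGB / Rec.709 transform is applied.
```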
Certainly no surprises; it's good to get some confirmation. We had used SPI's anim config for a while, and there was a transform there that was visually similar to the inverse of the viewing transform and suitable for textures, but moving to ACES I think we'll go with “Utility - sRGB - Texture”, as it is intended.