Sampling a generated .CUBE LUT in ACEScc AP1

Hello,

Just to start off: this is all relatively new to me, as I've only been working with ACES and HDR for a short while. I am currently color grading my game using a DaVinci Resolve pipeline. In Resolve, I've set the ACES Input Transform to "No Input Transform" and switched the color science to ACEScc in the color management settings. I've written a custom ACES Output Transform (via a .dctl file) that applies the same conversion out of ACEScc, tonemapping algorithm, and Rec.709 conversion as I use in my game engine. The engine has built-in functionality to export EXR images in the ACEScc color space, and those are dragged into Resolve directly.
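The ACEScc encode used on export is just the standard curve from the spec; a per-channel sketch (assuming the S-2014-003 formula; the function name is my own shorthand):

```c
#include <math.h>

/* Standard ACEScc encode (S-2014-003), applied per channel to linear AP1
   data when writing the EXRs. */
static float lin_ap1_to_acescc(float lin)
{
    if (lin <= 0.0f)
        return (-16.0f + 9.72f) / 17.52f;   /* floor: log2(2^-16) = -16 */
    if (lin < exp2f(-15.0f))
        return (log2f(exp2f(-16.0f) + lin * 0.5f) + 9.72f) / 17.52f;
    return (log2f(lin) + 9.72f) / 17.52f;
}
```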

The issue I'm having: when I color grade my clip and generate a 65-point .CUBE LUT, the LUT does not match the node it was generated from unless the option "Process node LUTs in…" is set to "ACEScc AP1 timeline space". When it is set to "ACES AP0 linear", the corrector node with the LUT no longer matches the node that generated it.

For the life of me, I can't figure out how to sample this .CUBE in my game the same way Resolve does when set to "ACEScc AP1 timeline space". I can match Resolve when it's set to "ACES AP0 linear". This is probably because my way of sampling the LUT is very naïve and linear: I don't remap values at all, and I don't apply any shaper functions. I simply sample the cube with my color (which is in ACEScc AP1), and that is my result.
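To make "naïve" concrete, here's a nearest-neighbor sketch of that kind of lookup (a real sampler would interpolate trilinearly, and all names here are placeholders; `lut` holds N×N×N RGB triples in .cube order, red varying fastest):

```c
static float clamp01(float x) { return x < 0.0f ? 0.0f : (x > 1.0f ? 1.0f : x); }

/* Naive lookup: the input color is used directly as the cube coordinate,
   so ACEScc values outside 0..1 simply clamp to the cube edges. */
static void sample_cube_nearest(const float *lut, int N,
                                const float in[3], float out[3])
{
    int r = (int)(clamp01(in[0]) * (float)(N - 1) + 0.5f);
    int g = (int)(clamp01(in[1]) * (float)(N - 1) + 0.5f);
    int b = (int)(clamp01(in[2]) * (float)(N - 1) + 0.5f);
    const float *p = lut + 3 * (r + N * (g + N * b));
    out[0] = p[0]; out[1] = p[1]; out[2] = p[2];
}
```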

Could someone help me understand what operations I should apply to my color before and after sampling the LUT to match the Resolve setting that processes node LUTs in "ACEScc AP1 timeline space"?

I've tried remapping from -0.35828683 to 1.4679964, which I understand are the min/max ACEScc values, but this has not worked. Is there some kind of curve or shaper function I should be using? A conversion to another color space? Remapping between different ranges? My ACEScc values already have AP1 primaries, so I'm not sure what else to do…
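For reference, the remap I tried was a plain linear normalization of that range before indexing the cube:

```c
/* Linear remap of the quoted ACEScc code-value range to a 0..1 cube
   coordinate; this did not produce a match. */
#define ACESCC_MIN -0.35828683f
#define ACESCC_MAX  1.4679964f

static float acescc_to_cube_coord(float cc)
{
    return (cc - ACESCC_MIN) / (ACESCC_MAX - ACESCC_MIN);
}
```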

Thanks in advance,
David

Hey David,

This is expected: the grading operations are done in ACEScc, so baking them into a LUT requires the LUT itself to be applied in that same space. If Resolve is set to process node LUTs in AP0/Lin, it converts the data to AP0/Lin first, applies the LUT, and jumps back to the working space afterwards, which creates the mismatch.
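Schematically (stand-in names only, not Resolve internals):

```c
typedef struct { float r, g, b; } rgb;

static rgb lut_lookup(rgb c)   { return c; } /* stub: the baked 65-pt cube  */
static rgb cc_to_ap0lin(rgb c) { return c; } /* stub: ACEScc AP1 -> AP0 lin */
static rgb ap0lin_to_cc(rgb c) { return c; } /* stub: AP0 lin -> ACEScc AP1 */

/* "ACEScc AP1 timeline space": the LUT sees the same data the grade saw. */
static rgb node_lut_timeline(rgb in) { return lut_lookup(in); }

/* "ACES AP0 linear": convert first, apply the LUT, convert back; the cube
   is fed different values than the ones it was baked against. */
static rgb node_lut_ap0lin(rgb in)
{
    return ap0lin_to_cc(lut_lookup(cc_to_ap0lin(in)));
}
```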

Implementation in a game engine is outside my knowledge, but if you can convert the engine color to ACEScc and then apply the LUT, just like Resolve does, it should match. Which game engine are you using? There are other users here with Unreal knowledge who can hopefully help you further.

Hey Shebbe!

The engine implementation is my own, neither Unreal nor Unity, just something I've been working on, and I do currently convert to ACEScc before applying the LUT (then convert back). Rendering happens in an HDR linear space with Rec.709 primaries.

The issue is that even when I convert to ACEScc, apply the grade, and convert back, the result matches DaVinci Resolve when it's set to AP0/Lin. That's why I was wondering whether there's some extra step to shape the data.

The image I exported is in ACEScc, created with the same algorithm I use to convert to ACEScc before sampling the cube, so those should match. The DCTL tonemapper I've told Resolve to use as the Output Device Transform also includes the same ACEScc-to-linear transform (then tonemaps, then gamma corrects). So the only variable left seems to be how the cube is sampled. In code, sampling the LUT requires a position normalized to 0–1, but ACEScc is not 0–1, which is why I thought there might be some preprocessing applied to the data before it's sampled.
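That ACEScc-to-linear step is the standard spec curve; per channel it looks like this (assuming the S-2014-003 formula; the name is mine):

```c
#include <math.h>

/* Standard ACEScc decode (S-2014-003), the inverse of the export encode. */
static float acescc_to_lin_ap1(float cc)
{
    if (cc <= (9.72f - 15.0f) / 17.52f)
        return (exp2f(cc * 17.52f - 9.72f) - exp2f(-16.0f)) * 2.0f;
    if (cc < (log2f(65504.0f) + 9.72f) / 17.52f)
        return exp2f(cc * 17.52f - 9.72f);
    return 65504.0f; /* clamp at HALF_MAX */
}
```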

When I sample using just the ACEScc value I have, the result matches the AP0/Lin option in Resolve.

Hope this clarifies things! Or maybe it raises more questions than answers :thinking:

David

Hi all, just to close this out since I've resolved it now.

I was not aware that the underlying color space in DaVinci Resolve's ACES pipeline is AP0 linear. Because I'd set the "Color Science" dropdown to ACEScc, I thought I had to import my data in that space, so my incoming data was already converted to ACEScc with AP1 primaries.

I believed that the Input Device Transforms (IDTs) would convert everything into the ACEScc AP1 color space.

What I didn't realize is that the conversion to ACEScc happens immediately before each node and goes back to AP0/Lin immediately after each node. The option to process node LUTs in "AP0/Lin" was, in fact, skipping that conversion to ACEScc. This is likely in case you have a baked LUT acting as an IDT or ODT that shouldn't be processed in the ACEScc or ACEScct color space.
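In other words, as I now understand it (schematic only; stub names are mine):

```c
typedef struct { float r, g, b; } rgb;

static rgb ap0lin_to_cc(rgb c) { return c; } /* stub: AP0 lin -> ACEScc AP1 */
static rgb cc_to_ap0lin(rgb c) { return c; } /* stub: ACEScc AP1 -> AP0 lin */

/* The timeline data stays in AP0 linear; the "Color Science" space is
   entered and exited around every single node, not once at import. */
static rgb process_node(rgb timeline, rgb (*node_ops)(rgb))
{
    rgb working = ap0lin_to_cc(timeline); /* converted just before the node */
    working = node_ops(working);          /* grade runs in ACEScc AP1       */
    return cc_to_ap0lin(working);         /* converted back just after      */
}
```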

So, to fix my problem, I am now importing linear Rec.709 data and using a custom DCTL IDT to convert it to ACES AP0/Lin. My custom output DCTL no longer converts from ACEScc AP1 before tonemapping; instead it just converts from AP0/Lin to Rec.709 and then tonemaps that.
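The IDT itself is just a 3×3 matrix on the linear data; a sketch of that kind of conversion (the coefficients below are the commonly published Bradford-adapted Rec.709 → ACES2065-1 matrix, not copied from my actual DCTL):

```c
/* Linear Rec.709 -> ACES AP0 linear (Bradford-adapted D65 -> ~D60). */
static void rec709lin_to_ap0lin(const float in[3], float out[3])
{
    out[0] = 0.4397010f * in[0] + 0.3829780f * in[1] + 0.1773350f * in[2];
    out[1] = 0.0897923f * in[0] + 0.8134230f * in[1] + 0.0967616f * in[2];
    out[2] = 0.0175440f * in[0] + 0.1115440f * in[1] + 0.8707040f * in[2];
}
```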

And this matches perfectly between my engine and DaVinci Resolve! TL;DR: everything lives in ACES AP0/Lin, and the "Color Science" dropdown is just the color space your work is converted into and out of when processing individual nodes, with the "Process node LUTs in…" option doing the same thing but specifically for nodes with LUTs.

Thanks! :slight_smile:
David
