Theoretically you could look at the DCTL code and recreate the same math in OCIO with its transforms. A simpler way might be to use the DCTL to bake a cube LUT, which you could then just read into an OCIO color space. That’s how OCIOv1 worked.
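For anyone trying the LUT route, a minimal sketch of what the OCIO config entry could look like — the color space name and LUT filename are placeholders, and `to_scene_reference` assumes an OCIOv2-style config:

```yaml
colorspaces:
  - !<ColorSpace>
    name: s23_hdr_to_acescg        # placeholder name
    description: Cube LUT baked from the camera DCTL (hypothetical file)
    to_scene_reference: !<FileTransform> {src: s23_hdr_to_acescg.cube, interpolation: tetrahedral}
```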
Hey everyone, I think I may have found a solution that works well with the HDR footage from my Galaxy S23. In DaVinci Resolve, with DaVinci YRGB color management, after I imported the S23 HDR footage, Resolve automatically set the input color space to Rec.2100 ST2084. This looked correct to me, and from some research it seems to be the right choice. Here’s how I then converted from Rec.2100 ST2084 to ACEScg:
We first want to go full-manual with color management. So you need to set the overall Color Science to either ACES or Davinci YRGB Color Managed. By doing this, I can then go into the color tab, right click my clip (the thumbnail of the clip - make sure the “clip” window is open) and you should have an option to “bypass color management”. Again, you need to either be in ACES or the Color Managed version of Davinci YRGB for this to show up. With this setup, we can now add our own custom color input transform by adding a Color Space Transform node to the clip’s color graph. These are my settings on this node:
- Input color space: Rec.2100
- Input gamma: ST2084
- Output color space: ACES (AP1)
- Output gamma: Linear
- Tone mapping: Davinci
- Gamut mapping: None
- Advanced: check “Apply Forward OOTF” and also check “Use White Point Adaptation”
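For context on what the ST2084 input gamma is undoing, here’s a small sketch of the PQ EOTF as defined in SMPTE ST 2084 (constants come from the spec; this is illustrative, not what Resolve runs internally):

```python
def pq_eotf(code_value):
    """Decode a normalized PQ (SMPTE ST 2084) code value [0, 1] to absolute luminance in nits."""
    m1 = 2610 / 16384        # ST 2084 exponent constants
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    e = code_value ** (1 / m2)
    y = max(e - c1, 0.0) / (c2 - c3 * e)
    return 10000.0 * y ** (1 / m1)   # PQ tops out at 10,000 nits

# Code value 1.0 decodes to the 10,000-nit PQ peak; 0.0 decodes to 0 nits.
```

This is why a plain gamma curve won’t substitute here: PQ is an absolute-luminance encoding, not a relative power function.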
At this point, it doesn’t matter what your preview looks like. But if you want to preview it correctly, add another Color Space Transform node (after the first one) and use the following settings:
- Input color space: ACES (AP1)
- Input gamma: Linear
- Output color space: Rec.709
- Output gamma: Rec.709-A
- Advanced: check “Apply Forward OOTF” and also check “Use White Point Adaptation”
Here, for the output color space/gamma, you can use whatever is best suited to your specific display. But previewing aside, if you just want to export the shot in ACEScg, turn off this preview node and export an EXR sequence. Again, it doesn’t matter what your project timeline color space or output color space is set to, since we are bypassing color management for this clip.
I did a round trip from Resolve > Nuke > Resolve and the colors seemed to hold up pretty well. There’s always a slight difference between how Resolve and Nuke display certain shots, but that’s a different issue for another post. So, this is my workflow for now. Not sure if it’s totally correct. I’d be curious to hear what other people think about this approach.
Hey Jacob, this method works against itself: you want no management, but you achieve it by enabling color management and then bypassing it. Picking plain DaVinci YRGB already means no color management is applied, which leaves you free to do things manually.
I don’t have any footage to test or compare, but these settings don’t look right to me. When converting from display-referred to scene-referred, I think Inverse OOTF should be checked, not Forward. I also wonder what result you’d get using an inverse ODT from the ACES transform rather than a CST.
Additionally, an intermediate should be AP0 ideally to keep the ACES workflow as recommended.
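For the AP1 → AP0 step, the conversion is just a 3×3 matrix multiply. A sketch using the published ACES AP1-to-AP0 matrix (plain Python, only to show the shape of the operation):

```python
# ACES AP1 -> AP0 matrix (from the ACES reference implementation)
AP1_TO_AP0 = [
    [ 0.6954522414,  0.1406786965,  0.1638690622],
    [ 0.0447945634,  0.8596711185,  0.0955343182],
    [-0.0055258826,  0.0040252103,  1.0015006723],
]

def ap1_to_ap0(rgb):
    """Convert a linear ACEScg (AP1) triplet to ACES2065-1 (AP0)."""
    return [sum(m * c for m, c in zip(row, rgb)) for row in AP1_TO_AP0]

# Each row sums to ~1.0, so neutral (equal RGB) values stay neutral.
```

Writing EXRs as AP0 linear is what the ACES container spec (SMPTE ST 2065-4) expects, which is why AP0 is the safer interchange space.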
You mention a CST again to view your image. If you’d use an ACES ODT to view it, it would match Nuke’s viewer provided you also use an ACES config there. The idea would be to use ACES as the color pipeline not DaVinci’s tone mapper.
Hi shebbe,
Here’s what I’ve done:
- Imported footage into Resolve, leaving all colour management settings at default
- CST input: Rec.709, Gamma 2.4
- CST output: ACES AP0, Gamma: Linear
Then exported as .exr.
Inside of Nuke I’ve set the working colourspace to ACEScg.
Then I’ve imported the footage with a Read node, using ACES2065-1 as the IDT.
Then I use an OCIOColorSpace node: in ACES 2065-1, out scene-linear ACEScg.
Then I used an OCIOFileTransform node and selected my desired LUT for Rec.709.
Does this seem correct?
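For reference, the node chain described above could be sketched as a Nuke script fragment like this (the file paths are placeholders, and the colourspace strings assume an ACES 1.x OCIO config is loaded in the project):

```
Read {
 file footage/shot.####.exr
 colorspace "ACES - ACES2065-1"
}
OCIOColorSpace {
 in_colorspace "ACES - ACES2065-1"
 out_colorspace "ACES - ACEScg"
}
OCIOFileTransform {
 file luts/rec709_view.cube
 working_space "ACES - ACEScg"
}
```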
This step is automatically handled by the Read and Write nodes in Nuke when selecting a colorspace.
And as for the LUT, would it be easier to just make a custom 3D LUT in Resolve and then use that in Nuke, given that Rec.709 isn’t scene-referred? So would I have to CST my Rec.709 to Rec.2020 HLG?