P3D60 to ACES-AP1 ACEScct DCTL

Hi all,

I am currently experimenting with CinemaDNG ACES workflows in Resolve. I’ve written a sample DCTL but I’m unsure of some things.

Does Resolve automatically and correctly debayer CinemaDNG raw to ACES? I’ve seen a number of different workflows for DJI CinemaDNG material and they all look different, so I’m unsure which is the correct method.

I’ve followed these steps for this example workflow:

  • Resolve YRGB project
  • Import media and debayer CinemaDNG to P3-D60 Linear
  • Apply DCTL (P3-D60 linear to AP1 linear, then through the ACEScct encoding function)
  • Apply ACES transform on 1st node in Resolve (IDT : ACEScct CSC → R709)

The results look close, but the colour has a slight teal tint, and on the scopes the red channel clips at the bottom end.

My DCTL is here:

__DEVICE__ float linear_to_ACEScct(float in) {
  // ACEScct encoding: linear segment below the break point, log2 segment above it
  float x = 0.0078125f;

  if(in <= x) {
      return 10.5402377416545f * in + 0.0729055341958355f;
  }

  return (_log2f(in) + 9.72f) / 17.52f;
}

__DEVICE__ float3 transform(int p_Width, int p_Height, int p_X, int p_Y, float p_R, float p_G, float p_B) {

  // P3-D60 linear to ACES AP1 linear (3x3 matrix)
  float rOut = 0.6884158485f * p_R + 0.0415350535f * p_G + 0.0037331617f * p_B;
  float gOut = 0.2633994374f * p_R + 0.9457129834f * p_G + 0.0474919658f * p_B;
  float bOut = 0.0481847140f * p_R + 0.0127519630f * p_G + 0.9487748725f * p_B;

  // Encode the AP1 linear values to ACEScct
  float r_log = linear_to_ACEScct(rOut);
  float g_log = linear_to_ACEScct(gOut);
  float b_log = linear_to_ACEScct(bOut);

  return make_float3(r_log, g_log, b_log);
}

I’ve used the transformation matrices from here.

I also tried to generate one using the Python colour-science package, but both results look slightly off.

import colour

p3_d60 = colour.RGB_COLOURSPACES['P3-D65'].copy()
p3_d60.name = 'P3-D60'
p3_d60.whitepoint = colour.CCS_ILLUMINANTS['CIE 1931 2 Degree Standard Observer']['D60']

acescg = colour.RGB_COLOURSPACES['ACEScg']

matrix = colour.matrix_RGB_to_RGB(p3_d60, acescg)
tsp = matrix.transpose()
print(tsp)

I’ve used ACEScg as my target space as it has the AP1 primaries but is encoded scene-linear. I thought I didn’t need a CAT transform since both colourspaces share the same white point; I’ve tried with and without one, and the results still look tinted.

If anyone has any ideas what I am doing wrong or can point me in the right direction that would be a big help.

Kind Regards,
Howard.

The matrix you are using is from P3 with DCI white to AP1 (ACES white, approximately D60) with CAT02 chromatic adaptation.

P3-D60 is not included on that calculator, and you would not need any chromatic adaptation, since your source and destination both use D60 white.

You also appear to have transposed the matrix, which is not necessary.

The matrix I calculate is:

0.7514531572 0.1998772840 0.0486695588
0.0487241885 0.9383330098 0.0129428017
0.0039503886 0.0418174835 0.9542321279

Try that one.
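
As a quick sanity check (just arithmetic on the two matrices, using numpy for convenience), you can push a neutral mid grey through each one. The matrix in the original DCTL pulls red down and pushes green and blue up, which is consistent with the teal cast and the low red channel on the scopes:

import numpy as np

grey = np.array([0.18, 0.18, 0.18])

# Matrix from the original DCTL (transposed P3 DCI-white -> AP1 with CAT02)
m_original = np.array([[0.6884158485, 0.0415350535, 0.0037331617],
                       [0.2633994374, 0.9457129834, 0.0474919658],
                       [0.0481847140, 0.0127519630, 0.9487748725]])

# P3-D60 -> AP1 matrix suggested above
m_suggested = np.array([[0.7514531572, 0.1998772840, 0.0486695588],
                        [0.0487241885, 0.9383330098, 0.0129428017],
                        [0.0039503886, 0.0418174835, 0.9542321279]])

print(m_original @ grey)   # ~[0.132, 0.226, 0.182] - grey is no longer neutral
print(m_suggested @ grey)  # ~[0.180, 0.180, 0.180] - grey stays neutral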

Although in your Colour code you have changed the white point of the duplicated colour space, you have not set it to use a derived XYZ matrix (use_derived_matrix_RGB_to_XYZ), so it is still using the matrices from the original P3-D65 space.
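
For reference, a minimal sketch of the corrected Colour code under those assumptions (a recent colour-science release where use_derived_matrix_RGB_to_XYZ is a settable property, and where passing chromatic_adaptation_transform=None skips the adaptation step). It should land close to the matrix above, with the last digits depending on the exact D60 chromaticities the library uses:

import colour

# Copy the built-in P3-D65 space and swap its white point to D60
p3_d60 = colour.RGB_COLOURSPACES['P3-D65'].copy()
p3_d60.name = 'P3-D60'
p3_d60.whitepoint = colour.CCS_ILLUMINANTS['CIE 1931 2 Degree Standard Observer']['D60']

# Without this, the space keeps the RGB-to-XYZ matrix derived for the D65 white
p3_d60.use_derived_matrix_RGB_to_XYZ = True

acescg = colour.RGB_COLOURSPACES['ACEScg']

# No chromatic adaptation, treating the source and destination whites as the same
matrix = colour.matrix_RGB_to_RGB(p3_d60, acescg, chromatic_adaptation_transform=None)
print(matrix)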

Not to be pedantic, but you would actually need chromatic adaptation, because the ACES white point is not D60: D60 has xy chromaticity coordinates of (0.321626, 0.337737), while ACES white is (0.32168, 0.33767). Therefore a more appropriate matrix to convert from P3-D60 to ACEScg would be:

0.751075367 0.200256129913 0.048668503087
0.048677769338 0.938380395322 0.0129418353
0.003948656967 0.041845103411 0.954206239622

(this is using CAT02 chromatic adaptation and rounded to 12 digits of precision)
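
A rough colour-science equivalent of that calculation, with the same derived-matrix caveat as above; again, the exact digits depend on the D60 chromaticities the library uses:

import colour

# P3-D60: P3 primaries with a D60 white and a derived RGB-to-XYZ matrix
p3_d60 = colour.RGB_COLOURSPACES['P3-D65'].copy()
p3_d60.name = 'P3-D60'
p3_d60.whitepoint = colour.CCS_ILLUMINANTS['CIE 1931 2 Degree Standard Observer']['D60']
p3_d60.use_derived_matrix_RGB_to_XYZ = True

# Adapt from the D60 white to the ACES white point with CAT02
matrix = colour.matrix_RGB_to_RGB(p3_d60, colour.RGB_COLOURSPACES['ACEScg'],
                                  chromatic_adaptation_transform='CAT02')
print(matrix)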

Hi thanks for the replies.

These work much better. I’ve loop-tested the DCTL against Resolve ACES and both are matching now.

I’ll have a look at my python code and see if I can replicate the matrix you have. What is the process for debugging and verifying/testing CTL and DCTL transforms to ensure that the output is correct?

Thanks, I thought that the DCTL needed the matrix transposed.

I just wanted to check another thing.

When debayering the CinemaDNG from the Inspire 3 drone to P3-D60 Linear, DJI recommend a 1.4-stop exposure adjustment in the RAW tab in DaVinci Resolve. This is part of their CinemaDNG to D-Log workflow.

Documentation here

I assume that this exposure adjustment correctly maps the DJI sensor values to the expected D-Log values, so that middle grey falls where it should and the exposure matches the non-raw D-Log media.

Do I need to incorporate something similar to correctly map the P3-D60 linear values to ACEScct so that middle grey falls in the expected place for ACEScct? If so, what is the best way to calculate this factor?

E.G:

  • P3 linear: 0.18 (18% reflectance)
  • ACEScct 18% normalised CV: 0.414
  • 0.414 / 0.18 = scaling factor of 2.300, to be applied to the R, G, B values before any transformation

Hope you can help.

Multiplying by 0.414/0.18 would not be appropriate. The ACEScct curve already maps 0.18 to 0.414:

>>> import numpy as np
>>> (np.log2(0.18) + 9.72) / 17.52
0.41358840249244228

If you need to apply a 1.4-stop exposure offset, you would multiply your linear values by pow(2, 1.4) before applying the ACEScct curve.
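
A small numeric sketch of that, assuming the 1.4-stop offset is just a plain linear gain applied before the encoding (the curve is the same one as in the DCTL above):

import math

def linear_to_acescct(x):
    # ACEScct encoding: linear segment below the break point, log2 segment above it
    if x <= 0.0078125:
        return 10.5402377416545 * x + 0.0729055341958355
    return (math.log2(x) + 9.72) / 17.52

gain = 2.0 ** 1.4  # 1.4-stop exposure offset as a linear gain (~2.639)

print(linear_to_acescct(0.18))         # ~0.4136, mid grey with no offset
print(linear_to_acescct(0.18 * gain))  # ~0.4935, mid grey lifted by 1.4 stops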

Ahh I see, OK, thank you. This is interesting.

So this DCTL now matches a common mezzanine file workflow, which renders out to D-Log.

Thanks for your help, H