macOS Dolby Vision/Resolve/ACES workflow

Help! I am totally lost trying to set up this new stop-motion production for monitoring HDR content and eventually delivering in Dolby Vision.

We have access to a Canon DP-V2421 monitor which meets the Dolby Vision mastering monitor specs. We are supposed to deliver everything at the end as 16-bit TIFF files, in P3 color space, D65 white point, and the PQ transfer curve.
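Just to make sure I understand what that delivery spec amounts to numerically, here is a rough sketch of the encoding (assuming the Python colour-science and tifffile packages; the frame data and file name are placeholders):

```python
# Hypothetical sketch: PQ-encode a display-linear P3-D65 frame (in nits) into a
# full-range 16-bit TIFF, which is my understanding of the delivery spec above.
import numpy as np
import colour
import tifffile

frame_nits = np.random.rand(270, 480, 3) * 1000.0  # placeholder P3-D65 display-linear frame
pq = colour.models.eotf_inverse_ST2084(np.clip(frame_nits, 0.0, 10000.0))  # ST 2084 inverse EOTF
tifffile.imwrite("delivery_frame.tif", np.round(pq * 65535).astype(np.uint16))
```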

We are using Canon EOS R cameras, and can convert those .cr3 files to .exr in any color space/transfer curve I like. We are looking at using Affinity Photo as part of our processing pipeline, and I have test frames from it exported as .exr in ACEScg and Linear sRGB.

Following Canon’s instructions, I set the monitor to our delivery specs using its internal hardware calibration settings. On a 2013 Mac Pro, following Dolby’s instructions, I created a project in Resolve 17.4 (non-Studio version at the moment) and set preferences to enable 10-bit precision in viewers and “Use Mac display color profiles for viewers”. In project settings, I set the following:

  • Color science: ACEScct
  • ACES version: ACES 1.3
  • ACES Input Transform: No Input Transform
  • ACES Output Transform: P3-D65 ST2084 (1000 nits)
  • Enabled “Display HDR on viewers if available”

I imported my test frames as clips, and set their ACES Input Transforms to “ACEScg - CSC” and “sRGB (Linear) - CSC”, then created a timeline for each frame.
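Purely for illustration, this is the broad-strokes color math I expect that output path to perform - a minimal sketch assuming the Python colour-science package, which skips the RRT/tone mapping that the real ACES Output Transform also applies:

```python
# Minimal sketch of the ACEScg -> P3-D65 / ST 2084 path (colour-science package assumed).
# NOTE: this is only the colorimetric part; the actual ACES Output Transform also applies
# the RRT/tone mapping, so treat these numbers as a rough sanity check, not a reference.
import numpy as np
import colour

rgb_acescg = np.array([0.18, 0.18, 0.18])  # a scene-linear ACEScg pixel (mid grey)

# Re-map primaries from ACEScg (AP1) to P3-D65
rgb_p3 = colour.RGB_to_RGB(
    rgb_acescg,
    colour.RGB_COLOURSPACES["ACEScg"],
    colour.RGB_COLOURSPACES["P3-D65"],
)

# Assume 1.0 scene-linear ~ 100 nits, clip at the 1000-nit mastering peak,
# then encode with the inverse ST 2084 (PQ) EOTF to get display code values
nits = np.clip(rgb_p3 * 100.0, 0.0, 1000.0)
pq = colour.models.eotf_inverse_ST2084(nits)
print(pq)  # ~0.35 for mid grey at 18 nits
```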

When viewing those timelines on the Canon, they look nothing like Affinity’s GUI viewer.

I have an X-Rite i1 Display Pro Plus colorimeter that I got to calibrate our VFX and stage monitors, so I thought I would use that with DisplayCAL to make sure that the Mac Pro was sending calibrated data to the Canon. I used its calibration to adjust the monitor’s white point to exactly D65 and the peak luminance to 1000 nits, then ran the profiler to create a simple single curve + matrix display profile that macOS could use in System Preferences. Then I went back and made a second XYZ LUT profile, from which I could make a 3D LUT to convert from the monitor to P3D65/SMPTE 2084 with a hard clip at 1000 nits, specifically for use in Resolve. These steps all followed what I had learned from the DisplayCAL forums and guides. In Resolve, I turned off “Use Mac display color profiles for viewers”, and in Project Settings I set the Video monitor lookup table to my new 3D LUT from DisplayCAL.

At this point the UI of Resolve started glitching like crazy and at random: flickering as I moved the mouse around, shifting the color of the viewer when the application focus changed, and suddenly making the entire UI look like I was displaying a linear image without any color space conversion. I surmised that I had made an error by doing my monitor profiling with the Canon’s internal PQ setting still on, and that this had resulted in out-of-whack values in the single curve + matrix profile that macOS was using. So I went into the Canon’s settings and turned off all of the internal hardware conversion - native color gamut, gamma curve/EOTF off. I reran the DisplayCAL calibration to tweak the white point back to D65, but found that turning off the EOTF in the monitor had limited the peak luminance to ~695 nits. I settled for a target of 600 nits, and recreated both the single curve macOS profile and the XYZ LUT profile + 3D LUT for Resolve’s Video monitor lookup table.

Still no luck in Resolve - no more glitches, but the color managed frames still look totally different from the Affinity GUI, so I don’t feel confident that I know what I’m looking at in Resolve. I get that Affinity is probably using the macOS display profile for its viewer and this could cause some difference, but I’ve confused myself to the point that I don’t know for sure.

Tomorrow we are supposed to get a Blackmagic Decklink Mini Monitor 4K card with a PCIe expansion box for the Mac Pro, along with an SDI cable so that I can hopefully be sure I am sending a proper 10-bit Full Range RGB signal as Dolby Vision requires.

I have three questions:

  1. When I install the Blackmagic card, what are the correct settings for the Canon monitor’s hardware, and what are the correct settings for Resolve so that I am properly viewing an ACEScg exr source output transformed to P3D65/PQ?
  2. What am I doing wrong regarding macOS’s Display Profiles and DisplayCAL’s 3D LUTs?
  3. When everything is set up correctly using ACEScct, why do Resolve’s scopes roll everything off to 1000 nits when I raise the exposure? I thought the Output Transform was only supposed to affect my display. By comparison, DaVinci YRGB Color Managed can be set to a Timeline and Output gamma of ST2084 with a Timeline working luminance of HDR 1000, and the scopes show values rising all the way to 10,000 nits (quick PQ numbers are sketched below).
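For reference, this is roughly the PQ math I am comparing against - a minimal sketch assuming the Python colour-science package, not anything Resolve does internally:

```python
# ST 2084 (PQ) code values top out at 10,000 nits, which is why an ST2084 / HDR 1000
# DaVinci YRGB timeline can show scope values all the way up there, while a 1000-nit
# ACES Output Transform rolls everything off at or below 1000 nits.
from colour.models import eotf_inverse_ST2084

for nits in (100, 1000, 4000, 10000):
    pq = float(eotf_inverse_ST2084(nits))  # normalized PQ code value for that luminance
    print(f"{nits:>5} nits -> PQ code value {pq:.3f}")
# 100 nits   -> ~0.508
# 1000 nits  -> ~0.752
# 10000 nits -> 1.000
```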

Thank you anyone for helping to educate me.

I’m not familiar with Resolve HDR and macOS profiles in viewers, but I can fire a simple question at you:
Are you using the exact same ACES setup in Affinity Photo via OCIO?

You should definitely have more luck using the reference monitor via the Decklink as a dedicated, color-managed feed instead of relying on the UI viewer.

I also see that they have a new version today that says it fixed an issue with HDR viewers on Mac. Maybe that helps.

Sadly the new version does not really help with the UI display of HDR content on my iMac 2020 (which uses EDR). But the images now look less broken than before.
@alexfry explained the problem to me, but I am not able to fully understand it yet.

I am testing a lot with different footage (from ALEXA, RED, BMD, 3D-rendered EXRs, etc.). I do my tests with FCPX (via LogC), as its HDR/EDR view pipeline works without having to use an external reference display. This might not be a professional setup, but it seems to work quite well.

I assume you need the external Pro Display XDR or one of the new M1 Pro MacBook Pros to get Resolve showing a proper HDR signal in the UI.

The whole HDR/EDR topic seems to evolve and change all the time at the moment. I set up a website, hdr.toodee.de, to show SDR/HDR comparisons. The page worked fine on an iPhone 11 Pro, but now on the 13 Pro it doesn’t anymore. It did not work on the previous iPad Pro at all, yet suddenly it works in Safari on the new iPad Pro.

@agentirons

Try exporting a PQ ProRes out of Resolve and watching it in the default QuickTime Player. It should look fine - that is at least my experience so far. But I have not managed to see a proper HDR/EDR image in the Resolve UI viewer yet. Even the new update does not fix the issue for me.

Best

Daniel

Set Resolve to ACES

  • Set the project IDT to ACEScg, or set the IDT of the footage to ACEScg per clip.
  • Set the ODT to, for example, Rec.2020 ST2084 (1000 nits, P3-D65 limited).
  • Render out a ProRes with the metadata tags set to Same as Project (9-16-9, i.e. BT.2020 primaries / ST 2084 transfer / BT.2020 non-constant matrix); a quick way to verify the tags is sketched below.
  • I then convert the ProRes to a 10-bit HEVC file with Apple Compressor and watch it via iCloud Drive on an iPhone Pro or iPad Pro, or put the file on a USB stick and watch it on an HDR TV.
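If you want to confirm the exported file really carries those tags, something like this works (assuming ffprobe is installed and on the PATH; "master_pq.mov" is a placeholder filename):

```python
# Query the colour tags of the exported file with ffprobe; 9-16-9 should show up as
# bt2020 primaries, smpte2084 transfer and bt2020nc matrix.
import subprocess

out = subprocess.run(
    [
        "ffprobe", "-v", "error", "-select_streams", "v:0",
        "-show_entries", "stream=color_primaries,color_transfer,color_space",
        "-of", "default=noprint_wrappers=1", "master_pq.mov",
    ],
    capture_output=True, text=True, check=True,
).stdout
print(out)
# Expected (order may vary):
# color_space=bt2020nc
# color_transfer=smpte2084
# color_primaries=bt2020
```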

The image pipeline is similar to your stop-motion workflow. Instead of Affinity Photo I used Photomatix Pro, merging 3 bracketed photos instead of only one. The clips show HDR content via Apple’s EDR on an iPad Pro with the XDR display, right here in the forum in Safari.

Best regards,

Daniel

Be happy that it sometimes works on Mac. Resolve’s support for HDR viewers is abysmal. On Windows they simply don’t support HDR viewers at all, and I have to render PQ content to an MP4 and then open it in a video player.


Hi Shebbe,

Thank you for pointing out the new update for Resolve. I got it installed but haven’t noticed a difference yet (on my M1 iMac 2021).

I am currently looking at Affinity Photo next to Resolve on this iMac monitor with a fresh bracketed exposure image pair. In Affinity, the HDR merge was created using their default “sRGB IEC61966-2.1 (Linear)” ICC profile, and then I exported to .exr 16-bit half-float for import to Resolve. In Affinity Photo’s “32-bit Preview” panel, I have Display Transform set to “OCIO Display Transform” using ACES with the sRGB ODT. In Resolve, I am using the ACES 1.3 sRGB ODT. Affinity’s viewer is very saturated and red compared to Resolve.

If I change Affinity’s preview to “ICC Display Transform”, the colors now match Resolve’s ACES sRGB ODT, but still with a slightly higher exposure. It does not seem to matter to Resolve whether I enable “Use Mac display color profiles for viewers” or leave it off and use a Video monitor LUT.

@TooDee, if I render out an .mp4/h.264 of this frame with the tags set to Same as Project (ACES sRGB), Quicktime shows the video looking exactly the same as the Resolve viewer. Same result if I switch the project to ACES P3D65/PQ-1000nit and export as .mov/ProRes.

The reason any of this matters is that if our DP is making initial color adjustments in Affinity or creating a show LUT in Resolve, I need to know that:
A. He is not accidentally clipping the data.
B. The colors do not differ wildly from expectations when we get to Resolve.
C. We are able to view a shot in the proper delivery format so that we don’t get surprised at the colorist facility a year from now.

Wow, OK. I’ve at least tracked down the source of the Affinity - Resolve OCIO discrepancy.

In Affinity Photo, when you create an HDR Merge, the resulting image’s color space is constrained to the document profile (i.e. sRGB), with a linear transfer function. So no matter what, the color primaries of the image are going to be different depending on which document profile you used. In its OCIO implementation, however, Affinity considers the Input Space to always be “Roles - scene_linear” regardless of the document profile, and then applies the selected display transform such as sRGB. I suspect this means that in Affinity I am seeing a double transformation to sRGB when using OCIO. This would explain why using “ICC Display Transform” more closely matches the intended colors.

In Resolve, when you import that .exr, if you wanted to match Affinity’s OCIO transform you would need to set the clip’s IDT to the equivalent “Roles - scene_linear” for ACES, which is ACEScg. This is of course wrong, but it does make the colors match between programs. It seems that the closest you can get the two programs to match while using sRGB and ACES is to leave Affinity’s preview on ICC Display Transform and in Resolve set the IDT to Affinity’s document profile and ODT to sRGB (or whatever your monitor is.) If in Affinity I create a merge using ACEScg to begin with, then the OCIO display transform to sRGB matches Resolve’s ACES sRGB display transform with an ACEScg .exr, which I think confirms what I said earlier about Affinity’s OCIO always assuming input to be “scene_linear”.
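Here is a minimal numeric sketch of that mismatch, assuming the Python colour-science package (the pixel value is arbitrary): interpretation (a) just re-tags the linear-sRGB numbers as scene_linear/ACEScg, which is what Affinity’s OCIO preview appears to assume, while (b) converts the primaries first, which is roughly what flagging the clip with the Linear sRGB IDT in Resolve does.

```python
# Same stored pixel, two interpretations -> different colors downstream.
import numpy as np
import colour

lin_srgb = np.array([0.60, 0.20, 0.10])  # an arbitrary scene-linear sRGB pixel

as_if_acescg = lin_srgb  # (a) no primary conversion, the numbers are just re-tagged
converted = colour.RGB_to_RGB(  # (b) sRGB primaries -> ACEScg (AP1) primaries
    lin_srgb,
    colour.RGB_COLOURSPACES["sRGB"],
    colour.RGB_COLOURSPACES["ACEScg"],
)

print(as_if_acescg)  # [0.6 0.2 0.1]
print(converted)     # noticeably different values
```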

Hi @agentirons ,

ah, you are using the HDR merge function in Affinity Photo?
If I have only one bracket, I usually use the raw converter of Affinity Photo to convert my Canon .cr2 files to lin-sRGB or ACEScg EXR files.

I found out that setting the 32-bit RGB profile to ACEScg automatically converts your .CR2 image to the ACEScg working space, and you can then save it as an EXR.
Be careful with the export settings: depending on the file naming rules, another matrix conversion can happen on export.
If the 32-bit RGB profile is set to linear sRGB, then the .CR2 file will be converted to linear sRGB. To then view it through an ACES ODT, you need to convert the file to ACEScg first via the Convert Format/ICC Profile command.

When I have several brackets of one image, I use the Photomatix Pro app. There you can batch-merge .CR2 files to EXR in linear sRGB or ProPhoto (which I use). I then usually convert the ProPhoto EXR files to ACEScg in Nuke.
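The primaries conversion those steps perform can also be illustrated outside Nuke/Affinity; a small sketch with the Python colour-science package (the pixel value is arbitrary, and the chromatic adaptation between white points is left at the library default):

```python
# The same stored EXR triplet means different things depending on whether it was
# written as linear sRGB or linear ProPhoto, so each needs its own conversion to
# land correctly in ACEScg before an ACES ODT is applied.
import numpy as np
import colour

pixel = np.array([0.40, 0.30, 0.05])

acescg_from_srgb = colour.RGB_to_RGB(
    pixel, colour.RGB_COLOURSPACES["sRGB"], colour.RGB_COLOURSPACES["ACEScg"]
)
acescg_from_prophoto = colour.RGB_to_RGB(
    pixel, colour.RGB_COLOURSPACES["ProPhoto RGB"], colour.RGB_COLOURSPACES["ACEScg"]
)

print(acescg_from_srgb)      # roughly what the linear-sRGB -> ACEScg step produces
print(acescg_from_prophoto)  # roughly what the ProPhoto -> ACEScg (Nuke) step produces
```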

I was replying but @TooDee beat me to it and explains it perfectly.

I tested your workflow on a Windows machine that doesn’t have the ACEScg ICC profile.
In that case you’d have to convert from, say, Linear sRGB (the default) to the working space inside Affinity with an OCIO adjustment layer to correct the mismatch.

Then in Resolve set the IDT to ACEScg on the exported EXRs and it should match.

Alternatively you could keep the merged document in Linear sRGB by turning off the OCIO layer before export and flagging it with the Linear sRGB IDT in Resolve. (Only available from Resolve v14 and up, I think, with ACES 1.3.)

@TooDee, correct me if I’m wrong, but if you want to make adjustments inside Affinity that push colors outside sRGB (because delivery is P3 HDR), you’d have to either work with ACEScg as the document profile or make the adjustments above the OCIO adjustment layer?


Oh, I am not sure. But I think it should not matter, because the EXR simply holds light information and is unaware of the working color space. 10/0/0 in floating point is very much “red” indeed, but what it actually means depends on the working space.
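To make that concrete, a tiny sketch assuming the Python colour-science package: the same triplet decodes to a different color depending on which working space it is interpreted in, and the “ACEScg red” doesn’t even fit inside the sRGB gamut.

```python
# One stored triplet, two interpretations.
import numpy as np
import colour

triplet = np.array([10.0, 0.0, 0.0])

as_srgb_linear = triplet  # interpreted as linear sRGB: a very bright sRGB red
as_acescg_in_srgb = colour.RGB_to_RGB(  # interpreted as ACEScg, expressed in linear sRGB
    triplet, colour.RGB_COLOURSPACES["ACEScg"], colour.RGB_COLOURSPACES["sRGB"]
)

print(as_srgb_linear)
print(as_acescg_in_srgb)  # different numbers, with negative G/B -> outside the sRGB gamut
```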

I thought Windows 10 had ACEScg support too. Does it not?

Makes sense, thanks!

Adobe ships them, but Windows does not.
Probably not too difficult to add it to Affinity; that would simplify things.
