Rookie - Setup ACES in Resolve with HDR Display

Hey everyone!

I’m starting my journey with ACES and color grading for HDR, and I’m beginning to grasp the theory and the necessary setup for it. As with any journey into a new technology, there are some things that don’t make sense to me yet, and I just want to confirm whether my approach is correct or not.

My current setup is as follows:

So, the Eizo is set to PQ_BT.2100 (HDR over PQ) in BT.2020 @ 300 nits. I understand that for proper grading I’ll need a monitor that can reach 1000 nits, but for now I just wanted to test the absolute values on this setup before moving forward with getting new hardware.

What I’m trying to achieve at the moment is figuring out the correct configuration of hardware/software, so for this test Resolve is set to ACES 1.2 with an Output Transform of Rec.2020 ST2084 (1000 nits), which is what the monitor is set to. So if I understand correctly, anything above 300 nits should look clipped on my display, right? That’s currently not the case; instead I get a washed-out/hue-shifted image.
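
For reference, here is a minimal sketch of the ST 2084 (PQ) inverse EOTF, just to see where those luminance levels land in the signal. The constants are taken straight from the SMPTE ST 2084 spec, and the 10-bit codes are full-range for illustration only:

```cpp
#include <cmath>
#include <cstdio>
#include <initializer_list>

// Absolute luminance in cd/m^2 (nits) -> non-linear PQ signal in [0, 1].
double pq_encode(double nits)
{
    const double m1 = 2610.0 / 16384.0;          // SMPTE ST 2084 constants
    const double m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0;
    const double c3 = 2392.0 / 4096.0 * 32.0;
    const double yp = std::pow(nits / 10000.0, m1);
    return std::pow((c1 + c2 * yp) / (1.0 + c3 * yp), m2);
}

int main()
{
    // A 300-nit panel fed a PQ signal runs out of headroom at ~62% of the
    // signal range; a 1000-nit Output Transform tops out around ~75%.
    for (double nits : { 100.0, 300.0, 1000.0, 10000.0 })
        std::printf("%7.0f nits -> PQ %.3f (10-bit code ~%d)\n",
                    nits, pq_encode(nits),
                    static_cast<int>(std::lround(pq_encode(nits) * 1023.0)));
    return 0;
}
```

So on a 300-nit panel, everything above roughly 62% of the PQ code range (about code 636 of 1023) has nowhere to go, which is the clipping described above.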

Windows is set to HDR color, and the monitor used for grading and the one displaying the Resolve GUI will ultimately be the same.

Further down the line what I’m planning to do is feed an ACEScc video into Resolve Live and do the grading there to export a .cube LUT, so for now I’m trying to figure out if I’m seeing the correct information on my display and how to configure it. Any help would be greatly appreciated, thanks a lot ACES community!

Here’s how the test patterns look at the moment with said configuration

I’m not sure how UI HDR works in Windows (or even if Resolve can do it). I note that the option to “Display HDR on viewers if available” is missing from your screen grab of your project settings.

In any case, you have your Input Transform set to “None” which means you are telling Resolve that your source material is already in ACES2065-1. In fact it is in display-referred PQ, so your input transform is wrong.

What you can sometimes do if you have a display-referred image is use the inverse Output Transform as an Input Transform (i.e. choose “Rec.2020 ST.2084 (1000 nits)” as your Input Transform) so it goes through it backwards on the way in and forwards on the way out, cancelling each other out, and producing an output which matches the source. However that will not work in this case, as your source is a test pattern which contains values way beyond what an ACES Output Transform would produce.
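
To make that concrete, here is a minimal OpenColorIO sketch of the round trip, assuming OCIO v2 and one of the ACES 1.x OCIO configs on disk (the colorspace names below follow that config’s naming and may differ in yours). A mid-grey pixel survives the inverse-then-forward trip roughly unchanged, while a fully saturated test-pattern value does not, because it lies beyond anything the forward Output Transform can produce:

```cpp
#include <OpenColorIO/OpenColorIO.h>
#include <cstdio>
#include <initializer_list>
namespace OCIO = OCIO_NAMESPACE;

int main()
{
    // Assumes a local copy of an ACES 1.x OCIO config.
    OCIO::ConstConfigRcPtr config = OCIO::Config::CreateFromFile("config.ocio");

    // Inverse Output Transform used as an "IDT": display-referred Rec.2020 PQ -> ACES2065-1 ...
    OCIO::ConstCPUProcessorRcPtr toAces = config
        ->getProcessor("Output - Rec.2020 ST2084 (1000 nits)", "ACES - ACES2065-1")
        ->getDefaultCPUProcessor();

    // ... then the normal Output Transform on the way back out.
    OCIO::ConstCPUProcessorRcPtr toDisplay = config
        ->getProcessor("ACES - ACES2065-1", "Output - Rec.2020 ST2084 (1000 nits)")
        ->getDefaultCPUProcessor();

    float midGrey[3] = { 0.50f, 0.50f, 0.50f };  // plausible display code values
    float satBlue[3] = { 0.00f, 0.00f, 1.00f };  // PQ 1.0 blue = 10,000-nit pure blue,
                                                 // far beyond the 1000-nit transform's output

    for (float* src : { midGrey, satBlue })
    {
        float rt[3] = { src[0], src[1], src[2] };
        toAces->applyRGB(rt);      // backwards on the way in
        toDisplay->applyRGB(rt);   // forwards on the way out
        std::printf("in  %.3f %.3f %.3f  ->  out %.3f %.3f %.3f\n",
                    src[0], src[1], src[2], rt[0], rt[1], rt[2]);
    }
    return 0;
}
```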

I’m not quite sure what you are aiming to achieve, but I don’t think it is possible!

Hey Nick!

Thanks a lot for replying. I was doing some research during the week and, yeah, unfortunately it seems that what I’m trying to do will not be possible, for the simple reason that the viewer in the Resolve GUI is always SDR. I would need a DeckLink card or an HDFury converter to display the correct colors and inject the HDR signal into the display.

I was aware of that “Display HDR on viewers if available” option but couldn’t find it anywhere. I’m running Resolve Studio 17.2 and was hoping that the viewers in the GUI would behave just as a YouTube HDR video does (displaying only the video player area in HDR). I wasn’t expecting to have to buy another piece of hardware to display HDR with DaVinci on an already HDR monitor.

The Input Transform is set to None because I was explicitly setting the Input Transform on the footage, which in the course material was Rec.709. What I was trying to achieve was:

Figure out the hardware/Resolve settings for an HDR grading project using ACES.

Knowing that my current screen has a 300-nit limit but accepts a PQ signal (which makes me assume that anything above 300 nits will clip; that at least will help me validate that the config is right, and I’ll just need a different monitor later).
So what I did was:

  • Create empty project with ACEScc color science
  • Set the output transform to Rec2020/ST2084 (so it matches what the monitor is set at)
  • Feed content from DCC applications either rendered in ACEScc/ACEScg or plain sRGB
  • Apply input transforms
  • Preview the HDR grade within the Resolve GUI

I was expecting that while grading and looking at my scopes, anything above 300 nits would look clipped on my display, but that was never the case. Hope that makes sense!

I don’t have much experience with HDR, but maybe you can test whether you’re actually receiving HDR by watching an HDR video on YouTube. Right-click the video → “Stats for nerds” and see if it’s streaming HDR. If the colors are still washed out instead of clipping, or it’s streaming bt709, there’s something in the OS or monitor that isn’t set up properly yet.

I think for the most reliable setup/results you’d want a monitoring I/O card and dedicate the HDR display to a clean full-screen feed instead of the UI viewer.

Hey Shebbe!

Thanks a lot for the advice! I did check whether I was receiving HDR output on the monitor: I opened an HDR YouTube video (the color is arib-std-b67 (HLG)/bt2020) and it looks clipped in some areas, which I was expecting as my display is only 300 nits.

Maybe the weird thing for me right now is where that “Display HDR on viewers if available” option actually is ._.

Setting an Input Transform of “Rec.709” is only suitable if your Output Transform is also “Rec.709” as you are looking for the two to cancel out, making ACES “transparent”. But even then this does not work 100% and, as I said previously, when you have display-referred test patterns with fully saturated colours, they cannot all be passed through.

I think that may be macOS only (any experienced Windows user want to jump in here?) as it works in conjunction with the “Use Mac display color profiles for viewers” preference.

Hey Nick,

Thanks a lot for taking the time to answer. Still, setting an IDT of, say, sRGB should be suitable if I want to bring that content into an HDR color gamut, right? How should I proceed in a case where my final ODT is Rec.2020 PQ but I want to include content mastered in sRGB and make it look exactly as it looked with an sRGB ODT in this new output? Would that be possible?

May I ask, if it’s not too much of a problem, why the fully saturated colors cannot be passed through? I’d like to understand what would happen. Does that mean that, for example, going sRGB IDT > ACES working space > sRGB ODT would result in different colors?

About the viewer, it seems that the option is going to be to get a DeckLink monitoring card for this setup.

Hello, I went through the same long, confusing process, so I hope I can help. I find Resolve actually really intuitive for ACES workflows.

First a note: Resolve does not let you monitor HDR through Windows because HDR in Windows is completely broken and non-standard, with non-existent documentation, and it can change with Windows updates. It has a built-in tone curve that affects footage. Therefore they don’t let you monitor through it. It’s Blackmagic avoiding customer service problems, I think. macOS follows industry standards.

On what you’re trying to achieve:
When setting up ACES in project settings, don’t set an Input Transform at all unless ALL your clips are the same type (as in all S-Log3, all Rec.709, etc.). Instead set the IDT to None in project settings and set the individual clips to their respective color spaces (images to sRGB, footage to Rec.709/S-Log/whatever it was shot in). Set the ODT to whatever your final output is.

Setting the footage to Rec.709/sRGB with the project-wide IDT means you are telling the system that all the footage and images are Rec.709/sRGB, so it will not read the full amount of information and will not do an accurate transform to your finished output.

Switching from a Rec.709 grade to an HDR one will never look the same. You need to do separate grades for each, or get a Dolby Vision license and use their conversion software to generate a Rec.709 pass from an HDR grade.


Hey Geoffrey!

Thanks a lot for taking the time! I also found the logic and basics of ACES to be pretty straightforward, but then there are all sorts of incompatibilities that make me wonder if I’m doing the correct setup.

This point here is like super important!

It would’ve helped to know that the GUI viewers in Resolve will not act as HDR (not even if you use the video clean feed) without a separate card, which makes yet another addition to the hardware configuration. It took me some time to research and figure out why that was needed (I do have some film/compositing experience, but I’ve been working in game development for quite some time now), so setting up proper CC environments is quite a new ride for me.

This second point

Makes complete sense now that I know that the video feed will be different for an HDR monitor, but I think I will not be able to judge how much of a difference it makes until I set up the complete grading environment.

Thanks a lot, guys, for guiding me through these early steps of ACES. So far so good on figuring out the necessary bits and pieces to become a color science master!

Hi Omar, I think you have this figured out but here are my thoughts on your approach.

  1. If I remember correctly the Eizo CG319X has two options (or maybe you can create two options with ColorNavigator) for PQ viewing. One will tone map the source down to 300 nits, so you won’t see much clipping, but it won’t be accurate PQ either. The other is accurate to about 300 nits and then clips. On the pattern you downloaded, check the grayscale to see where it is clipping.

  2. I recommend that you use the ODT Rec.2020 ST2084 (1000 nits, P3 limited) for grading, since the display cannot accurately show beyond P3 and that is what most services ask for in the small print.

  3. As you have found, you cannot use the GUI for this; you need a BMD DeckLink card. Since you can switch the Eizo to HDR modes manually you do not need HDMI 2.0 support, but it is nice to have in case you want to attach it to a TV with brighter capabilities. Using a video I/O card bypasses the OS color management and puts your Resolve project in control, which is essential for grading decisions.

  4. As Nick points out, the IDT for the HDR test pattern you are using should match the ODT when testing dynamic range. If you select Rec.709 as the IDT, then the expected behaviour is that the pattern will look just like it does in Rec.709 even if your ODT is set to 2020 PQ. In other words your CMS is saying please display this Rec.709 source “correctly” in 2020 PQ. The idea of that pattern is to fill the 2020/10,000-nit bucket and see how much of it the display can show, and how it deals with the code values it cannot show; matching the IDT and the ODT does that because the ramp goes from code value 0 to 1023 (there’s a small PQ sketch just below this post). To see if your display is set up correctly try “Spaces” in Downloads — Finalcolor or use the original ACES supplied targets Dropbox - ODT - Simplify your life
    Good luck!
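
Here is the small PQ sketch mentioned in point 4. It simply decodes a full-range 10-bit ramp back to absolute nits and reports where a panel with a given peak should start clipping; the constants are from SMPTE ST 2084, and the 300-nit peak is just an assumption matching the Eizo in this thread:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// PQ signal in [0, 1] -> absolute luminance in cd/m^2 (nits), per SMPTE ST 2084.
double pq_decode(double v)
{
    const double m1 = 2610.0 / 16384.0, m2 = 2523.0 / 32.0;
    const double c1 = 3424.0 / 4096.0,  c2 = 2413.0 / 128.0, c3 = 2392.0 / 128.0;
    const double vp = std::pow(v, 1.0 / m2);
    return 10000.0 * std::pow(std::max(vp - c1, 0.0) / (c2 - c3 * vp), 1.0 / m1);
}

int main()
{
    const double displayPeak = 300.0;  // assumed peak of the grading display
    for (int code = 0; code <= 1023; ++code)
    {
        const double nits = pq_decode(code / 1023.0);
        if (nits >= displayPeak)
        {
            std::printf("ramp should clip from code %d upwards (%.0f nits)\n", code, nits);
            break;
        }
    }
    return 0;
}
```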


Mr. Shaw,

Thanks a lot for all the detailed explanation. I just finished the FXPHD course today, and though there is a lot of information, I was able to understand pretty much all of the workflow. The problem was the hardware setup, so some of the tests I wanted to do would not really work because of that.

The example files and test patterns are a huge help for comparing and setting things up, but it wasn’t until this post and recent days that I found out that the GUI in Resolve will not send HDR data unless I use a DeckLink card or run Resolve on a Mac (or so I’ve read on some other forums).

You mean a DeckLink card, right? So far, for the purposes of what we’re after (which turns out to be an exotic mix of using ACES/Resolve Live for a video game engine), and now that you mention it, a PCIe DeckLink Studio 4K connected to an ASUS ProArt should let me monitor the HDR output if I can set the PQ via the On Screen Display, is that right? I was a bit worried because the Studio 4K lets me record and monitor, but it’s HDMI 1.4, so it won’t send the HDR signal unless I get extra hardware. But that would only apply if I want the output on an OLED TV, for example, am I right?

Sorry for the long post; it’s been quite a ride learning all these new workflows, and the help of the professionals in the community has been outstanding in helping me move forward. Thanks a lot for all the help!

Cheers!

I think you need HDMI 2.0 for it.
A 4K Mini Monitor isn’t super expensive. I have one too and I’d love to test it for you but our TV broke and we still have to get a new one so I can’t confirm if it works smoothly.

Hey Shebbe!

Yeah, I did look into that, but that would mean I also need the 4K Mini Recorder, and the current PC configuration only has one free PCIe slot. If HDMI 2.0 is necessary, though, I think we could re-evaluate the PC specs/config we have right now. But if you do get the chance to test it, let me know! I’ll totally appreciate the input :smiley:

Cheers!

Yes - a video card for Resolve means either a DeckLink or an UltraStudio from BMD.
If the video card supports HDMI 2.0, then Resolve will send the HDR flag to the ProArt or an HDR TV when you check “enable HDR metadata over HDMI” in video monitoring AND your timeline colorspace on the color management page is set to an HDR standard (in an ACES project you only need to set the ODT to an HDR standard). You can still use the Studio 4K or other cards that are HDMI 1.4 as long as you can manually switch the display. You can switch the ASUS ProArt UCX and UCG displays manually; just match the display to the output colorspace and EOTF. Another option (covered in the FXPHD course) is to insert the HDMI HDR flag after the video card with something like the HDFury Vertex, which is an HDMI splitter.
One other caveat: make sure the HDMI cable is 2.0. They all look the same, but if the cable is not 2.0 or better, it will not pass the flag and the trigger will fail.


Hi Geoffrey. The problem with HDR on Windows and Resolve ain’t that it’s non-standard and broken (that’s a misconception that dates back to Windows 10 Redstone 2, where support was iffy, and earlier versions where it was just unsupported); it’s that Resolve can’t do it at all, because Resolve uses OpenGL as its graphics API (I checked using RenderDoc) and HDR+OpenGL just ain’t possible on Windows. For full support, they would need to rewrite their render pipeline using DirectX, since the other possibility, HDR+Vulkan, works but is a bit fragile. I know because we have been shipping games that support both HDR and SDR on Windows for years, and we even support dynamically switching between modes when the window is moved from an HDR monitor to an SDR monitor, or when the setting in the control panel is toggled while the application is running (and I’m the one who did the code for that).

Of course, it could be worked around if you have a programmer on hand who is willing to implement a custom OpenGL32.dll for you that uses DX for swapchain creation and Google ANGLE as a GL-to-DX translation layer. That wrapper would also pass a fake SDR backbuffer to Resolve and use a custom shader for converting non-PQ data to PQ when appropriate (this might need some reverse-engineering of Resolve). Be mindful that PQ being absolute is a pipe dream, though. I have two different HDR monitors with different specs next to each other, and it’s easy to see that manufacturers do whatever the heck pleases them with PQ data. Again, this ain’t a Windows problem: we were able to reproduce it with content that came from an external source.
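
For context, this is roughly the DXGI surface such a DirectX route has to touch. It is a minimal sketch only (device, window and error handling left out), not how Resolve or any particular engine actually does it:

```cpp
#include <dxgi1_4.h>   // IDXGISwapChain3 and the DXGI color space enums

// Ask DXGI whether the current output can present HDR10 and, if so, tag the
// swapchain so its contents are interpreted as PQ-encoded Rec.2020 instead of SDR.
// The swapchain backbuffer is assumed to already be DXGI_FORMAT_R10G10B10A2_UNORM.
bool EnableHdr10(IDXGISwapChain3* swapChain)
{
    const DXGI_COLOR_SPACE_TYPE hdr10 = DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020;

    UINT support = 0;
    if (FAILED(swapChain->CheckColorSpaceSupport(hdr10, &support)) ||
        !(support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT))
        return false;  // the output/driver cannot present PQ/Rec.2020 right now

    return SUCCEEDED(swapChain->SetColorSpace1(hdr10));
}
```

The scRGB route would instead use an FP16 backbuffer tagged with DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709 and leave the PQ encode to the OS/driver.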


UI in HDR mode in Windows is upconverted SDR UI. It works like so: sRGB_to_Linear(uiColor) -> Rec709_to_Rec2020(linearColor) * hdrScaleFactor -> Linear_to_PQ(boostedRec2020Color)

A simplified version that uses MS scRGB would skip the conversion to Rec.2020 before applying the HDR boost, and would then divide the linear color by 80 and finally scan out the linear buffer. Maybe that’s what Windows is doing for the UI. It’s one or the other, since 16-bit linear scRGB and 10-bit Rec.2020 PQ (HDR10) are the only two supported HDR code paths on Windows.
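
A minimal C++ sketch of that first chain, with the Rec.709-to-Rec.2020 matrix and PQ constants taken from the ITU-R BT.2087 and SMPTE ST 2084 documents; the 200-nit UI level only stands in for hdrScaleFactor and is an assumption, not what Windows actually uses:

```cpp
#include <cmath>

struct RGB { double r, g, b; };

// sRGB decode, per channel (IEC 61966-2-1).
double srgbToLinear(double c)
{
    return c <= 0.04045 ? c / 12.92 : std::pow((c + 0.055) / 1.055, 2.4);
}

// Linear BT.709 -> linear BT.2020 primaries (matrix from ITU-R BT.2087).
RGB rec709ToRec2020(RGB c)
{
    return { 0.6274 * c.r + 0.3293 * c.g + 0.0433 * c.b,
             0.0691 * c.r + 0.9195 * c.g + 0.0114 * c.b,
             0.0164 * c.r + 0.0880 * c.g + 0.8956 * c.b };
}

// Absolute nits -> PQ signal (SMPTE ST 2084 inverse EOTF).
double linearToPq(double nits)
{
    const double m1 = 2610.0 / 16384.0, m2 = 2523.0 / 32.0;
    const double c1 = 3424.0 / 4096.0,  c2 = 2413.0 / 128.0, c3 = 2392.0 / 128.0;
    const double y = std::pow(nits / 10000.0, m1);
    return std::pow((c1 + c2 * y) / (1.0 + c3 * y), m2);
}

// sRGB UI color -> HDR10 code value: decode, widen, boost to the UI's
// paper-white level, then PQ-encode.
RGB sdrUiToHdr10(RGB srgb, double uiWhiteNits = 200.0)  // uiWhiteNits ~ hdrScaleFactor
{
    const RGB lin  = { srgbToLinear(srgb.r), srgbToLinear(srgb.g), srgbToLinear(srgb.b) };
    const RGB wide = rec709ToRec2020(lin);
    return { linearToPq(wide.r * uiWhiteNits),
             linearToPq(wide.g * uiWhiteNits),
             linearToPq(wide.b * uiWhiteNits) };
}
```

The scRGB variant from the second paragraph would keep the boost but skip the matrix and the PQ encode, dividing the boosted linear color by 80 instead (scRGB’s 1.0 = 80 nits).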


Thanks a lot for the clarification!
I was actually able to do the setup with a DeckLink Studio 4K and the Eizo monitor, just as you mentioned before, but we may need to consider the HDMI splitter at some point. As we are doing grading for video games, we may need to see the output at the same time both on a consumer TV and on our grading monitor.
So far we have most of the pieces for the setup; we are still waiting for the ProArt UCG to become available in Japan (been waiting for it the whole year!).
As always, great pieces of advice, thanks a lot!

This is pure gold Jean, thanks a lot for the very detailed explanation.
So far I was just left with the idea that Resolve’s viewer was part of the UI and would thus always render in SDR, but I didn’t know all the details of why the HDR output wouldn’t be possible. The thorough explanation of how this could be solved is immensely appreciated.
On a very curious side note, which games have that dynamic switching? I would love to try that out! And your HDR support in windowed mode, which version of DX is that in? 12?
Cheers!

Hi @zeekindustries. Greetings from a fellow game developer. Have you considered the PA32UCX instead of the UCG? It has a bit of a green cast in HDR mode, which you’ll have to get rid of manually through the limited adjustments that aren’t locked in the OSD when running the monitor in HDR, but otherwise it’s great, and it’s available now, unlike the UCG. It unfortunately can’t use calibration profiles in HDR mode due to a firmware limitation. Maybe ASUS will fix that in the future.

I don’t know which game engine you use, but if it’s Unreal or Unity, they have this all figured out already (kind of; there is room for improvement). If it’s an in-house engine and you or someone from your company has questions about how we got HDR to work properly, it ain’t a company secret, so feel free to ask.
