Use same LUT for HDR/SDR

Hey everyone!

Just wanted to ask if anyone has opinions, tips, or workflows for how to approach this.
I am grading footage that comes in as ACEScc; I do the color grading for HDR and bake out a LUT.
This LUT is then applied to our ACES pipeline within a game engine, but when we compare the HDR and SDR versions, the SDR version looks badly blown out when we apply it.

So I guess this “HDR” LUT from DaVinci will need to be somehow reformatted or reinterpreted as an SDR LUT, right? Assuming, of course, that I want to use the same LUT file.

Thanks a lot ACES wizards!

Just a few tidbits.

Firstly, you should never receive ACEScc as it is not an interchange format. Data should be interchanged as ACES 2065-1 encoded data in an SMPTE 2065-4 container.
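If it helps, here is a minimal sketch of that conversion, assuming NumPy, the ACEScc decode from S-2014-003, and the AP1-to-AP0 matrix from the ACES reference CTL; the function names and array shapes are just for illustration:

```python
# A minimal sketch: ACEScc code values -> ACES 2065-1 (linear, AP0).
import numpy as np

# AP1 -> AP0 primary conversion matrix (ACES reference CTL).
AP1_TO_AP0 = np.array([
    [ 0.6954522414, 0.1406786965, 0.1638690622],
    [ 0.0447945634, 0.8596711185, 0.0955343182],
    [-0.0055258826, 0.0040252103, 1.0015006723],
])

def acescc_to_linear_ap1(cc: np.ndarray) -> np.ndarray:
    """Invert the ACEScc log encoding (S-2014-003) to linear AP1."""
    lin = np.where(
        cc < (9.72 - 15.0) / 17.52,
        (2.0 ** (cc * 17.52 - 9.72) - 2.0 ** -16) * 2.0,  # low segment
        2.0 ** (cc * 17.52 - 9.72),                       # log segment
    )
    # Values above the encodable range clip to half-float max.
    return np.minimum(lin, 65504.0)

def acescc_to_aces2065_1(cc_rgb: np.ndarray) -> np.ndarray:
    """ACEScc RGB (..., 3) -> ACES 2065-1 RGB (..., 3)."""
    return acescc_to_linear_ap1(cc_rgb) @ AP1_TO_AP0.T

# Mid-gray encoded in ACEScc (~0.4135) should come back as ~0.18.
print(acescc_to_aces2065_1(np.array([[0.4135, 0.4135, 0.4135]])))
```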

A gray card in the scene should have an ACES 2065-1 value of about 0.18, regardless of output device. HDR displays can cause the colorist to drive this value much higher. If they do, the content may look blown out in SDR.
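As a quick sanity check on that 0.18 figure: the ACEScc encode (S-2014-003) puts an 18% gray card at roughly code value 0.41, which is a handy probe value when scoping the working image. A tiny sketch, with the function name being only illustrative:

```python
# Where should an 18% gray card sit in ACEScc code values?
import math

def linear_to_acescc(lin: float) -> float:
    """ACEScc encode per S-2014-003 (scalar sketch)."""
    if lin <= 0.0:
        return (math.log2(2.0 ** -16) + 9.72) / 17.52
    if lin < 2.0 ** -15:
        return (math.log2(2.0 ** -16 + lin * 0.5) + 9.72) / 17.52
    return (math.log2(lin) + 9.72) / 17.52

print(linear_to_acescc(0.18))  # ~0.4135
```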

The ACES 1.3 transforms allow for shifting the output luminance of HDR source content on an HDR display. This lets your colorist get a brighter image without forcing the gray card much higher than 0.18 in ACES 2065-1. Then, when you switch to an SDR output transform, the content will look much more appropriate in SDR.
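To see the HDR/SDR difference concretely, a hedged PyOpenColorIO sketch: it pushes the same ACES 2065-1 value through every display/view pair in the built-in ACES studio config. This assumes OCIO 2.3+ with its built-in config URIs; display and view names vary per config, so they are discovered at runtime rather than hard-coded:

```python
# Push one ACES 2065-1 pixel through each available output transform.
import PyOpenColorIO as ocio

config = ocio.Config.CreateFromFile("ocio://studio-config-latest")

for display in config.getDisplays():
    for view in config.getViews(display):
        proc = config.getProcessor("ACES2065-1", display, view,
                                   ocio.TRANSFORM_DIR_FORWARD)
        cpu = proc.getDefaultCPUProcessor()
        # A bright graded value: HDR views leave headroom for it,
        # while an SDR view squashes it toward the top of the curve.
        rgb = cpu.applyRGB([1.0, 1.0, 1.0])
        print(f"{display} / {view}: {rgb}")
```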

Sorry if I’m reading between the lines here a little too much, but this sounds like a familiar issue.


Hey Alex!

Thanks a lot for taking a look at this.
To give a little bit of context, this is a very particular setup, as we are color grading the capture from a game engine.
The setup is a little bit like this:

  • The game runs in ACEScc and its output is fed to a DeckLink capture card.
  • In DaVinci, the project is set to ACEScc/ST 2084 1000 nits, so when we use Resolve Live, the feed from the card is already in what Resolve expects (we cannot change the IDT for the live feed in Resolve, which is why the game already outputs ACEScc).
  • While running the game we do our color correction for HDR and capture a still so that we can export a LUT from it.

In our game we have an ACES/OCIO implementation; we feed in the LUT and apply a shaper so that we match what we did in Resolve.
So far so good: the HDR output using the LUT shows only a minimal difference from the Resolve session, but if we use the same LUT with the OCIO view set to sRGB/Rec.709, it looks very blown out.
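For anyone following along, here is a rough NumPy/SciPy sketch of that shaper-plus-LUT step, under some assumptions: the working values are linear AP1, the shaper is the ACEScc encode, and the LUT is an N×N×N×3 grid parsed from a .cube file (where the red index varies fastest, hence the [b, g, r] index order). `apply_shaped_lut` and the identity-LUT smoke test are purely illustrative:

```python
# Shaper + 3D LUT application, roughly as an engine would do it.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def linear_to_acescc(lin):
    """ACEScc encode (S-2014-003), vectorized."""
    lin = np.asarray(lin, dtype=np.float64)
    return np.where(
        lin < 2.0 ** -15,
        (np.log2(2.0 ** -16 + np.maximum(lin, 0.0) * 0.5) + 9.72) / 17.52,
        (np.log2(np.maximum(lin, 2.0 ** -15)) + 9.72) / 17.52,
    )

def apply_shaped_lut(lin_ap1: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """lin_ap1: (..., 3) linear AP1; lut: (N, N, N, 3) indexed [b, g, r]."""
    n = lut.shape[0]
    axis = np.linspace(0.0, 1.0, n)
    interp = RegularGridInterpolator((axis, axis, axis), lut)
    shaped = np.clip(linear_to_acescc(lin_ap1), 0.0, 1.0)  # shaper + trim
    # Reorder (r, g, b) lookups to match the LUT's [b, g, r] index order.
    return interp(shaped[..., ::-1])

# Identity LUT as a smoke test: output should equal the shaped input.
n = 33
b, g, r = np.meshgrid(*([np.linspace(0, 1, n)] * 3), indexing="ij")
identity = np.stack([r, g, b], axis=-1)
print(apply_shaped_lut(np.array([[0.18, 0.18, 0.18]]), identity))  # ~0.4135
```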

Hope this helps give a bit of context about what's going on. And yeah, I'm aware that ACEScc should never be an interchange format, as it defeats the purpose of having 2065-1 as the main container, but I guess I don't have many options for this setup at the moment ._.


ACEScc is floating point and extends outside the 0-1 range, so I would expect severe clipping of highlights if you try to capture it over DeckLink SDI. Is your game engine clamping to 0-1 intentionally? It sounds like you might want to use ACESproxy for this.
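For context, the 10-bit ACESproxy encode (S-2013-001) is designed for exactly this kind of SDI transport: it maps a wide linear range into legal integer code values instead of relying on a 0-1 float signal. A small sketch, with the function name being only illustrative:

```python
# 10-bit ACESproxy encode per S-2013-001.
import math

def linear_to_acesproxy10(lin: float) -> int:
    """Linear AP1 -> 10-bit ACESproxy code value."""
    if lin <= 2.0 ** -9.72:
        return 64                         # floor of the legal range
    cv = (math.log2(lin) + 2.5) * 50.0 + 425.0
    return int(round(min(max(cv, 64.0), 940.0)))

# Mid-gray lands mid-range; values far above 1.0 linear still fit.
for lin in (0.18, 1.0, 8.0, 100.0):
    print(lin, "->", linear_to_acesproxy10(lin))
```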


Just to add some context: most ungraded game content lies between 0 and 192 in linear range (between 0…1 in ACEScc) after exposure. The important part of @zeekindustries's workflow is that he's feeding game content directly to Resolve while the game runs in Resolve Live mode, so it's either ACEScc or ACEScct (or a custom non-ACES setup in DaVinci YRGB Color Managed mode). This allows ACEScc LUTs trimmed to the 0…1 range to work properly.

As for using the same grading LUT in SDR and HDR: it's a jungle, and whichever one you do first will greatly affect the difficulty of the problem. However, since this is a workflow for video games, it's acceptable to hack it as badly as we want and to send a signal that is out of range. I'm going to send you an HLSL snippet as a starting debug point, @zeekindustries.
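As a quick check on that range claim, decoding the ACEScc 0…1 endpoints (S-2014-003) shows how much linear headroom a 0…1-trimmed LUT actually covers, which is roughly consistent with the 0-to-192 figure above:

```python
# Decode the ACEScc 0..1 endpoints to linear.
import math

def acescc_to_linear(cc: float) -> float:
    if cc < (9.72 - 15.0) / 17.52:
        return (2.0 ** (cc * 17.52 - 9.72) - 2.0 ** -16) * 2.0
    return 2.0 ** (cc * 17.52 - 9.72)

print(acescc_to_linear(0.0))  # ~0.0012 linear
print(acescc_to_linear(1.0))  # ~222.9 linear
```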
