ACES 2 in Game Engines: Variable reference/max luminance

since 13.8 * 1.449 = ~20

Hmm. Again, I’m not sure that I want HDR to match mid grey of SDR exactly; this seems against the intended design of ACES. Instead, I want the experience of switching between SDR mode and HDR mode on a consumer TV using Windows to be similar to using a 100 nit SDR reference monitor and a (???) nit HDR reference monitor.

Are you saying that the system in fact displays PQ encoded material at the intended absolute nit value

Yes, I believe this is correct, even though the system displays SDR content at a variable nit value based on the “SDR reference white level”, which might be around 200 nits depending on the display that is connected to the computer.

and for displaying SDR with a peak brightness of 200 nits you, the developer, need to take the 100 nit SDR value, linearise it, double the linear value and then PQ encode it?

If I wanted to display SDR content in my HDR game, then yes: I would need to take a [0, 1] range sRGB-encoded SDR value, linearise it, multiply the linear value by the “SDR reference white level” to convert it to nits, and then PQ encode it.
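For concreteness, here is a minimal per-channel sketch of that SDR-in-HDR path as I understand it, written in C++. It ignores the Rec.709 to Rec.2020 primary conversion that a real HDR10 swap chain would also need, and the function names are just placeholders of mine:

#include <algorithm>
#include <cmath>

// sRGB piecewise EOTF: [0, 1] encoded SDR value -> [0, 1] linear value.
float srgb_eotf(float v)
{
    return (v <= 0.04045f) ? v / 12.92f
                           : std::pow((v + 0.055f) / 1.055f, 2.4f);
}

// SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> [0, 1] signal.
float pq_inverse_eotf(float nits)
{
    const float m1 = 2610.0f / 16384.0f;
    const float m2 = 2523.0f / 4096.0f * 128.0f;
    const float c1 = 3424.0f / 4096.0f;
    const float c2 = 2413.0f / 4096.0f * 32.0f;
    const float c3 = 2392.0f / 4096.0f * 32.0f;
    const float y  = std::clamp(nits / 10000.0f, 0.0f, 1.0f);
    const float ym = std::pow(y, m1);
    return std::pow((c1 + c2 * ym) / (1.0f + c3 * ym), m2);
}

// [0, 1] sRGB-encoded SDR value -> PQ signal at the same brightness that
// Windows would use when compositing the SDR desktop into an HDR signal.
float sdr_to_pq(float sdr_encoded, float sdr_reference_white_level /* nits */)
{
    const float linear = srgb_eotf(sdr_encoded);              // remove the sRGB curve
    const float nits   = linear * sdr_reference_white_level;  // e.g. 1.0 -> ~200 nits
    return pq_inverse_eotf(nits);
}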

So my goal is not to match SDR grey to HDR grey. Instead, because Windows produces a PQ signal for SDR content with a variable peak nit value (not fixed at 100 nits, but based on the display), I want my HDR content to respect this variable “reference” nit value in the same way. I believe this would give me a final experience similar to comparing an SDR reference monitor fixed at 100 nits with an HDR reference monitor that is also fixed at a specific nit value.

Now that I’ve had the chance to discuss this, it sounds like this goal is actually quite simple to achieve with ACES. I believe it would be as follows:

// Provided by the operating system as described here:
// https://learn.microsoft.com/en-us/windows/win32/api/wingdi/ns-wingdi-displayconfig_sdr_white_level
// Optionally controlled by the player through game settings.
sdr_reference_white_level = ??? // in nits

// Provided by the operating system as described here:
// https://learn.microsoft.com/en-us/windows/win32/api/dxgi1_6/ns-dxgi1_6-dxgi_output_desc1
// Optionally controlled by the player through game settings.
max_luminance = ??? // in nits

// This new constant simply represents that the video game uses
// Output.Academy.Rec709-D65_100nit_in_Rec709-D65_sRGB-Piecewise.ctl
// for SDR output.
aces_sdr_reference_white_level = 100 // in nits

// The following are configuration constants in the HDR ODT:
linear_scale_factor = sdr_reference_white_level / aces_sdr_reference_white_level
peakLuminance = max_luminance / linear_scale_factor
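As a side note, here is a sketch of how those two OS-provided values might actually be queried on Windows through the APIs linked in the comments above. The adapterId/targetId pair for the SDR white level query comes from QueryDisplayConfig, which I have omitted for brevity, and the fallback values are just assumptions of mine:

#include <windows.h>
#include <dxgi1_6.h>

// max_luminance in nits, from DXGI_OUTPUT_DESC1::MaxLuminance.
float query_max_luminance(IDXGIOutput* output)
{
    float max_luminance = 1000.0f; // assumed fallback if the query fails
    IDXGIOutput6* output6 = nullptr;
    if (SUCCEEDED(output->QueryInterface(IID_PPV_ARGS(&output6))))
    {
        DXGI_OUTPUT_DESC1 desc = {};
        if (SUCCEEDED(output6->GetDesc1(&desc)))
            max_luminance = desc.MaxLuminance; // in nits
        output6->Release();
    }
    return max_luminance;
}

// sdr_reference_white_level in nits, from DISPLAYCONFIG_SDR_WHITE_LEVEL.
// SDRWhiteLevel is expressed as a multiplier of 80 nits, scaled by 1000.
float query_sdr_reference_white_level(LUID adapterId, UINT32 targetId)
{
    DISPLAYCONFIG_SDR_WHITE_LEVEL white_level = {};
    white_level.header.type      = DISPLAYCONFIG_DEVICE_INFO_GET_SDR_WHITE_LEVEL;
    white_level.header.size      = sizeof(white_level);
    white_level.header.adapterId = adapterId;
    white_level.header.id        = targetId;
    if (DisplayConfigGetDeviceInfo(&white_level.header) == ERROR_SUCCESS)
        return white_level.SDRWhiteLevel * 80.0f / 1000.0f; // e.g. 2500 -> 200 nits
    return 80.0f; // assumed fallback: nominal SDR reference white
}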

And, if I’m understanding correctly, this would give me a final output range of [0, max_luminance] nits, and the ACES behaviour between SDR and HDR would be correct, even though Windows automatically scales all SDR apps to sdr_reference_white_level when outputting HDR PQ.
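As a sanity check with numbers I have made up (SDR reference white at 200 nits, display max at 1000 nits), the arithmetic works out as follows:

#include <cstdio>

int main()
{
    // Example values only: a display reporting max_luminance = 1000 nits,
    // with the Windows SDR brightness set so that SDR white is 200 nits.
    const float sdr_reference_white_level      = 200.0f;
    const float max_luminance                  = 1000.0f;
    const float aces_sdr_reference_white_level = 100.0f;

    const float linear_scale_factor = sdr_reference_white_level / aces_sdr_reference_white_level; // 2.0
    const float peakLuminance       = max_luminance / linear_scale_factor;                        // 500 nits

    // The HDR Output Transform produces [0, peakLuminance] = [0, 500] nits,
    // and multiplying its output by linear_scale_factor before PQ encoding
    // stretches that back to [0, 1000] = [0, max_luminance].
    std::printf("linear_scale_factor = %.1f, peakLuminance = %.1f nits\n",
                linear_scale_factor, peakLuminance);
    return 0;
}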

Is this the recommended way to handle this behaviour in Windows and other operating systems that automatically scale SDR apps to higher-than-100-nits levels when outputting HDR PQ signals?