SSTS in UE4

Hi everyone,

We’re interested in integrating the ACES SSTS transform into Unreal Engine 4 for our project (video game).
The reasoning is that a parametric tone curve can be adapted to display metadata retrieved from the system/console API, as described in “Best Practice Recommendations for Game HDR Creation” (https://www.hgig.org/doc/ForBetterHDRGaming.pdf).
Currently, UE4 only implements the fixed 1000 and 2000 nit ODTs, which don’t allow any customization.

I took a first stab at implementing this in a little hobby project, comparing the SSTS transform to the fixed 2000 nit RRT & ODT transform UE4 uses, and noticed that using the default parameters from RRTODT.Academy.Rec2020_1000nits_15nits_ST2084 doesn’t exactly match the RRT+ODT_2000nits transform.
Is it supposed to match exactly? In particular, the min black is very different.
Here’s a capture from a plot I made trying to match the SSTS curve (Red) with the UE4 RRT+ODT (Green). I’m well aware I’m not following the “exact science” here, and I apologize as I’m far from an expert in this area.
I attempted to match @sdyer’s wonderful Python plot using the same axis mapping (ODT B-spline in python and Interactive Plot - #2 by Thomas_Mansencal).

The curve looked “close enough”, so I went ahead and integrated it into UE4, but the result looks very desaturated and dim compared to the fixed 2000 nit RRT & ODT. That surprised me, as the curve I made previously appeared to match the 2000 nit ODT quite well.
I’ve converted the CTL code to HLSL and stripped it down to the parts essential for the SSTS transform to work; however, I’m not entirely sure whether the part that comes after the SSTS mapping is required.

What I’ve done is return the “rgbPre” value directly after the SSTS, because the EOTF (ST.2084) is already applied outside this function in UE4, and I wasn’t sure whether any of the operations that come after it are required.
I’ve put the code in a gist (UE4 - ACES SSTS · GitHub). All code up to line 1161 is unmodified, straight out of UE4; the integration of the SSTS transform starts from there.
I’d love to understand the mistakes I’ve made here, as I’m very new to this and troubleshooting it is proving rather challenging.

Any advice or further reading resources would be hugely appreciated.

Thanks,
Simon


Bear in mind that the rgbPre at that point is linear AP1 in absolute nits, with equal values representing ACES white. I am not familiar enough with UE4 to know what the display encoding portion of the code is expecting as input. Because AP1 is very similar to Rec.2020, simply applying an ST.2084 inverse EOTF which takes absolute nits as input should get you a pretty close match to the expected result. The actual Y_2_linCV function incorporates the black point, but since for HDR Output Transforms that is 0.0001 nits, its omission will not have a significant impact.
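For reference, those scaling functions from the CTL are trivial to port. As a minimal HLSL sketch (Ymin and Ymax being the min and max luminance of your SSTS parameters, e.g. 0.0001 and 1000 nits for a 1000 nit output):

// Scale absolute luminance (nits) to a normalised linear code value,
// removing the black point; mirrors Y_2_linCV from the ACES CTL.
float Y_2_linCV( float Y, float Ymax, float Ymin )
{
    return ( Y - Ymin ) / ( Ymax - Ymin );
}

// Inverse: linear code value back to absolute luminance in nits.
float linCV_2_Y( float linCV, float Ymax, float Ymin )
{
    return linCV * ( Ymax - Ymin ) + Ymin;
}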

The actual 3x3 matrix from AP1 to Rec.2020 with Bradford chromatic adaptation to D65 is:

1.02582475 -0.02005319 -0.00577156
-0.00223437 1.00458650 -0.00235213
-0.00501335 -0.02529007 1.03030342

As you can see, it is pretty close to an identity matrix, so again its omission would not have a large impact.
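If you do want to include it, here is a sketch of how it could slot into your HLSL, using the mul( matrix, vector ) convention of the UE4 ACES shader code (the matrix name is just illustrative, and do double-check the row/column order against your own conventions):

// AP1 -> Rec.2020 with Bradford CAT from D60 to D65, values as above
static const float3x3 AP1_2_Rec2020_D65_MAT =
{
     1.02582475, -0.02005319, -0.00577156,
    -0.00223437,  1.00458650, -0.00235213,
    -0.00501335, -0.02529007,  1.03030342
};

float3 rec2020 = mul( AP1_2_Rec2020_D65_MAT, ap1 ); // still absolute nits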

So this is pure speculation, but the fact that your result is darker and less saturated than expected suggests that the UE4 display encoder is not just expecting Rec.2020 linear in absolute nits as input. Could it be expecting sRGB primaries? And is there a normalisation gain factor of some kind?

Hi Nick,
Thanks for your insightful reply, I really appreciate it.
So, to give you the full picture of how UE4 uses this output transform: the input is scene-referred linear, converted to AP0; after the SSTS transform is applied (producing rgbPre), the result is converted to the output gamut, which in my case is Rec.2020, and then the ST.2084/PQ transfer function is applied.

float3 ACES_SSTS( float3 SceneReferredLinearsRGBColor )
{
    // Scene-referred linear sRGB -> ACES AP0 (with D65 -> D60 CAT)
    const float3x3 sRGB_2_AP0 = mul( XYZ_2_AP0_MAT, mul( D65_2_D60_CAT, sRGB_2_XYZ_MAT ) );
    float3 aces = mul( sRGB_2_AP0, SceneReferredLinearsRGBColor * 1.5 ); // Not sure what this "* 1.5" is for

    float3 ODTColor = RRTODT( aces ); // Returns rgbPre

    // Convert from AP1 to the specified output gamut
    ODTColor = mul( AP1_2_Rec2020, ODTColor );

    // Apply conversion to ST-2084 (Dolby PQ)
    float3 OutDeviceColor = LinearToST2084( ODTColor );
    return OutDeviceColor;
}

I’m curious whether the logic above is correct, or whether I’m simply looking in the wrong place and missed something elsewhere.

The actual Y_2_linCV function incorporates the black point

Do you mean that I’m incorrectly omitting this logic?

Thanks again,
Simon

I think somebody with more experience of UE4 needs to respond, I’m afraid. The 1.5 gain is clearly already diverging from the standard ACES rendering, so I don’t know what else may be done slightly differently too.

For a strict match you should include all the steps, including the Y_2_linCV. But it depends what your use case is. As I said, its effect is negligible.

We’ll have to reach out to Epic then to get some more context.
Thank you for your help, Nick.

It may also be worth noting that @sdyer’s interactive plot that you linked to was an early experiment on the path to the SSTS. Is it possible that changes between the experiment and the final SSTS are contributing to the difference you are seeing?

You’re right that it may explain the discrepancy.
I simply based my axis remapping on that interactive plot to somewhat validate my initial implementation. But the actual output transform is based on the final SSTS.

Hey Simon

Not sure if it’s relevant to your specific situation, but you could use the OCIO implementation in UE4, which will let you use any ODT you want in Unreal. It doesn’t let you choose an RRT, however; have a look at the docs, maybe it could bring you further in your quest.

docs.unrealengine.com/4.26/en-US/WorkingWithMedia/OpenColorIO/

Don’t you mean rgbPost? You should normally be returning rgbPost after applying ssts_f3 to rgbPre. Also, if you really want to match what UE4 does for the fixed 1.0 transforms, you should also add the inverse blue highlight fix it has in the middle before returning.
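In HLSL, assuming your port keeps the CTL naming (ssts_f3, with Params standing in for whatever TsParams-style struct you build from your min/mid/max points), the ending would look something like:

// Apply the tonescale to rgbPre (linear AP1); rgbPost is the tone-mapped
// value in absolute nits and is what should be passed on.
float3 rgbPost = ssts_f3( rgbPre, Params );
return rgbPost; // not rgbPre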

Hope that helps,
Jean-Michel

Apologies, yes you’re right I meant rgbPost.
And thanks for that tip.

I’ve managed to get it all working. It turns out I had overlooked an implementation detail in the D3D12 RHI that caused the display to expect linear RGB (scRGB) instead of ST.2084 output. Thank you @nick for that clue.
I appreciate all the responses here and I hope to be able to share the implementation for those interested once it is battle-tested.

Ah yeah. You have to implement both scRGB and ST.2084 codepaths indeed. There’s probably an option somewhere in UE4 to forbid it from using scRGB, as it is bloody inefficient (a 64-bit framebuffer instead of 32-bit), but maybe not. Nvidia drivers do like it better though, and will do one less colourspace conversion with it. On the AMD side, they prefer ST.2084, so pick your poison: whatever you do, it’s going to be sub-optimal for one type of GPU or another. In our custom engine, we only support ST.2084 because we implemented HDR for consoles first.
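For anyone following along, here is a rough sketch of what the two codepaths boil down to at the very end of the output transform, assuming ODTColor is Rec.2020 linear in absolute nits and that you have a Rec.2020-to-sRGB/709 matrix at hand (Rec2020_2_sRGB_MAT is just an illustrative name):

// HDR10 path: Rec.2020 primaries, PQ (ST.2084) encoded, 10-bit swap chain.
float3 EncodeHDR10( float3 Rec2020Nits )
{
    return LinearToST2084( Rec2020Nits );
}

// scRGB path: Rec.709/sRGB primaries, linear, FP16 swap chain,
// where 1.0 corresponds to 80 nits.
float3 EncodeScRGB( float3 Rec2020Nits )
{
    float3 Rec709Nits = mul( Rec2020_2_sRGB_MAT, Rec2020Nits );
    return Rec709Nits / 80.0;
}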
