Get a proper HDR preview that matches playout of the Rec.2020 ST 2084 1000-nit ODT on a reference monitor, but in a GUI app like Affinity Photo. On Windows this would probably be limited to Affinity Photo, as that is pretty much the only app I have found that can show HDR in the GUI with OCIO support for now.
Right now Affinity seems to do it by applying an scRGB (extended sRGB?) transform to the scene-linear source values; this then gets sent to the monitor and gives you an HDR preview. I might be wrong about that, but it looks great.
Now what I would love is to match the look of the PQ ACES ODT, or any other HDR transform (say, ARRI Reveal), so I can monitor in HDR with the creative DRT of my choosing in a GUI.
I know this is a weird edge case for regular VFX, but what I am looking at is workflows for doing graphics for HDR broadcast / live-TV productions, transitioning people from Photoshop to working in HDR.
Now what I think I need is to convert my ACEScg to PQ using a display transform and then somehow reverse that display encoding into float scRGB values so that the HDR preview works again?
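If that mental model is right, the last step would be: undo just the PQ encode (the ST 2084 EOTF), convert the resulting linear Rec.2020 light to Rec.709 primaries, and scale it so that 1.0 means 80 nits, which is scRGB’s reference white. A minimal sketch of that step, assuming the ODT output is PQ-encoded Rec.2020 code values in [0, 1] — the constants and matrix are the standard published ones, the function names are just mine:

```cpp
#include <algorithm>
#include <cmath>

// ST 2084 (PQ) EOTF: code value in [0, 1] -> absolute luminance in nits.
float pq_to_nits(float e) {
    const float m1 = 2610.0f / 16384.0f;
    const float m2 = 2523.0f / 4096.0f * 128.0f;
    const float c1 = 3424.0f / 4096.0f;
    const float c2 = 2413.0f / 4096.0f * 32.0f;
    const float c3 = 2392.0f / 4096.0f * 32.0f;
    float p = std::pow(std::max(e, 0.0f), 1.0f / m2);
    return 10000.0f * std::pow(std::max(p - c1, 0.0f) / (c2 - c3 * p), 1.0f / m1);
}

struct RGB { float r, g, b; };

// PQ-encoded Rec.2020 -> scRGB (linear Rec.709 primaries, 1.0 == 80 nits).
// Out-of-gamut Rec.2020 colors produce negative components, which is
// exactly what scRGB expects.
RGB pq2020_to_scrgb(RGB in) {
    // Decode each channel to absolute linear light (nits).
    float r = pq_to_nits(in.r), g = pq_to_nits(in.g), b = pq_to_nits(in.b);
    // Rec.2020 -> Rec.709 primaries (BT.2087 matrix), then scale to scRGB.
    return RGB{
        ( 1.6605f * r - 0.5876f * g - 0.0728f * b) / 80.0f,
        (-0.1246f * r + 1.1329f * g - 0.0083f * b) / 80.0f,
        (-0.0182f * r - 0.1006f * g + 1.1187f * b) / 80.0f,
    };
}
```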
I think Nuke on macOS uses a similar approach, with its float buffer hooking into Apple’s EDR system.
Would love to get some more info if someone knows how that could work.
DirectX does support the HDR10 format. You have to have an application that creates a window and a DirectX swapchain in the proper format (RGB10A2 pixel format with the Rec.2020 color space and ST.2084 EOTF). Once that is done, Windows will display whatever is in that window as HDR10 data. There is a D3D12 sample on Microsoft’s GitHub, but it can also be done with D3D11 (as long as you use Windows 10 Redstone 3 or later).
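For reference, the tagging part looks roughly like this — a sketch, not the full Microsoft sample (device and window setup omitted), though the format, enum, and methods are the real DXGI ones:

```cpp
#include <dxgi1_6.h>

// Given a flip-model swapchain created with DXGI_FORMAT_R10G10B10A2_UNORM,
// tag its buffers as HDR10: Rec.2020 primaries + ST.2084 (PQ) EOTF.
// Windows then displays the window contents as HDR10 data.
HRESULT enable_hdr10(IDXGISwapChain3* swapchain) {
    const DXGI_COLOR_SPACE_TYPE hdr10 =
        DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020;
    UINT support = 0;
    HRESULT hr = swapchain->CheckColorSpaceSupport(hdr10, &support);
    if (FAILED(hr) ||
        !(support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT))
        return E_FAIL;  // the output/driver can't present HDR10 from this chain
    return swapchain->SetColorSpace1(hdr10);
}
```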
The reason Unreal Engine uses a modified output transform that doesn’t apply the EOTF is that they have other operations in their post-processing pipeline that apply after tonemapping but before the EOTF.
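In other words: tonemap, run the remaining post-processing on the tonemapped values, and only apply the ST.2084 inverse EOTF as the very last step before the swapchain. The encode itself is small; a sketch with the standard constants:

```cpp
#include <algorithm>
#include <cmath>

// ST 2084 inverse EOTF: absolute luminance in nits -> PQ code value in [0, 1].
// In a pipeline like the one described above, this runs after all other
// post-processing, right before presenting.
float nits_to_pq(float nits) {
    const float m1 = 2610.0f / 16384.0f;
    const float m2 = 2523.0f / 4096.0f * 128.0f;
    const float c1 = 3424.0f / 4096.0f;
    const float c2 = 2413.0f / 4096.0f * 32.0f;
    const float c3 = 2392.0f / 4096.0f * 32.0f;
    float y  = std::clamp(nits / 10000.0f, 0.0f, 1.0f);
    float ym = std::pow(y, m1);
    return std::pow((c1 + c2 * ym) / (1.0f + c3 * ym), m2);
}
```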
I don’t like scRGB much. It requires 64-bit floating-point render targets (FP16 per channel), so it costs twice the memory bandwidth of HDR10 and is correspondingly slower, and it is specified using sRGB as a base, with 1.0 representing 80 nits and the Rec.709 color primaries. Because of that, you always have to use values above 1.0 (even for normal SDR brightness, since nobody sets their monitor to a maximum of 80 nits), and you have to use negative values to reach colors outside the Rec.709 gamut.
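To put rough numbers on that (mine, just illustrative): “paper white” at 200 nits is 200 / 80 = 2.5 in scRGB, already well above 1.0, and a fully saturated Rec.2020 green pushed through the standard BT.2087 Rec.2020 → Rec.709 matrix comes out at roughly (−0.59, 1.13, −0.10): negative red and blue, because the color sits outside the Rec.709 gamut.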
Funnily enough, some Nvidia GPUs prefer this encoding to HDR10 encoding and might do a double-conversion when using HDR10 (decode PQ → re-encode to scRGB → GPU does internal processing → encode back to PQ because the monitor expects PQ). I have no idea if it is fixed on the latest Nvidia architectures. As far as I’m aware, AMD GPUs do not have this issue (I don’t know if they have the reverse issue with scRGB).
And yes, technically, 16-bit FP has better precision than 10-bit PQ, but I still don’t like it for all the other reasons I mentioned earlier.
My two cents,
Jean-Michel
P.S.: As a video game developer, ReShade is a sensitive subject: we sometimes get bug reports from users that we can never reproduce, and then we find out they were using ReShade to hook into the rendering of our game, so we have to close their issues. This is unfortunately standard procedure when dealing with bugs potentially caused by foreign code that hooks into our own.