The future of working in VFX using HDR

Hey there, I have been looking at ways we could use HDR displays to enhance our work in the future. There are not yet any real standards for HDR display in compositing/lighting/texturing etc., so we can only look at ways this could work, and Apple has gone forward proposing their sRGB-Extended standard.

After some research I concluded that pretty much all ‘HDR’-capable devices from Apple are actually just that: extended sRGB, an sRGB curve that simply keeps going above 1, also called ‘sRGBf’ for float.
So when you play back PQ-encoded content, it is not in fact displaying absolute luminances the way PQ would ‘like’; it converts from the PQ EOTF back to this extended sRGB EOTF, and thus into a system that is relative to your display brightness. ColorSync on an ‘HDR’-capable device (or EDR, as Apple calls it) also shows the native display transfer curve as gamma 2.2.
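To make the ‘curve just keeps going above 1’ idea concrete, here is a minimal Python sketch of a mirrored, unclamped sRGB transfer pair. The exact curve Apple uses internally isn’t documented in this thread, so treat this as an illustration of the concept rather than their implementation:

```python
def srgb_encode_extended(linear):
    """Piecewise sRGB OETF, mirrored for negatives and not clamped at 1.0
    ("the curve just keeps going"), which is roughly what extended sRGB /
    'sRGBf' means."""
    sign = -1.0 if linear < 0 else 1.0
    x = abs(linear)
    if x <= 0.0031308:
        y = 12.92 * x
    else:
        y = 1.055 * x ** (1.0 / 2.4) - 0.055
    return sign * y

def srgb_decode_extended(encoded):
    """Inverse of the above: values above 1.0 decode to linear values above
    1.0 instead of clipping."""
    sign = -1.0 if encoded < 0 else 1.0
    y = abs(encoded)
    if y <= 0.04045:
        x = y / 12.92
    else:
        x = ((y + 0.055) / 1.055) ** 2.4
    return sign * x

# Linear 1.0 encodes to 1.0 as usual; linear 4.0 encodes to ~1.8, still a
# perfectly valid value in a float buffer.
for v in (0.18, 1.0, 1.4, 4.0):
    print(v, srgb_encode_extended(v))
```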

How exactly this whole process works I am not sure, but it does work rather well. I also don’t know how the reverse works internally: apparently when you grade something in FCPX and export PQ, it still looks the same on PQ monitors, so there is some smart Apple magic happening. If anyone knows more, I really want to know.
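One plausible model, going purely by the behaviour described above (PQ decoded to absolute nits, then mapped into a buffer that is relative to a chosen diffuse white), is sketched below; the 100-nit reference white is an assumption, not anything Apple documents:

```python
# Toy round trip following the model described above: PQ code values are
# absolute (nits), the EDR buffer is relative to whatever you call diffuse
# white. The 100-nit reference white is an assumption.

M1, M2 = 0.1593017578125, 78.84375          # ST 2084 constants
C1, C2, C3 = 0.8359375, 18.8515625, 18.6875

def pq_decode(n):
    """PQ signal [0..1] -> absolute luminance in nits (ST 2084 EOTF)."""
    p = n ** (1.0 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

def pq_encode(nits):
    """Absolute luminance in nits -> PQ signal [0..1] (inverse EOTF)."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

REFERENCE_WHITE_NITS = 100.0  # assumed SDR diffuse white

def pq_to_relative(signal):
    return pq_decode(signal) / REFERENCE_WHITE_NITS

def relative_to_pq(value):
    return pq_encode(value * REFERENCE_WHITE_NITS)

highlight = pq_encode(400.0)                 # a 400-nit highlight, ~0.65 PQ
relative = pq_to_relative(highlight)         # 4.0 in the relative buffer
restored = pq_decode(relative_to_pq(relative))
print(relative, restored)                    # ~4.0 and ~400 nits: the round trip holds
```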

This is already an experimental feature in Nuke 12.2, and it already works with a buffer of 0-1.4, at least on my particular 2018 MacBook Pro: I can see a difference between a constant set to 1 vs a constant set to 1.4 if my display brightness is not set to max. Reading Apple’s documentation on this, it should technically be possible to go even higher, so when my display brightness is all the way down it should use all the leftover nits to display values like 3 or even 4, depending on how bright the display goes.
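The arithmetic behind that is just a ratio; a rough sketch with made-up panel numbers:

```python
# Rough illustration of why the usable headroom grows as you turn the
# hardware brightness down; the 500-nit panel peak is a made-up number, only
# the ratio matters.

PANEL_PEAK_NITS = 500.0

def edr_headroom(sdr_white_nits):
    """Largest linear value that can still be shown 1:1 when SDR white is
    pinned at `sdr_white_nits` and the leftover panel range is used for EDR."""
    return PANEL_PEAK_NITS / sdr_white_nits

for sdr_white in (500.0, 350.0, 125.0):
    print(f"SDR white at {sdr_white:.0f} nits -> headroom {edr_headroom(sdr_white):.1f}x")
# 500 nits -> 1.0x (display at max brightness, no headroom),
# 350 nits -> ~1.4x, 125 nits -> 4.0x
```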

I have to say I sort of like this approach a bit better than dealing with PQ. For the future, in a professional setting one could set the diffuse white of their display to 100 nits, and then it would just display brighter values depending on how much the display can do; I even see a benefit of this at just 400 nits. It should make it easier to judge the brightness of things versus staying in the heavily tonemapped world where everything is dull and squeezed down.

This would obviously require new methods of tonemapping and other strategies. What do you think: how are we going to do this kind of work in the future? Is sRGB-extended a way to go? Will we have different tonemapping viewers depending on maximum display luminance? I can imagine that a 400-nit or 500-nit EDR tonemapper could work pretty well on current MacBooks if Foundry can increase the buffer past 2.
A value of 2 will display a 200-nit peak if you set your SDR diffuse white to 100 nits.
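As a toy example of what a tonemapper parameterised by display peak could mean (not a proposal for a real viewer transform), here is a simple Reinhard-style shoulder whose asymptote is the available headroom:

```python
# Not a proposal for a real EDR viewer transform, just an illustration of a
# tone curve parameterised by display peak: a Reinhard-style shoulder whose
# asymptote is the available headroom (display peak / diffuse white).

def edr_tonemap(x, headroom):
    """Map scene-linear x (1.0 = diffuse white) into [0, headroom)."""
    return x / (1.0 + x / headroom)

for headroom in (1.0, 2.0, 4.0):   # SDR, 200-nit EDR, 400-nit EDR at a 100-nit white
    mapped = [round(edr_tonemap(x, headroom), 2) for x in (0.18, 1.0, 4.0, 16.0)]
    print(headroom, mapped)
# With more headroom the same highlights land higher on the display instead
# of all being squeezed under 1.0.
```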

Currently, without tonemapping, extended sRGB is rather ‘useless’ of course, but it can show you the potential: just viewing a LogC image with extended turned on vs off is a crazy difference. It is still very much clipping, but you really do see more dynamic range without tone-mapping.
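To see where that extra range comes from, here is a small decode sketch assuming the ARRI LogC3 EI 800 parameter set; code values near the top of the curve decode to scene-linear values around 55x diffuse white, which a clamped 0-1 sRGB view simply throws away:

```python
# ARRI LogC3 decode with the published EI 800 parameter set (assumed here);
# the point is only that the top of the curve decodes far above 1.0.

CUT, A, B, C = 0.010591, 5.555556, 0.052272, 0.247190
D, E, F = 0.385537, 5.367655, 0.092809

def logc_to_linear(t):
    """LogC3 (EI 800) code value [0..1] -> relative scene-linear exposure."""
    if t > E * CUT + F:
        return (10.0 ** ((t - D) / C) - B) / A
    return (t - F) / E

# 18% grey sits around code value 0.39; a code value of 1.0 decodes to roughly
# 55x diffuse white.
for code in (0.391, 0.6, 0.8, 1.0):
    print(code, round(logc_to_linear(code), 3))
```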

This is also the first time the ‘HDR’ effect becomes really useful to me.


I realise that I am replying to a very old topic and you may have figured this out already, but Apple’s extended sRGB is nothing new. It has been available for years on Windows PCs, where it is known as scRGB. It is also worth noting that there is a fight between Nvidia and AMD over whether scRGB or native PQ-encoded Rec.2020 (a.k.a. HDR10) is the better HDR format, and there is no victor there, only losers (programmers and users), since software that doesn’t use the preferred format of the respective GPU vendor can cause a double conversion: from the application to the GPU, then from the GPU to the screen at scan-out. At Larian, we chose to support HDR10, for what it’s worth.
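For anyone who hasn’t dealt with both formats, the conversion being duplicated looks roughly like this; the 80-nit scRGB scaling and the BT.709-to-BT.2020 matrix below are the usual published values, not something specific to any vendor’s driver:

```python
# Sketch of the scRGB -> HDR10 leg of that double conversion: rotate from
# sRGB/BT.709 primaries to BT.2020 and PQ-encode, using the conventional
# "scRGB 1.0 = 80 nits" scaling (an assumption about the pipeline, not a fact
# from this thread).

SCRGB_UNIT_NITS = 80.0

BT709_TO_BT2020 = (
    (0.6274, 0.3293, 0.0433),
    (0.0691, 0.9195, 0.0114),
    (0.0164, 0.0880, 0.8956),
)

M1, M2 = 0.1593017578125, 78.84375
C1, C2, C3 = 0.8359375, 18.8515625, 18.6875

def pq_encode(nits):
    y = (max(nits, 0.0) / 10000.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

def scrgb_to_hdr10(rgb):
    """scRGB linear triple -> PQ-encoded BT.2020 triple (HDR10-style)."""
    r, g, b = rgb
    rgb2020 = tuple(m[0] * r + m[1] * g + m[2] * b for m in BT709_TO_BT2020)
    return tuple(round(pq_encode(c * SCRGB_UNIT_NITS), 3) for c in rgb2020)

print(scrgb_to_hdr10((1.0, 1.0, 1.0)))    # 80-nit white   -> about 0.49 PQ per channel
print(scrgb_to_hdr10((12.5, 12.5, 12.5))) # 1000-nit white -> about 0.75 PQ per channel
```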


This is a dumb idea.

For starters, all of Apple’s primary hardware is Display P3, and encoding to that ridiculous scRGB format is an added performance hit with no upside.

It has been more or less deprecated in Android as well.

It was a dumb-as-dirt idea when it was created, and it remains such.

Final note: Windows has never been colour-managed, which makes the whole catastrophe absolutely ridiculous.

Yeah, I found some info on scRGB and a lot more on the whole concept of EDR, and even talked to Foundry briefly. It’s all in a damned state of stupid at the moment, I completely agree…

That said, I think for pure desktop usage and compositing the EDR concept isn’t too bad, but they could just as well have used HLG if they wanted something relative and not absolute… sigh…

Every time something new comes along we all hope that people can agree on a standard, but… lol

Make no mistake, EDR is fantastic. scRGB is absolutely idiotic, and should never have been integrated anywhere.

Leave it up to Windows developers to make a deeper mess of colour; the masters of engineering broken colour. Thank goodness scRGB never became a standard. A colossal failure at the design and conceptual level.


I absolutely agree with you about scRGB. I guess it came about because there was no standard when Microsoft invented it back in 2006. The real questions, though, are: why are Nvidia throwing their massive weight behind it, and why aren’t Microsoft supporting FP16 PQ in Windows? I don’t really mind using 10 bits per channel with the PQ curve, since that is the HDR10 standard, but it would be nice to have the option. I develop for multiple platforms including Windows, btw :wink:


Hahah, yeah… By the way, does anyone know how Windows does the SDR→HDR upconversion (for PQ/HDR10)? It looks damn good for what it is, in the sense that it looks like SDR. Of course there are a plethora of ways to do this; I just want to know for giggles and couldn’t find much info about it, or about what exactly the sliders do.

It seems like it’s somewhat based on the BBC LUTs with a parametric diffuse-white slider, or something like that. I will get an AJA FS-HDR this week so I can actually see those things in action… especially the whole secret-sauce SDR-to-HLG conversion for broadcast, as I really want to see how the commercials I master end up on someone’s home TV during an HDR broadcast (they just started HLG soccer here in Germany on Sky, and they show commercials, but there is no way to deliver HDR commercials to Sky), and I am 99.99% sure they use the FS-HDR or the BBC conversion LUTs…
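I have no idea what the FS-HDR or the BBC LUTs actually do internally, but one crude way to write a “direct mapping” style SDR-to-HLG upconversion with a parametric diffuse white (SDR white parked at 75% of the HLG signal) would be something like this; every number here is an assumption:

```python
import math

# Only the commonly described "direct mapping" idea, written out so the
# diffuse-white slider makes sense: SDR is decoded to display light and
# scaled so SDR white lands at a chosen HLG signal level (75% here). Every
# number is an assumption; this is not necessarily what the FS-HDR or the
# BBC LUTs do.

HLG_A, HLG_B, HLG_C = 0.17883277, 0.28466892, 0.55991073

def hlg_oetf(e):
    """BT.2100 HLG OETF: normalised scene linear [0..1] -> HLG signal [0..1]."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return HLG_A * math.log(12.0 * e - HLG_B) + HLG_C

def sdr_to_hlg(sdr_code, gamma=2.4, hlg_white_signal=0.75):
    """SDR code value [0..1] -> HLG signal, with SDR white (1.0) mapped to
    `hlg_white_signal` (the parametric 'diffuse white')."""
    linear = sdr_code ** gamma
    # scene-linear level the chosen white signal corresponds to (~0.265 for 0.75)
    white_scene = (math.exp((hlg_white_signal - HLG_C) / HLG_A) + HLG_B) / 12.0
    return hlg_oetf(linear * white_scene)

for code in (0.0, 0.5, 0.75, 1.0):
    print(code, round(sdr_to_hlg(code), 3))
# SDR white ends up at 0.75 HLG signal; the top quarter of the HLG range
# stays free for genuine highlights.
```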

For the same reason, I actually signed up for Sky just to measure some signal levels on their broadcast and see what they do. I hope HDCP is not going to make my life difficult and I can actually push something from their set-top box to some scopes…

As far as I know, it doesn’t do anything fancy: it just sends an scRGB buffer multiplied by a brightness factor to the GPU driver and lets it deal with it. I could be wrong, though, because HDR10 is a basic requirement for a monitor to be recognized as HDR-compatible by Windows 10.
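My rough mental model of that path, based on how the scRGB composition space is usually described (1.0 pinned at 80 nits, SDR content scaled to wherever the SDR brightness slider puts white); I can’t vouch for it being exactly what the compositor does:

```python
# A guess at the SDR-in-HDR path, not official behaviour: SDR content is
# sRGB-decoded to linear and scaled so SDR white lands wherever the "SDR
# content brightness" setting puts it, with scRGB 1.0 taken as 80 nits.

SCRGB_UNIT_NITS = 80.0

def srgb_decode(v):
    """sRGB code value [0..1] -> linear."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def sdr_to_scrgb(srgb_code, sdr_white_nits=200.0):
    """SDR code value -> scRGB linear value handed to the compositor."""
    return srgb_decode(srgb_code) * (sdr_white_nits / SCRGB_UNIT_NITS)

print(sdr_to_scrgb(1.0))  # SDR white -> 2.5, i.e. 200 nits in scRGB terms
print(sdr_to_scrgb(0.5))  # ~0.54, everything below white scales proportionally
```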

Yeah, that’s an interesting idea. It’s hard to find sources for it, but I am not a developer, so I am probably looking in the wrong places :stuck_out_tongue:

Clueless.

There are so many ridiculous aspects to it:

  1. Horrifically inefficient encoding. The vast majority of the code values are invalid.
  2. Doesn’t cover the spectral locus, or even the HDR-canonized BT.2020 primaries. This is hard to swallow given point 1 above.
  3. Display P3 is the new standard if a cross-survey of displays is taken into account. This encoding imposes a forward-looking performance hit on anything but sRGB.
  4. Negative encodings are pure rubbish and obfuscating, especially in EDR approaches.
  5. Negative encodings do not work for compositing working spaces.

It’s just an epic failure of a design; any suggestion to use scRGB is a wonderful blinking sign that the group has no idea what it is doing.


This, exactly this. What happens when you do multiplicative blending with negative encodings? S*t happens. This is why we don’t support scRGB in our game engine (alright, I kind of made that decision by myself and I’m sticking to my guns [also: game consoles use PQ, not scRGB]).
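A tiny, made-up illustration of that failure mode:

```python
# Made-up numbers. Once out-of-gamut colours are encoded with negative
# components, a plain per-channel multiply (the bread and butter of
# compositing and shading) flips signs and "creates" light that was never there.

def multiply_blend(a, b):
    """Naive per-channel multiply of two RGB triples."""
    return tuple(x * y for x, y in zip(a, b))

base  = (0.8, 0.5, -0.2)   # a saturated colour expressed with a negative blue
layer = (0.6, 0.4, -0.3)   # another one

print(multiply_blend(base, layer))
# -> approximately (0.48, 0.2, 0.06): the two negative blue components
# multiply into a positive value, so the result gains blue instead of staying
# out of gamut.
```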
