Hey there, I have been looking at ways we could use HDR displays to enhance our work in the future. There are not yet any real standards for HDR display in compositing/lighting/texturing etc., so for now we can only look at ways this could work, and Apple has gone ahead and proposed their extended sRGB approach.
After some research I concluded that pretty much all ‘HDR’-capable devices from Apple are doing exactly that: extended sRGB, an sRGB curve that just keeps going above 1.0, also called ‘sRGBf’ for float.
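For reference, the extended sRGB transfer function is just the standard sRGB piecewise curve with the domain left unclamped: values above 1.0 keep following the power segment, and negatives are mirrored. A minimal sketch (the constants are the standard IEC 61966-2-1 ones; the function name is mine):

```swift
import Foundation

/// Extended sRGB EOTF: the normal sRGB decode curve, but the input
/// is not clamped to [0, 1] -- values above 1.0 keep following the
/// power segment, and negative values are mirrored around zero.
func extendedSRGBToLinear(_ v: Double) -> Double {
    let a = abs(v)
    let linear = a <= 0.04045
        ? a / 12.92
        : pow((a + 0.055) / 1.055, 2.4)
    return v < 0 ? -linear : linear
}

// extendedSRGBToLinear(1.0) == 1.0 (SDR diffuse white);
// extendedSRGBToLinear(1.4) ≈ 2.17, i.e. headroom above white.
```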
So when you play back PQ-encoded content, it is apparently not displaying absolute luminances the way PQ ‘wants’; instead it converts from the PQ EOTF back to this extended sRGB EOTF, and thus into a relative system that scales with your display brightness. ColorSync on an ‘HDR’-capable device (or EDR, as Apple calls it) also reports the native display transfer curve as gamma 2.2.
How exactly this whole process works I am not sure, but it does work rather well. I also don't know how the reverse works internally; apparently when you grade something in FCPX and export PQ, it still looks the same on PQ monitors, so there is some smart Apple magic happening. If anyone knows more, I really want to know.
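I don't know what ColorSync actually does here, but conceptually the forward direction could be as simple as decoding PQ to absolute nits and rescaling by whatever is treated as SDR reference white. A rough sketch of that idea (the ST 2084 constants are standard; treating reference white as 100 nits is purely my assumption, not documented Apple behaviour):

```swift
import Foundation

/// ST 2084 (PQ) EOTF: nonlinear signal in [0, 1] -> absolute cd/m².
func pqToNits(_ n: Double) -> Double {
    let m1 = 2610.0 / 16384.0
    let m2 = 2523.0 / 4096.0 * 128.0
    let c1 = 3424.0 / 4096.0
    let c2 = 2413.0 / 4096.0 * 32.0
    let c3 = 2392.0 / 4096.0 * 32.0
    let p = pow(n, 1.0 / m2)
    let y = pow(max(p - c1, 0.0) / (c2 - c3 * p), 1.0 / m1)
    return 10000.0 * y
}

/// One plausible PQ -> EDR mapping: absolute nits rescaled into a
/// system relative to SDR reference white (100 nit assumed here).
func pqToRelativeLinear(_ n: Double, referenceWhite: Double = 100.0) -> Double {
    pqToNits(n) / referenceWhite
}

// pqToNits(0.58) ≈ 201 -- around the BT.2408 graphics-white level;
// anything that lands above 1.0 relative ends up in the EDR headroom.
```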
This is already an experimental feature in Nuke 12.2, and it works with a buffer of 0–1.4, at least on my particular 2018 MacBook Pro: I can see a difference between a constant set to 1 and a constant set to 1.4 as long as my display brightness is not at max. Reading Apple's documentation, it should technically be possible to go even higher: with my display brightness all the way down, it should be able to use all the leftover nits to display values like 3 or even 4, depending on how bright the display can go.
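For what it's worth, AppKit exposes exactly this headroom number per screen, and it changes live as you move the brightness slider, which matches what I'm seeing in Nuke:

```swift
import AppKit

// Query how much EDR headroom the main screen offers right now
// versus what the panel could do at best. Headroom grows as you
// turn the SDR brightness down, because the unused nits become
// available above reference white.
if let screen = NSScreen.main {
    let now = screen.maximumExtendedDynamicRangeColorComponentValue
    let best = screen.maximumPotentialExtendedDynamicRangeColorComponentValue
    print("EDR headroom now: \(now), potential: \(best)")
    // On an SDR panel both report 1.0; on my 2018 MacBook Pro I'd
    // expect something like 1.4 now (matching the Nuke buffer) and
    // more with the brightness lowered -- that last part is my guess.
}
```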
I have to say I like this approach somewhat better than dealing with PQ. In a professional setting you could set the diffuse white of your display to 100 nits and it would simply display brighter values depending on how much headroom the display has; I see a benefit even at just 400 nits. It should make it easier to judge the brightness of things, versus staying in the heavily tonemapped world where everything is dull and squeezed down.
This would obviously require new tonemapping methods and other strategies. What do you think, how are we going to do this kind of work in the future? Is extended sRGB the way to go? Will we have different tonemapping viewers depending on maximum display luminance? I can imagine that a 400-nit or 500-nit EDR tonemapper could work pretty well on current MacBooks if Foundry can increase the buffer past 2.
(A value of 2 will display at 200 nits peak if you set your SDR diffuse white to 100 nits.)
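That relationship is just linear-light arithmetic; spelled out as a trivial sketch, with the 100-nit diffuse white from the example above:

```swift
/// Peak luminance a linear EDR value maps to, given the nit level
/// SDR diffuse white is pinned at (100 nits in the example above).
func peakNits(linearValue: Double, diffuseWhiteNits: Double = 100.0) -> Double {
    linearValue * diffuseWhiteNits
}

// peakNits(linearValue: 2.0) -> 200  (the buffer ceiling above)
// peakNits(linearValue: 1.4) -> 140  (my 2018 MacBook Pro today)
// peakNits(linearValue: 5.0) -> 500  (a hypothetical 500-nit tonemapper)
```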
Currently, without tonemapping, extended sRGB is rather ‘useless’ of course, but it can show you the potential: just watching a LogC image with extended turned on versus off is a crazy difference. It is still very much clipping, but you really do see more dynamic range without tonemapping.
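To put numbers on why the LogC comparison is so striking: decoding LogC to linear puts a lot of the signal above 1.0, which is exactly the part an SDR viewer clips and an EDR buffer starts to recover. A sketch using ARRI's published ALEXA LogC (V3, EI 800) decode constants:

```swift
import Foundation

/// ARRI ALEXA LogC (V3) decode for EI 800: code value -> relative
/// scene linear, using the constants from ARRI's published formula.
func logCToLinear(_ t: Double) -> Double {
    let (cut, a, b, c, d, e, f) =
        (0.010591, 5.555556, 0.052272, 0.247190, 0.385537, 5.367655, 0.092809)
    return t > e * cut + f
        ? (pow(10.0, (t - d) / c) - b) / a
        : (t - f) / e
}

// Code value 1.0 decodes to roughly 55x diffuse white: an SDR
// viewer clips everything above linear 1.0, while a 1.4 EDR buffer
// keeps about another half stop of those highlights visible.
```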
This is also the first time the ‘HDR’ effect becomes really useful to me.