A Cinematographer's Take on HDR and ACES

This ties in very well with something I posted on CML in response to a request for info on how to deal with HDR…

There is huge confusion between HDR at the exposure stage and HDR at the display stage, so a few basics.

What most of you know and think of as HDR is exposure-side HDR, where multiple exposures are used to create a combined image that squashes an HDR scene into an SDR display. This is the stills photography view of HDR, and also one that some of us have experimented with in moving pictures; in my case with multiple Amiras in a mirror rig set to zero to give a combined 20+ stops of DR.

What we are talking about in the case of moving images is HDR as a display format.

The UHD Premium standard is approximately 15 stops of DR peaking at 1000 nits. However, in most viewing environments this will not be the range you see; it'll be more like 12 stops because of the effect of ambient light on the black level.

A Rec 709 monitor will be capable of 6 to 8 stops of DR; the upper end is pushing it, and I work on 7 stops as achievable.
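
Those stop figures are just the base-2 log of the ratio between peak and black luminance, so here is a quick sketch of the arithmetic. The black levels used (0.03, 0.25 and 0.8 nits) are illustrative round numbers of my own, not measured values:

```python
import math

def stops_of_dynamic_range(peak_nits: float, black_nits: float) -> float:
    """Dynamic range in stops = log2 of the peak-to-black luminance ratio."""
    return math.log2(peak_nits / black_nits)

# UHD Premium-style display: 1000 nit peak, ~0.03 nit black in a dark room.
print(round(stops_of_dynamic_range(1000, 0.03), 1))  # ~15.0 stops

# Same panel with ambient light lifting the effective black to ~0.25 nits.
print(round(stops_of_dynamic_range(1000, 0.25), 1))  # ~12.0 stops

# A Rec 709 grading monitor: ~100 nit peak, ~0.8 nit effective black.
print(round(stops_of_dynamic_range(100, 0.8), 1))    # ~7.0 stops
```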

So, in reality, if we shoot with cameras with a DR capability of more than 12 stops, we don't have to worry about SDR or HDR as long as we expose in the middle of that range, i.e. mid grey should be exposed to give us +/- 6 stops.

If we expose like this we can use the entire image in either display format.

As cameras become available that give us more DR we will have more room to play with.

Bit depth: just as when we finish for 8-bit displays it is best to start in 10-bit so that we have room to manoeuvre, if we are finishing for a 10-bit display we need to originate and record at 12-bit or greater to give us room to work.
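
The reasoning is simple arithmetic: every two extra bits gives four times as many code values, which is the headroom you spend when you push a grade around. A minimal sketch of that rule of thumb:

```python
def code_values(bits: int) -> int:
    """Number of distinct code values at a given bit depth."""
    return 2 ** bits

def headroom(origination_bits: int, finish_bits: int) -> int:
    """Origination code values available per code value in the finished master."""
    return code_values(origination_bits) // code_values(finish_bits)

print(code_values(8), code_values(10), code_values(12))  # 256 1024 4096
print(headroom(10, 8))   # 4 source values behind every 8-bit output value
print(headroom(12, 10))  # 4 source values behind every 10-bit output value
```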

If you are shooting for an SDR finish now but wish to future-proof what you shoot, just expose to use the complete range of the camera and you'll be cool.

Monitoring for HDR is difficult, partly because monitors that display HDR are bloody expensive, and partly because you are highly unlikely to be able to view them in conditions that allow you to judge what is in the picture.

The solution to this is, well, use a meter, become a cinematographer again: read shadows, read highlights, and you then know if you're in range.
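
Spot-meter readings are already in stops (EV), so the check is nothing more than comparing how far the shadow and highlight readings sit from your key. A sketch, assuming the six-stops-either-side window from above; the function name and default value are mine, purely for illustration:

```python
def exposure_window_ok(key_ev: float, shadow_ev: float, highlight_ev: float,
                       stops_each_side: float = 6.0) -> bool:
    """Spot-meter check: EV differences are stops, so compare how far the
    deepest shadow and brightest highlight sit from the key reading."""
    return (key_ev - shadow_ev) <= stops_each_side and \
           (highlight_ev - key_ev) <= stops_each_side

# Key at EV 10, deepest shadow at EV 5, brightest highlight at EV 17.
print(exposure_window_ok(10, 5, 17))  # False: the highlight is 7 stops over key
```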

ACES is a huge help in this process. I was demonstrating at HPA in February using the same rushes on various monitors in different modes. I had original material in raw form from Alexa, C500, F65 & Red Epic. Using both Resolve and Daylight I was feeding the original Dolby HDR monitor in both SDR and HDR and a Sony X300 in HDR. I also had a Colorfront system with a Sony X300 showing HDR.

I was loading the rushes, not grading, just applying the relevant IDT and ODT and switching back and forth so that people could see what happened.

Essentially nothing happened! The HDR material looked better, but it looked great without anything other than the relevant ODT being used.
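
For anyone who wants to reproduce that switch outside Resolve or Daylight, the same idea can be sketched with OpenColorIO and an ACES config: apply the IDT for the camera, then swap ODTs without touching a grade. The config path and colour-space names below are illustrative assumptions; they vary between ACES config versions.

```python
import PyOpenColorIO as OCIO

# Load an ACES OCIO config (path and colour-space names are illustrative).
config = OCIO.Config.CreateFromFile("aces_config.ocio")

# IDT: camera encoding -> ACES working space, no grade applied.
idt = config.getProcessor("Input - ARRI - V3 LogC (EI800) - Wide Gamut",
                          "ACES - ACEScg").getDefaultCPUProcessor()

# Two output transforms from the same scene-referred data: SDR and HDR paths.
odt_sdr = config.getProcessor("ACES - ACEScg",
                              "Output - Rec.709").getDefaultCPUProcessor()
odt_hdr = config.getProcessor("ACES - ACEScg",
                              "Output - Rec.2020 ST2084 (1000 nits)").getDefaultCPUProcessor()

# Push one pixel through: camera log -> ACES, then out to either display.
aces_pixel = idt.applyRGB([0.39, 0.39, 0.39])  # roughly mid grey in LogC
print(odt_sdr.applyRGB(list(aces_pixel)))
print(odt_hdr.applyRGB(list(aces_pixel)))
```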

The bigger DR made a huge difference, but bear in mind that all this material had been shot for Rec 709 display; actually some was shot for DCDM, but still DR-limited compared to HDR.

I exposed it all as I normally would, using meters.

For the moment we have to use Rec 709 monitoring and use our experience to know what will work.

Hmm, the cinematographer is the one who understands what the images will look like?
