A Cinematographer's Take on HDR and ACES

By James Mathers
Cinematographer and Founder of the Digital Cinema Society

I’m no expert on the subject of HDR, but as a Cinematographer, I’m not sure I really need to be. To use an expression coined by my fellow Cinematographer friend, Bill Bennett, ASC, “we’ve been shooting HDR for years.” That’s because film, and more recently high-end Digital Cinema cameras, have long been able to capture the necessary dynamic range. HDR is a display format rather than a capture format; the only problem is that display technology has not previously been capable of properly showing HDR.

Now such technology is on the near horizon. It is estimated that there will be hundreds of titles available to stream in HDR by the end of the year, with more on the way. I personally think it is a very engaging technology. It’s more than just brighter highlights and a greater range between dark and light; it adds depth to the image, making it more dimensional and immersive.

However, just as there are huge benefits, there are also huge challenges. One is that although consumers are not yet experiencing much HDR in the home, they soon will be, and the images we capture today will likely be viewed in HDR in the not-too-distant future. Yet on-set HDR monitoring options are not always practical, so how do we evaluate what we’re capturing to know how it will eventually be displayed?

There are only a handful of true professional-grade HDR displays, and although they produce exquisite images, they are quite pricey and delicate, seeming more suited to the DI suite than the set. You also need a proper viewing environment, which can be tough with the kind of narrative location-based shooting I usually do; and with HDR, a controlled viewing environment seems even more critical.

Another issue is that there is currently no single standard for HDR display or multi-platform distribution. The best approach for me seems to be to capture as if I were shooting film: after testing through the post pipeline and gaining familiarity with the camera, I know where I want things to fall.

This is where sticking to an ACES pipeline, the ability to test, and being able to supervise the DI is so crucial. Knowing that the color decisions I make will carry throughout the post and distribution pipeline will help ensure all current and future audiences see the images as I intended, whether on HDR or legacy displays.

Once HDR gets sorted out, it will be a boon to both creatives and viewers, but without a good framework of standards it could be chaos. In bringing together the divergent interests of filmmakers, manufacturers, studios, and service providers, the Academy has done the Industry a great service getting everyone on the same page in regard to how we share digital images. Thank you for ACES!

This ties in very well with something I posted on CML in response to a request for info on how to deal with HDR…

There is huge confusion between HDR at the exposure stage and HDR at the display stage, so a few basics.

What most of you know and think of as HDR is exposure-side HDR, where multiple exposures are used to create a combined image that squashes an HDR scene into an SDR display. This is the stills photography view of HDR, and also one that some of us have experimented with in moving pictures; in my case with multiple Amiras in a mirror rig, set to zero, to give a combined 20+ stops of DR.

What we are talking about in the case of moving images is HDR as a display format.

The UHD Premium standard is approximately 15 stops of DR peaking at 1000 nits; however, in most viewing environments this will not be the range you see. It’ll be more like 12 stops, because of the effect of ambient light on the black level.
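
The stop arithmetic here is easy to sketch. A quick illustration (the 0.03-nit and 0.24-nit black levels below are assumed example values, not figures from the spec):

```python
import math

def usable_stops(peak_nits: float, black_nits: float) -> float:
    """Display dynamic range in stops: each stop doubles luminance,
    so the range is log2(peak / black)."""
    return math.log2(peak_nits / black_nits)

# A 1000-nit panel with an assumed 0.03-nit native black: ~15 stops.
print(round(usable_stops(1000, 0.03), 1))   # -> 15.0

# Ambient light lifting the effective black to ~0.24 nits: ~12 stops.
print(round(usable_stops(1000, 0.24), 1))   # -> 12.0
```

The point is that the lost stops come off the bottom: the panel’s peak doesn’t change, but room light raises the black the viewer actually sees.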

A Rec 709 monitor will be capable of 6 to 8 stops of DR; the upper end is pushing it, and I work on 7 stops as achievable.

So, in reality, if we shoot with cameras with a DR capability of more than 12 stops, we don’t have to worry about SDR or HDR as long as we expose in the middle of that range, i.e. mid grey should be exposed to give us +/- 6 stops.

If we expose like this we can use the entire image in either display format.
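
That exposure rule can be sketched as a simple range check. A minimal illustration (`fits_in_camera` is a hypothetical helper, not a real API):

```python
def fits_in_camera(stops_above_grey: float, stops_below_grey: float,
                   camera_dr_stops: float) -> bool:
    """With mid grey exposed at the centre of the camera's range,
    we have +/- (camera_dr / 2) stops of latitude either side of grey."""
    half = camera_dr_stops / 2
    return stops_above_grey <= half and stops_below_grey <= half

# A 14-stop camera centred on mid grey leaves +/- 7 stops:
print(fits_in_camera(5, 6, 14))   # -> True: the whole scene is usable
print(fits_in_camera(8, 6, 14))   # -> False: that +8 highlight clips
```

If the scene fits, the same capture can be rendered down to an SDR (~7-stop) or HDR (~12-stop) display without clipping either end.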

As cameras become available that give us more DR we will have more room to play with.

Bit depth: when we finish for an 8-bit display it is best to start in 10-bit so that we have room to manoeuvre; if we are finishing for a 10-bit display we need to originate and record at 12-bit or greater to give us room to work.
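
Put in terms of code values, the arithmetic behind this “two extra bits” habit looks like this (just an illustration of the headroom, not a grading recipe):

```python
# Code values available at each bit depth. Recording two bits deeper than
# the delivery format leaves 4x the code values to grade with.
for capture_bits, delivery_bits in [(10, 8), (12, 10)]:
    capture_codes = 2 ** capture_bits      # e.g. 1024 for 10-bit
    delivery_codes = 2 ** delivery_bits    # e.g. 256 for 8-bit
    headroom = capture_codes // delivery_codes
    print(f"{capture_bits}-bit capture for {delivery_bits}-bit delivery: "
          f"{headroom}x grading headroom")
# -> 10-bit capture for 8-bit delivery: 4x grading headroom
# -> 12-bit capture for 10-bit delivery: 4x grading headroom
```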

If you are shooting for an SDR finish now but wish to future-proof what you shoot, just expose to use the complete range of the camera and you’ll be cool.

Monitoring for HDR is difficult, partly because monitors that display HDR are bloody expensive, and partly because you are highly unlikely to be able to view them in conditions that allow you to judge what is in the picture.

The solution to this is, well, use a meter. Become a cinematographer again: read shadows, read highlights, and then you know if you’re in range.

ACES is a huge help in this process. I was demonstrating at HPA in February using the same rushes on various monitors in different modes. I had original material in raw form from Alexa, C500, F65 & Red Epic. Using both Resolve and Daylight, I was feeding the original Dolby HDR monitor in both SDR and HDR, and a Sony X300 in HDR. I also had a Colorfront system with a Sony X300 showing HDR.

I was loading the rushes and not grading but just applying the relevant IDT and ODT and switching back and forth so that people could see what happened.

Essentially, nothing happened! The HDR material looked better, but it looked great without anything other than the relevant ODT being used.

The bigger DR made a huge difference, but bear in mind that all this material had been shot for Rec 709 display; actually, some was shot for DCDM, but that is still DR-limited compared to HDR.

I exposed it all as I normally would, using meters.

For the moment we have to use Rec 709 monitoring and use our experience to know what will work.

Hmm, the cinematographer is the one who understands what the images will look like?