On-set monitoring for HDR with ACES

Hello everyone,

We're planning to shoot a fiction film with the RED Helium 8K in raw.
We would like to release the project in both SDR and HDR (with HDR the priority), and I'm wondering how to monitor both outputs on set. What workflow do you recommend for working in ACES and delivering the film in HDR P3-D65 PQ at 1000 nits? Should I use a LUT for the 1000-nit HDR monitoring, and where can I find one?
Thank you for your precious help.


Hi Elodie,

First of all, I don’t have practical experience with the RED Helium, so my comments are purely about the workflow.

If I understand correctly, you want to shoot in HDR and release the project in both SDR and HDR?

As for monitoring both outputs, having two displays on set, one HDR and one SDR, would be the ideal solution.


Hello Jeremie,

Thank you for your help.
Yes, that’s exactly it. I need both.
I was really wondering which LUTs I should load for my HDR monitoring (P3-D65 PQ 1000 nits) and my SDR monitoring, so that on-set viewing lines up with the ACES work done afterwards.

Thank you so much,


Hi Elodie.

This is Bruno from Colorfront.

To monitor HDR and SDR on set, you can use the AJA FS-HDR (it uses Colorfront Engine color science). Colorfront Engine is based on ACES, and many shows have already used this workflow. It will let you go from the camera’s SDI output (RedLog3G10/RedWideGamutRGB) to both SDR and HDR, using one HDR monitor set to Rec.2020 PQ @ 1000 nits. Without going further into detail, we can demonstrate it here in Santa Monica if you need.



Hi Bruno,

Thank you so much for your help. This seems to be a simple and effective solution indeed. Unfortunately, I am in Europe; I will inquire with my reseller.
Have a nice day.

Hi! Could you tell me whether this workflow is also useful with the Sony VENICE for monitoring HDR and SDR at the same time on set? Thanks a lot.

Either the solution @brunomunger proposes, or Pomfort LiveGrade (it can also talk to the AJA FS-HDR and use the Colorfront Engine) together with one dual-channel LUT box, or ideally two separate LUT boxes (which avoids a lot of interpolation errors). Treat each signal path as an individual camera input.
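For reference, the LUT boxes mentioned above (and LiveGrade itself) typically load 3D LUTs in the `.cube` text format. Here is a small sketch of what that file looks like; the transform written here is just an identity placeholder, not a real camera-log-to-display transform, and the function name and file name are made up for the example. A real on-set LUT would bake the actual RedLog3G10-to-display math into the RGB triplets.

```python
# Hedged sketch: write an identity 3D LUT in the .cube format that most
# LUT boxes accept. A real monitoring LUT would replace the identity
# triplets with the baked camera-log -> display transform.

def write_identity_cube(path: str, size: int = 33) -> None:
    """Write an identity 3D LUT of the given edge size in .cube format."""
    lines = [f"LUT_3D_SIZE {size}"]
    # .cube sample order: red varies fastest, then green, then blue.
    for b in range(size):
        for g in range(size):
            for r in range(size):
                lines.append(
                    f"{r/(size-1):.6f} {g/(size-1):.6f} {b/(size-1):.6f}"
                )
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

write_identity_cube("identity_33.cube")
```

A 33-point cube is a common size for hardware LUT boxes; using two boxes (one per signal path, as suggested above) means each path gets its own full-resolution LUT instead of sharing one device's interpolation.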