Gamut Mapping Part 2: Getting to the Display

@Thomas_Mansencal I have seen the effects of changing the DRT on cg asset look development. I agree that it is not a simple or well-understood problem, and it can cause serious headaches if the workflow is not understood and planned properly from start to finish.

My perspective here will be from feature film VFX - and from my personal experiences.

A conflict often arises in VFX studios when there is a desire for a consistent internal DRT in cg asset look development. An internal DRT has benefits for workflow and re-usability, and makes life easier for lookdev and texture artists.

However, the DRT from the client DI house is rarely consistent across shows: every show has a customized display rendering transform.

At some point the cg asset has to be composited into a shot and sent to the client. If significant dialing of lookdev has happened under an internal DRT, there is real danger that the appearance will shift noticeably once that asset is rendered and comped under the show DRT - especially on highly saturated colors like pyro, and on more sensitive subjects like human skin, as you mention.

So my question is: if a studio is using an internal DRT for cg asset work, don’t they already face the problem you mention? And as @daniele pointed out, wouldn’t a more chromaticity-accurate display rendering transform actually help solve some of these issues?

In the past, I have suggested workflows where initial cg asset work happens under the internal DRT, but at a certain point that work is evaluated under the show DRT, to avoid a big surprise when it goes into comp.

Maybe cg asset look development work should even be checked under multiple DRTs in order to verify that assumptions are correct: an internal DRT, the show DRT, and a simple linear-to-linear display as you were talking about in your earlier post.
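As a toy illustration of why checking under multiple DRTs matters: the same scene-linear pixel can land in very different places depending on the transform, and the divergence tends to be worst on saturated values. The two curves below are hypothetical stand-ins I made up for this sketch, not any real internal or show DRT:

```python
import numpy as np

def drt_internal(rgb):
    # Toy "internal" DRT: a simple Reinhard-style tone curve per channel.
    return rgb / (1.0 + rgb)

def drt_show(rgb, gamma=2.4):
    # Toy "show" DRT: hard clip to [0, 1], then display gamma encoding.
    return np.clip(rgb, 0.0, 1.0) ** (1.0 / gamma)

# A neutral mid-grey and a saturated "pyro" orange, both scene-linear.
neutral = np.array([0.18, 0.18, 0.18])
pyro = np.array([4.0, 0.8, 0.05])

for name, px in (("neutral", neutral), ("pyro", pyro)):
    diff = np.abs(drt_internal(px) - drt_show(px)).max()
    print(f"{name}: max channel difference = {diff:.3f}")
```

Running this shows the saturated pyro pixel diverging more between the two transforms than the neutral grey does, which is exactly the kind of surprise that only shows up if the asset is reviewed under more than one DRT.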
