Aesthetic intent - human perception vs filmic reproduction

Thank you for starting some discussion on this topic! I think this is great. For me, the best part of the ACES Virtual Working Group style of development is its open nature. Having development and discussion happen out in the open, where anyone can participate and engage, is fantastic. It’s crazy how much I’ve learned over the last year being part of this community.

Could not agree more. I’m actually surprised and a little saddened that the aptly named Naive Display Transform is the only prototype implementation that’s been contributed to the group so far. I know Christophe thinks it’s 1/3rd done; personally I think it’s less than 1/4 of the way to something that might even be considered for image making. As I’ve said a few times in my posts, it is just an experiment to better understand the problems at hand when using a chromaticity-preserving display rendering approach. Given all of the knowledge in this group, I know we can find better solutions to the problems we are facing.

In other threads I am advocating for a chromaticity-preserving approach, but I do not think it is the only valid approach. I think the reality in 2021 is that viewers are more used to seeing the look of per-channel RGB tonemapping than the look of film.
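
To make the distinction concrete, here is a rough numpy sketch of the two approaches. The tonescale curve and function names are placeholders for illustration only, not the Naive Display Transform or any shipping DRT.

```python
import numpy as np

def tonescale(x):
    # Simple Reinhard-style compression standing in for a real tonescale curve.
    return x / (x + 1.0)

def per_channel(rgb):
    # Per-channel RGB: the curve is applied to R, G and B independently,
    # so the channels compress at different rates and hue/saturation shift.
    return tonescale(rgb)

def chromaticity_preserving(rgb, eps=1e-6):
    # Chromaticity-preserving: the curve is applied to a single norm
    # (max(R, G, B) here) and the RGB ratios are kept by scaling.
    norm = np.max(rgb, axis=-1, keepdims=True)
    return rgb * tonescale(norm) / (norm + eps)

rgb = np.array([4.0, 0.5, 0.1])  # a saturated, above-1.0 scene-linear value
print(per_channel(rgb))
print(chromaticity_preserving(rgb))
```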

I believe, based on the admittedly basic and naive examinations I’ve done, that most digital cinema display rendering transforms use a per-channel RGB approach, perhaps with some additional gamut mapping: Red IPP2, ARRI Classic, and Sony Venice, for example. The differences between them seem to come down to the choice of “rendering gamut” and the proprietary gamut mapping. It’s actually pretty interesting to look at the results of the Sony SGamut3CineSLog3_To_LC-709.cube LUT on the gamut mapping virtual working group images. It handles all of these images really well, without many hideous hue shifts or much clipping. I’ll throw up some pictures here, since this particular display transform hasn’t really been included in the test images we’ve seen here before.
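
For anyone who wants to reproduce this, the following is roughly how a frame could be pushed through that LUT. It assumes the colour-science Python library, and the file names are just examples.

```python
import colour
from colour.models import log_encoding_SLog3

lut = colour.read_LUT("SGamut3CineSLog3_To_LC-709.cube")

# The VWG frames are ACES2065-1 linear EXRs, so convert to S-Gamut3.Cine
# and encode to S-Log3 before the LUT, which expects that input.
aces = colour.read_image("gm_vwg_frame.exr")
sgamut_lin = colour.RGB_to_RGB(
    aces,
    colour.RGB_COLOURSPACES["ACES2065-1"],
    colour.RGB_COLOURSPACES["S-Gamut3.Cine"],
)
slog3 = log_encoding_SLog3(sgamut_lin)

# The LUT output is display-referred LC-709, so it can be written out directly.
display_rgb = lut.apply(slog3)
colour.write_image(display_rgb, "gm_vwg_frame_lc709.tif", bit_depth="uint8")
```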



It’s all speculation of course, but I believe part of this is due to the wider rendering gamut of SGamut3.Cine, and part of it is the gamut mapping.


Here’s a screenshot of the SGamut3.Cine gamut for reference.

I think it would be a useful (and very easy) experiment to put together a per-channel RGB prototype as well, maybe using a wider rendering gamut like FilmLight E-Gamut, SGamut3.Cine, or DaVinci Wide Gamut. Having this as a point of comparison would be valuable.
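
Something along these lines would probably be enough for a first comparison. Again this leans on the colour-science library, and the curve is only a placeholder standing in for a properly designed tonescale.

```python
import numpy as np
import colour

def tonescale(x):
    # Placeholder compression curve; a real prototype would use a properly
    # designed tonescale with toe and shoulder controls.
    return x / (x + 0.18)

def per_channel_drt(aces2065_1, rendering_gamut="FilmLight E-Gamut"):
    # 1. Convert from ACES2065-1 into the wider "rendering gamut".
    rgb = colour.RGB_to_RGB(
        aces2065_1,
        colour.RGB_COLOURSPACES["ACES2065-1"],
        colour.RGB_COLOURSPACES[rendering_gamut],
    )
    # 2. Apply the curve to each channel independently (the per-channel part).
    rgb = tonescale(rgb)
    # 3. Convert to the display primaries and encode (sRGB here, for simplicity).
    rgb = colour.RGB_to_RGB(
        rgb,
        colour.RGB_COLOURSPACES[rendering_gamut],
        colour.RGB_COLOURSPACES["sRGB"],
    )
    return colour.cctf_encoding(np.clip(rgb, 0.0, 1.0), function="sRGB")

img = colour.read_image("gm_vwg_frame.exr")
colour.write_image(per_channel_drt(img), "per_channel_egamut.tif", bit_depth="uint8")
```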

I also think that the real challenge in this project, and where most of the complexity lies, is gamut mapping. At least for me, this is the next big topic I want to learn more about (yes, even after spending a lot of time on this already in 2020!).
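
Just to illustrate the kind of thing I mean, here is a toy distance-based compression that pulls out-of-gamut values back toward the achromatic axis. The threshold and curve are arbitrary; this is only a sketch of the general idea, not the working group’s algorithm.

```python
import numpy as np

def compress_distance(d, threshold=0.8):
    # Leave distances below the threshold untouched and compress everything
    # above it asymptotically toward 1.0, so values far outside the gamut
    # are pulled back the hardest.
    d = np.asarray(d, dtype=float)
    over = np.maximum(d - threshold, 0.0)
    return np.where(
        d > threshold,
        threshold + (1.0 - threshold) * np.tanh(over / (1.0 - threshold)),
        d,
    )

def gamut_compress(rgb, eps=1e-6):
    # Distance of each channel from the achromatic axis, normalised by the
    # achromatic value (max of R, G, B), then re-applied after compression.
    ach = np.max(rgb, axis=-1, keepdims=True)
    dist = (ach - rgb) / (np.abs(ach) + eps)
    return ach - compress_distance(dist) * (np.abs(ach) + eps)

print(gamut_compress(np.array([1.0, -0.3, 0.05])))  # negative channel pulled into gamut
```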
