ACES 2.0 CAM DRT Development

This came out on June 10… maybe a paper on ACES2 would be in order.

Note that comparisons are made to “ACES” without specifying which version.

Most likely ACES 1.X. I kinda recognize the patterns.

My tuppence,
Chris

The article says “Also, because ARRI’s earlier color space, AWG3, is larger than ACES AP0, original ALEXA footage sometimes clips in the blue region when mapped into ACES.” using these side-by-side images:

These images are taken from this article on ACES Central,

which says at the very top “ACES 1.3 introduces a new and improved solution to this issue with a Gamut Compression LMT.”

That this is not mentioned in the article strikes me as a notable oversight.

Saying that makes it sound as if the IDT can clip AWG3 data, which is not true. Some AWG3 values will be transformed to have negative AP0 values, but they are not clipped at that point. The potential clipping, and resulting artefacts, occurs in the ACES 1.x Output Transform. And as @Derek and @ChrisBrejon point out, the RGC and ACES 2.0 renderings improve that situation.

I don’t think that article adds much to the original ARRI article it appears to be based on. And not all that it adds is strictly accurate.

Strictly speaking, what the Custom Colour Management gives you is the option NOT to use REVEAL processing, or at least not the DRT part of it, which is where the majority of the look comes from.


Following on from discussion in the last meeting, I have been experimenting with code to procedurally generate Transform IDs.

Coding it forces you to think about how the logic should work, and raises issues like how to deal with non-standard settings. It also made me wonder whether some of the naming which is currently implicit should perhaps be more explicit. E.g. <ACEStransformID>urn:ampas:aces:transformId:v2.0:Output.Academy.P3D65.a2.v1</ACEStransformID> is implicitly 2.6 gamma. But should that be explicit in the ID?
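As an illustration, here is a minimal sketch of the kind of generator I mean (a simplified Python stand-in, not my actual code; the optional explicit-EOTF token is a hypothetical convention, not anything official):

```python
from typing import Optional

def make_output_transform_id(display_primaries: str,
                             eotf: Optional[str] = None,
                             aces_version: str = "v2.0",
                             transform_version: str = "v1") -> str:
    """Build an Output Transform ID URN following the pattern above.

    If eotf is None the encoding stays implicit, as in the current
    P3D65 ID (implicitly 2.6 gamma); passing e.g. "g26" would make
    it explicit, which is the question raised above.
    """
    parts = ["Output", "Academy", display_primaries]
    if eotf is not None:
        parts.append(eotf)  # hypothetical explicit EOTF token
    parts += ["a2", transform_version]
    return f"urn:ampas:aces:transformId:{aces_version}:{'.'.join(parts)}"

print(make_output_transform_id("P3D65"))
# urn:ampas:aces:transformId:v2.0:Output.Academy.P3D65.a2.v1
print(make_output_transform_id("P3D65", eotf="g26"))
# urn:ampas:aces:transformId:v2.0:Output.Academy.P3D65.g26.a2.v1
```

Writing it this way makes the open question concrete: the generator either needs a rule for when a token may be omitted, or the ID schema needs to require it.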

Would it be possible to include an ODT for P3-D65 PQ @ 300 Nits? I have been grading projects for Apple Vision Pro and targeting 300 Nits. A 500 Nit ODT causes the highlights (the sun, etc.) to blow out on AVP in PQ; 300 appears to be the ceiling. Resolve Color Management offers a 300 Nit output transform, so that’s what I’ve been using, but I would like an ACES option as well.
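To put numbers on that ceiling, here is a quick back-of-envelope check (my own Python, using the standard SMPTE ST 2084 constants, not anything from Resolve or ACES):

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> [0, 1] signal.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    y = nits / 10000.0
    return ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2

print(round(pq_encode(300), 3))  # ~0.622
print(round(pq_encode(500), 3))  # ~0.677
```

A 500 Nit transform legitimately emits PQ code values up to roughly 0.677, but a panel that tops out at 300 Nits clips everything above roughly 0.622, which would explain the blown-out highlights.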

Separate request - I also grade for Quest 3, which is SDR. Its color space is P3-D65, but the gamma is 2.2. It would be great to have an ODT for this as well.


Any dynamic range or combination of primaries/gamma is supported by the algorithm. I could make you CTLs for those outputs in just a few minutes. The real question is how you would get that into a format that allows you to use it in your tools. Most tools are still working on implementing 2.0, so using it now requires installing either LUTs or DCTLs (in Resolve specifically).

Resolve does allow custom ODTs to be installed, so I would suggest that as a solution until built-in support for 2.0 is announced. In addition, even when the list of stock output presets is available as built-in options, the parameters that could generate new transforms on-the-fly might not be accessible in the UI (it will depend on how thoroughly products implement the code). Therefore, externally crafting and then importing a transform might still be necessary.
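For anyone unfamiliar with that workflow, the idea is just to sample the transform on a grid and freeze it into a LUT the tool can load. A rough Python sketch (mine, not official ACES tooling; a real bake would put a log shaper in front, since ACES input is scene-linear and a plain 0–1 grid would not cover it):

```python
def write_cube(path: str, transform, size: int = 33) -> None:
    """Bake transform(r, g, b) -> (r, g, b) into a .cube 3D LUT Resolve can load."""
    with open(path, "w") as f:
        f.write(f"LUT_3D_SIZE {size}\n")
        for b in range(size):          # .cube convention: red varies fastest
            for g in range(size):
                for r in range(size):
                    rr, gg, bb = transform(r / (size - 1),
                                           g / (size - 1),
                                           b / (size - 1))
                    f.write(f"{rr:.6f} {gg:.6f} {bb:.6f}\n")

# Identity placeholder standing in for a CTL-evaluated output transform;
# the filename is just illustrative.
write_cube("P3D65_PQ_300nit.cube", lambda r, g, b: (r, g, b))
```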

I don’t currently have DCTLs to those outputs but I can generate them from the CTL. I will try to get them to you when I can.


The question is what picture rendering means in the case of an immersive headset. Is a DRT developed for images adequate for such a viewing condition?

Hi Scott, thank you for the quick reply. It would certainly be awesome to have that 300 Nit ODT as I do believe it to be the spec of the Apple Vision Pro and could be helpful to a lot of people. I’m familiar with loading ODTs manually into Resolve as I’ve been using ACES 2.0 candidates that way for the past year+.

That said, I’m using Mistika for my grading work, as they have built-in native stereo grading tools. I believe SGO is in the process of implementing ACES 2.0 natively in Mistika Boutique, and it should be ready soon.

Maybe there’s still time to sneak in the 300 Nit ODT?

I’m not sure. I will say that, anecdotally, I was happy-ish with the perceptual match between my Flanders XMP650 in PQ at D65, set manually to 300 Nits, and what I saw in the Apple Vision Pro. It’s hard to compare apples to apples: if you’re staring at the Flanders and then quickly put on the headset, by the time you enter your PIN and open Evercast (what I was using to monitor HDR), there’s enough of a lag that you can’t really do a side-by-side. It’s almost like stepping out of the room and back again and trying to notice if the image shifted.

But we did complete the job and the agency / creative director was happy with the result.

These are the settings I used in Resolve that I’m hoping to mimic in an ACES ODT, so I can use it in Mistika (and so the VFX team can use it in Nuke):

In my experience on this job, once we made the HDR master with these settings, we were able to then toggle the ODT to Rec709 Gamma 2.4 and make plausible SDR deliverables without the need for a trim (which wasn’t budgeted for).

My initial hope was to use Dolby Vision for the trim but apparently Dolby doesn’t offer a trim FROM 300 Nits, only from 1000, so we couldn’t use it.

It would be lovely at some point for someone smarter than me to study the visual experience in headset and make a bespoke ODT that also accounts for the lack of surround ambience, which is a bit of a sticking point on the Quest specifically: it uses a gamma 2.2 LED screen yet has zero surround luminance. Obviously gamma 2.2 expects a fair amount of surround luminance, but actually wearing a headset is more of a theatrical gamma 2.6 vibe.
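To put a rough number on that mismatch (my own back-of-envelope, not a real surround-compensation model):

```python
def display_luminance(code_value: float, gamma: float) -> float:
    """Simple power-law EOTF: normalized code value -> relative luminance."""
    return code_value ** gamma

cv = 0.5
print(round(display_luminance(cv, 2.2), 3))  # 0.218: what the Quest panel shows
print(round(display_luminance(cv, 2.6), 3))  # 0.165: what a dark surround would call for
```

The ratio of the exponents, 2.6 / 2.2 ≈ 1.18, is roughly the extra end-to-end gamma a bespoke dark-surround ODT might need to fold in.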

In any case… back to HDR and the AVP - I think for now I can get happy clients and count on an OK SDR trim with an ACES 300 Nit PQ P3-D65 ODT. Adding a manual ODT to Resolve is an OK option but not great for my use case, as I am already happy with the RCM 300 Nit ODT and am looking to work in Mistika (though GLSL custom code would work in Mistika, too).

Jeff

Was the format immersive video or spatial video? I guess this makes a difference.

It was immersive video, 180° stereo.

Side thought: it makes me wonder how surround luminance is supposed to work with 180° immersive content, considering the content itself is effectively the environment. I suppose the FOV in most headsets is somewhat limited and theoretically gives you a 0 nit surround, and headsets can’t get anywhere near as bright as direct sunlight (such as watching something on a mobile device outdoors), but it would be interesting to hear more research on this.


I’m also doing an immersive video soon for the Apple Vision Pro so an ODT that will work for that would be very interesting.