Along these lines I found this blog post from Alex Fry:
he writes: “With normally exposed images, especially from live action cameras, you’re unlikely to see any values that render differently to the OCIO config, but with CG images containing extreme intensity and saturation you will see more pleasing roll off and desaturation.”
Unfortunately he has only implemented this with the P3D60 ODT. I would need it for an sRGB monitor. So I guess I will need to wait for someone (maybe Alex?) to write that.
I have tried to apply the matrix you shared to some footage shot on a RED camera.
The footage has bright LEDs that, when transformed from the DRAGONcolor2 gamut to ACEScg, result in negative blue component values.
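For illustration, that kind of check can be reproduced in a few lines of Python. This is only a rough sketch, assuming the colour-science package and its built-in 'DRAGONcolor2' and 'ACEScg' colourspace definitions; the triplet is a made-up stand-in, not a value from the actual footage:

```python
# Rough sketch, assuming the `colour` (colour-science) package and its
# built-in 'DRAGONcolor2' and 'ACEScg' colourspace definitions.
import numpy as np
import colour

# Made-up stand-in for a saturated blue LED, expressed as *linear*
# DRAGONcolor2 values (log decoding already applied).
led_linear = np.array([0.02, 0.05, 0.90])

led_acescg = colour.RGB_to_RGB(
    led_linear,
    colour.RGB_COLOURSPACES["DRAGONcolor2"],
    colour.RGB_COLOURSPACES["ACEScg"],
)

# Chromaticities that fall outside the AP1 gamut come back with at least
# one negative component after the conversion.
print(led_acescg)
```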
I assume your work was strongly related to ARRI Wide Gamut.
Could you share your process for building the matrix, so this kind of issue can be fixed for other cameras' color science?
Let me know if you need more details (unfortunately I won't be able to share the footage shot on RED at this point).
I'll try to give you some RGB values, but it may take a bit of time.
In the meantime, could you expand on your approach to come up with a matrix to fix this kind of issue?
To be honest, there wasn't much science behind it beyond plotting the chromaticities of several offending values across a variety of problematic shots, then adjusting the primaries such that the problematic chromaticities fell into a more favorable portion of the rendering primaries.
As described here, this matrix just moves the effective rendering primaries.
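To make that concrete, here is a rough numpy sketch of the general construction only; the "adjusted" primaries below are placeholders, not the values actually baked into the matrix, and the matrix may need to be inverted depending on where in the pipeline it is applied:

```python
# General construction of a primaries-remapping matrix (placeholder values).
import numpy as np

def rgb_to_xyz_matrix(xy_r, xy_g, xy_b, xy_w):
    """Normalised primary matrix (RGB -> XYZ) per the SMPTE RP 177 method."""
    def xyY_to_XYZ(xy, Y=1.0):
        x, y = xy
        return np.array([x * Y / y, Y, (1.0 - x - y) * Y / y])

    P = np.column_stack([xyY_to_XYZ(xy_r), xyY_to_XYZ(xy_g), xyY_to_XYZ(xy_b)])
    # Scale each primary so that R = G = B = 1 lands on the white point.
    S = np.linalg.solve(P, xyY_to_XYZ(xy_w))
    return P * S

# AP1 (rendering) primaries and the ACES white point.
NPM_AP1 = rgb_to_xyz_matrix((0.713, 0.293), (0.165, 0.830), (0.128, 0.044),
                            (0.32168, 0.33767))
# Hypothetical "pulled out" blue primary; red, green and white unchanged.
NPM_ADJ = rgb_to_xyz_matrix((0.713, 0.293), (0.165, 0.830), (0.110, 0.020),
                            (0.32168, 0.33767))

# Re-express RGB defined against AP1 relative to the adjusted primaries
# (use the inverse for the opposite direction).
M = np.linalg.inv(NPM_ADJ) @ NPM_AP1
print(M)
```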
Most of the problematic images that I designed the matrix with came from the Alexa, but this issue exists with other cameras too. I suspect that your problematic values are falling outside of my effective rendering primary space. I'd be able to confirm this with actual values from your image that I could plot. Again, image context isn't important to me; I just need to be able to calculate the chromaticities. When you get a chance, please send me at least one [R,G,B] triplet from the problematic area so I can investigate further.
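For reference, the chromaticity of an ACES2065-1 triplet can be computed with just the standard AP0-to-XYZ matrix from the ACES documentation; a minimal sketch (the sample triplet is hypothetical):

```python
import numpy as np

# Standard ACES2065-1 (AP0) to CIE XYZ matrix from the ACES documentation.
AP0_TO_XYZ = np.array([
    [0.9525523959, 0.0000000000,  0.0000936786],
    [0.3439664498, 0.7281660966, -0.0721325464],
    [0.0000000000, 0.0000000000,  1.0088251844],
])

def aces_to_xy(rgb):
    """xy chromaticity of an ACES2065-1 (AP0, linear) triplet."""
    X, Y, Z = AP0_TO_XYZ @ np.asarray(rgb, dtype=float)
    s = X + Y + Z
    return X / s, Y / s

# Hypothetical offending pixel value:
print(aces_to_xy([0.02, 0.01, 0.35]))
```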
Hello Derek, I am wondering if, one year later, there is a viable solution for this. The difference between the ACES RRT v1.0 and the ACES OCIO config still seems to be an issue. I am trying to start a new topic about this… Thanks!
Just did my first pass in Resolve 15 using ACES 1.0.3 and it seems that this problem is still present. Wasn’t it supposed to be fixed directly inside ACES with this version?
On another point concerning this:
I ran into a situation where the LMT did not fix the issue completely.
Here are a few specs about the footage…
Sony FS7, shot at a higher frame rate and resized to HD (from 4K). The images show some heavy aliasing and artefacts, most likely from all the compression, resizing and noise being produced.
A few examples; look at the lights on the left side of the frame. ORIGINAL LOG IMAGE:
Are there technical limitations to the type of footage we can use with ACES? Or maybe this is caused by Resolve?
It would be great to have some feedback from @ACES.
Thanks!
UPDATE:
Just tried @Paul_Dore's DCTL workflow and the artefacts seem to be OK, or close to how the LUT behaves. Should we consider this an inherent flaw of Resolve?
Hm, certainly doesn’t look too pleasing in any of those cases…
Can you provide a crop of that region of interest that you highlighted, so I can probe the image values around those lights and try to duplicate your issue on my end, to see if this is a Resolve issue or an ACES issue?
Full frame DPX would be great since it would eliminate any potential issues from resizing. I just suggested a crop because sometimes sharing a full frame (even just for testing) is difficult without clearances.
What encoding does the Sony FS7 record as (i.e. what IDT did you use? e.g. SLog3/SGamut3? other?)
No, not necessary, as long as I know what encoding is in the DPX file I get. If it's SLog3/SGamut3.Cine then that is the important information to know. Thanks!
A side but important note: the choice of resizing kernel matters here, as you may introduce even more artefacts if you use one with negative lobes, such as Lanczos or Sinc, when resizing down.
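To illustrate what the negative lobes do, here is a toy sketch of the principle (not what any particular resizer implements): the Lanczos-3 kernel dips below zero between its zero crossings, so filtering a hard, very bright edge produces undershoot and overshoot.

```python
import numpy as np

def lanczos(x, a=3):
    """Lanczos-a kernel: sinc(x) * sinc(x/a) inside |x| < a, zero outside.
    It goes negative between its zero crossings, hence 'negative lobes'."""
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) < a, np.sinc(x) * np.sinc(x / a), 0.0)

# Taps for the filtering step of a 2:1 downscale (kernel stretched by the
# scale factor, sampled at integer pixel offsets; decimation omitted).
taps = lanczos(np.arange(-6, 7) / 2.0, a=3)
taps /= taps.sum()
print(taps.min())  # < 0: the negative lobes

# Hard edge against a very bright region (think a clipped practical light).
signal = np.concatenate([np.zeros(20), np.full(20, 100.0)])
filtered = np.convolve(signal, taps, mode="same")
print(filtered.min(), filtered.max())  # undershoot below 0, overshoot above 100
```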
@Thomas_Mansencal: what I meant there is that the camera is 4K native and, for "slo-mo", it resizes to HD, which introduces some heavy aliasing. I did no resizing after that.
On a side note: @nick, do you know which resizing algorithm Resolve uses for "Sharper" and "Smoother"? We don't have as many options as VFX artists have in Nuke.
What @nick and I were also discussing this morning is the fact that when the data is converted to Y'CbCr and sub-sampled, it introduces a lot of cross-channel correlation and might make the situation even worse by introducing ringing artefacts around areas of high contrast, which seems to be the case here.
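A toy round trip shows the principle (standard BT.709 coefficients and a crude 2:1 chroma average; nothing to do with the actual camera/codec pipeline):

```python
import numpy as np

# BT.709 R'G'B' -> Y'CbCr (Y' in [0, 1], Cb/Cr in [-0.5, 0.5]).
M = np.array([
    [ 0.2126,   0.7152,   0.0722 ],
    [-0.11457, -0.38543,  0.5    ],
    [ 0.5,     -0.45415, -0.04585],
])

# Hard edge between near-black and a saturated blue "LED", 1-D for simplicity.
rgb = np.zeros((16, 3))
rgb[7:] = [0.1, 0.1, 1.0]

ycbcr = rgb @ M.T

# Crude 2:1 chroma subsampling: average chroma in pairs, then repeat
# (nearest-neighbour upsampling) as a stand-in for a 4:2:2/4:2:0 codec.
chroma = ycbcr[:, 1:3]
chroma_sub = chroma.reshape(-1, 2, 2).mean(axis=1)
chroma_up = np.repeat(chroma_sub, 2, axis=0)
ycbcr_rt = np.column_stack([ycbcr[:, 0], chroma_up])

rgb_rt = ycbcr_rt @ np.linalg.inv(M).T
# The round trip produces negative RGB components in the pixels next to the
# edge: values that were never present in the source.
print(rgb_rt.min(), rgb_rt.max())
```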
Many of the extreme values are in the shadows, which I suspect is related to quantisation (the FS7 is only 12-bit linear). The other out-of-gamut pixels are probably due to shortcuts taken when recording at high frame rate: down-sampling by pixel binning, recording in a Y'CbCr codec, and so forth.
It would still appear that Resolve 15 is not handling these out-of-AP0-gamut values correctly, and is introducing new artefacts not present in other ACES implementations.
Chiming in here to say THANK YOU THANK YOU THANK YOU. Just graded an entire feature (1500 shots, ugh), in ACEScct in Resolve 15. There are many instances of purple / magenta fringe pixelation, usually when the camera is getting flared out. The producer was freaking out and I was in the hot seat. Suddenly the grade became all about fixing these purple fringes, never mind that I made the movie look good or achieved pretty perfect shot matching. I was gearing up to give 1-2 full days of my life, for free, to qualify and desaturate each individual instance of this. I can’t believe you made a .dctl that magically fixes this. I applied and voila! Gone! Producer is happy. And I have 2 extra days of my life. Thank you so much.