Artificial test image lit by LED spectra

Thanks to everyone for their contributions. I created a synthetic test image and used it with Jed’s algorithm. I have bundled the images and a report in a zip archive, which can be accessed here:

Here I present my summary and JPEG proxies of the Rec. 709 results.

An artificial test image based on the spectral emissions of 29 commercially available LEDs was generated. Colored LED lights have proven to be difficult cases for ACES.
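For anyone wanting to reproduce the general approach, the core step is integrating each LED’s spectral power distribution against colour-matching functions to get tristimulus values. Here is a minimal sketch in NumPy; the Gaussian SPD and CMFs are toy placeholders, not the measured data used for the actual test image:

```python
import numpy as np

# Toy Gaussian stand-ins for an LED SPD and the CIE 1931 colour-matching
# functions; real work would use measured SPDs and the tabulated CMFs.
wl = np.arange(380.0, 781.0, 5.0)          # wavelength grid in nm

def gauss(mu, sigma):
    return np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

led_spd = gauss(630.0, 12.0)               # narrow-band red LED (placeholder)
cmfs = np.stack([gauss(600.0, 40.0),       # crude x-bar
                 gauss(555.0, 45.0),       # crude y-bar
                 gauss(450.0, 25.0)],      # crude z-bar
                axis=-1)

def spd_to_XYZ(spd, cmfs, dw=5.0):
    """Integrate an SPD against CMFs (rectangle rule), normalised to Y = 1."""
    XYZ = (spd[:, None] * cmfs).sum(axis=0) * dw
    return XYZ / XYZ[1]

XYZ = spd_to_XYZ(led_spd, cmfs)
```

The same integral, run once per LED against a camera’s spectral sensitivities instead of the CMFs, yields the camera RGB responses.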

The RGB response of the ALEXA camera and the original tristimulus values of the LEDs were encoded in ACES and transformed to Rec. 709 and P3 output. An RGB gamut mapping algorithm proposed by Jed Smith was applied.
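The primary conversion from ACES 2065-1 (AP0) to linear Rec. 709 can be sketched as two matrix multiplies. Note this is only the primary conversion, not a full ACES Output Transform: it skips the D60 → D65 chromatic adaptation and the tone scale for brevity:

```python
import numpy as np

# AP0 -> CIE XYZ per SMPTE ST 2065-1, then XYZ -> linear Rec. 709 (D65).
AP0_TO_XYZ = np.array([[0.9525523959, 0.0000000000,  0.0000936786],
                       [0.3439664498, 0.7281660966, -0.0721325464],
                       [0.0000000000, 0.0000000000,  1.0088251844]])
XYZ_TO_709 = np.array([[ 3.2404542, -1.5371385, -0.4985314],
                       [-0.9692660,  1.8760108,  0.0415560],
                       [ 0.0556434, -0.2040259,  1.0572252]])

def ap0_to_rec709_linear(rgb):
    """Convert linear AP0 values to linear Rec. 709 (no CAT, no tone scale)."""
    rgb = np.asarray(rgb, dtype=float)
    return rgb @ (XYZ_TO_709 @ AP0_TO_XYZ).T

out = ap0_to_rec709_linear([0.18, 0.18, 0.18])
# Saturated inputs land outside [0, 1] here -- exactly the negative
# values the gamut mapping algorithm is meant to handle.
```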

The algorithm resolves many of the color clipping issues of the original pipeline. The parametrization of the algorithm is important.

Strong hue shifts remain in the ACES output transforms; these need to be addressed elsewhere.

This is the image using ARRI’s render transform.

This is the image as captured by the ALEXA, transformed into ACES and rendered to Rec. 709

The same with Jed’s gamut mapping algorithm

The image as captured by the RICD (Reference Input Capture Device) and rendered to Rec. 709

The same with the GMA


Thanks @hbrendel. Those are really interesting findings. Might you be attending this week’s Virtual Working Group? It would be great if you could talk a bit about this during the meeting.

Yes, I intend to do so. I hope to be able to extend my tests to HDR on Thursday before the call.


Thanks to @Alexander_Forsythe, the original ‘balloon test’ frames are up, in a folder titled (unsurprisingly) ‘balloon+2XL5-C_party_mode_test’. I have not yet had time to do the test suggested by @matthias.scharfenber, in which a wide-spectrum illuminant lights one side of the balloon at a constant level while an L5-C light in ‘party mode’ illuminates the other, with the terminators of the two illuminants overlapping. By the way, if you download the documentation for the L5-C light, you definitely won’t see ‘party mode’; there it’s ‘demo mode’:


There are two subfolders. ‘ocd’ is the original camera digital output, ARRIRAW wrapped in MXF, containing:

  • a short test take (A003C001)
  • a 5-minute permutation where the lights’ maximum intensities were set (as best I could empirically manage) so that no combination of the lights would clip in any channel (A003C002)
  • a second permutation with the lights’ maximum intensities turned up so that clipping might occur at certain times (A003C003)
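Checking whether a frame stays below clip, as in the second permutation, comes down to a per-channel threshold test. A quick sketch in NumPy; `CLIP_LEVEL` and the random frame are placeholders, since the real clip point depends on the camera’s encoding and exposure index:

```python
import numpy as np

# Hypothetical linear frame; CLIP_LEVEL is a placeholder threshold, not
# the actual ARRIRAW clip point.
CLIP_LEVEL = 0.99
frame = np.random.default_rng(1).uniform(0.0, 1.02, size=(256, 256, 3))

clipped_any = np.any(frame >= CLIP_LEVEL, axis=-1)  # any channel clipped
clipped_all = np.all(frame >= CLIP_LEVEL, axis=-1)  # every channel clipped
fraction = clipped_any.mean()                       # share of clipped pixels
```

Running this over every frame of a take and requiring `fraction == 0` confirms that no combination of the lights clipped in any channel.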

Parallel to the ‘ocd’ folder is a ‘derived’ folder. This has a ‘clips’ folder (should be ‘sequences’, my bad), and within it are five folders, each of which contains ACES 2065-4 compliant imagery, i.e. OpenEXR files with AP0 values. Three of these sequences are simple conversions from ARRIRAW to ACES 2065-4 containers; two of them, with the ‘_cropped’ suffix, are crops of those first three, retaining only the balloon and a small margin around it, to speed up any analysis code processing the image data.

Sorry, @nick, I haven’t gotten around to posting the bits of Python I wrote to do some minimal poking at the relative exposure values; the only code that was actually debugged was the part that determined in which octant the negative values drove the color. (I had remembered, perhaps incorrectly, @jedsmith saying that pixels where all three components were negative were especially annoying. Such pixels do occur, but with only 0.02% of the frequency of the most common case, in which the green channel is negative and the other two are positive.)
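For what it’s worth, that kind of octant classification can be sketched in a few lines of NumPy; the frame below is seeded random placeholder data, not the balloon footage:

```python
import numpy as np

# Hypothetical frame of linear pixel data with some negative components.
rng = np.random.default_rng(0)
pixels = rng.normal(loc=0.2, scale=0.3, size=(256, 256, 3))

# Encode each pixel's sign pattern as a 3-bit octant index:
# bit 0 = R < 0, bit 1 = G < 0, bit 2 = B < 0, giving octants 0..7.
neg = pixels < 0
octant = neg[..., 0] * 1 + neg[..., 1] * 2 + neg[..., 2] * 4
counts = np.bincount(octant.ravel(), minlength=8)

# Octant 0 is all-positive; octant 2 is "only green negative";
# octant 7 is the "all three components negative" case.
```

Comparing `counts[7]` against `counts[2]` gives the relative frequency of the all-negative case versus the green-only-negative one.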

Hope these are helpful.


For reference, if people want to review @joseph’s clips without downloading them first, here is a set of proxy MP4s with burned-in timecode:

(these are rendered with the Rec. 709 Output Transform and no gamut mapper)