Notebook - Sketches of Hue


Just throwing this into the mix: I’m sharing a Jupyter notebook where I started to explore the idea of camera-referred “hue lines”. This is specific to post-IDT gamut mapping, and since the conversation has evolved since I created it, it is now somewhat out of scope, but I’ll share it anyway.

The basic idea I was starting to explore was how we might handle non-colorimetric (Luther–Ives-failing) out-of-gamut values. The notion being: since these values are non-physical and are clearly distortions/perturbations of a Platonic colorimetric ideal, should we not keep that distortion in mind while mapping values back into a “sensible” range?

The notebook introduces a simple physical correlate to perceptual hue, and follows that through the IDT process.
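For anyone who wants a feel for the idea without opening the notebook, here is a minimal sketch. It uses crude single-lobe Gaussian stand-ins for the CIE 1931 colour-matching functions (a toy observer, not the real CMFs and not the notebook's actual code): widening a rectangular "box" spectrum at a fixed centre wavelength traces a candidate physical hue line from the spectral locus towards the neutral axis.

```python
import numpy as np

# Crude Gaussian stand-ins for the CIE 1931 colour-matching functions:
# a toy observer, NOT the real CMFs and NOT the notebook's code.
WL = np.arange(380.0, 781.0, 1.0)

def gaussian(mu, sigma):
    return np.exp(-0.5 * ((WL - mu) / sigma) ** 2)

CMFS = np.stack([
    gaussian(600, 40) + 0.33 * gaussian(450, 20),  # toy x_bar
    gaussian(555, 45),                             # toy y_bar
    1.7 * gaussian(450, 25),                       # toy z_bar
])

def box_spectrum(center, width):
    """Rectangular (pulse) spectrum, `center` and `width` in nm."""
    return ((WL >= center - width / 2) & (WL <= center + width / 2)).astype(float)

def xy_chromaticity(sd):
    XYZ = CMFS @ sd             # integrate the spectrum against the toy CMFs
    return XYZ[:2] / XYZ.sum()  # project to CIE-style (x, y)

# Widening a pulse centred at 550 nm traces a path from the spectral locus
# towards the neutral axis: one candidate physical "hue line".
hue_line = np.array([xy_chromaticity(box_spectrum(550, w))
                     for w in (10, 50, 100, 200, 400)])
print(hue_line)
```

The narrow pulse sits near the spectral locus; each widening step desaturates it, which is the trajectory the notebook then follows through the IDT.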

I’m also totally acknowledging this is a bit hand-wavy, but wanted to share it to stir up the pot a bit :stuck_out_tongue:


Cool stuff @SeanCooper! I will take a closer look at your notebook (hopefully before the next meeting).

Skimming through it quickly, I realise that we should maybe add an argument to generate the pulse waves in a way better suited to representing the hues in that definition:

It does essentially the same job as your optimal_colour_stimuli definition, but while generate_pulse_waves is built to generate the whole surface at once, its triangular base will prevent generating hue lines like yours.
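For anyone following along, the core idea behind the pulse waves can be sketched in a few lines. This is a hedged re-implementation of the concept only, not colour's actual generate_pulse_waves code: every optimal colour stimulus is a rectangular band-pass pulse over the wavelength bins, and letting the pulse wrap around the ends yields the band-stop complements.

```python
import numpy as np

# Minimal sketch of MacAdam-style "pulse wave" generation; an illustration
# of the concept, not colour's actual generate_pulse_waves implementation.
# Each optimal colour stimulus is a rectangular band-pass pulse over the
# wavelength bins; letting the pulse wrap around yields the band-stop ones.
def pulse_waves(bins):
    pulses = []
    for start in range(bins):
        for width in range(1, bins):
            idx = np.arange(start, start + width) % bins  # wrap -> band-stop
            pulse = np.zeros(bins)
            pulse[idx] = 1.0
            pulses.append(pulse)
    return np.array(pulses)

pulses = pulse_waves(5)
print(pulses.shape)  # (20, 5): bins * (bins - 1) distinct pulses
```

Grouping these by centre bin rather than enumerating all (start, width) pairs would give the per-hue families Sean's hue lines need.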

Typical flow for CIE xyY:

It gets very pretty with CIE XYZ! I will see if I can get an alternative flow in.



Great stuff, Sean.

I am not so sure we can infer psychophysical dimensions purely from spectral observation.

Especially the assumption that a box (or Gaussian) spectrum which gets wider or narrower will elicit a constant hue perception. I see no reason to believe that this happens in nature, so why would we tune in on that dimension? In nature, when we see the same object as more or less pure in colour, it is because of a change in surface attributes: wetting the surface, for example, or dust on an object (a mixture with unselective particles) making it appear less pure in colour.

But the spectral toolbox is useful for eliciting some physical fundamentals, for example to define the boundary between matte, reflective colours and light-emitting colours. Here is a video where I use a similar approach to yours to determine the boundaries of “natural colours”:

Skip ahead to 7:15.

But your work shows nicely how things go through the roof if you throw in another observer :slight_smile:

@SeanCooper: I have Colabified your notebook for those willing to try without installing Colour & co:

I wanted to parameterise the camera selection, but I saw that you are using a hardcoded Raw-to-ACES IDT matrix, so I would probably need to implement that cleanly, once and for all, first.



Excellent stuff @SeanCooper, @daniele and @Thomas_Mansencal

This is exactly what some people postulate: Hamidreza Mirzaei and Brian Funt, “A Robust Hue Descriptor”, 21st Color and Imaging Conference.
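For the curious, the gist of that idea can be toy-sketched: describe a stimulus's hue by the peak wavelength of the Gaussian spectrum whose chromaticity best matches it. Everything below (the crude Gaussian stand-ins for the CIE CMFs, the coarse grid search, the function name) is illustrative only, a sketch in the spirit of the paper rather than its actual method or code.

```python
import numpy as np

# Toy sketch in the spirit of Mirzaei & Funt's Gaussian hue descriptor:
# describe a stimulus's hue by the peak wavelength of the Gaussian spectrum
# whose chromaticity best matches it. Crude Gaussian stand-ins for the CIE
# CMFs and a coarse grid search -- illustrative only, not the paper's method.
WL = np.arange(380.0, 781.0, 1.0)

def gaussian(mu, sigma):
    return np.exp(-0.5 * ((WL - mu) / sigma) ** 2)

CMFS = np.stack([
    gaussian(600, 40) + 0.33 * gaussian(450, 20),  # toy x_bar
    gaussian(555, 45),                             # toy y_bar
    1.7 * gaussian(450, 25),                       # toy z_bar
])

def xy(sd):
    XYZ = CMFS @ sd
    return XYZ[:2] / XYZ.sum()

def gaussian_hue_descriptor(target_xy):
    """Peak wavelength (nm) of the best-matching Gaussian spectrum."""
    best_mu, best_err = None, np.inf
    for mu in np.arange(420.0, 680.0, 2.0):
        for sigma in (10.0, 30.0, 60.0, 100.0):
            err = np.linalg.norm(xy(gaussian(mu, sigma)) - target_xy)
            if err < best_err:
                best_mu, best_err = mu, err
    return best_mu

# A narrow and a broad Gaussian centred at 550 nm should map to nearby
# descriptors: hue approximately invariant to purity, as the paper argues.
narrow = gaussian_hue_descriptor(xy(gaussian(550, 12)))
broad = gaussian_hue_descriptor(xy(gaussian(550, 55)))
print(narrow, broad)
```

The same search run on a camera's (non-Luther-Ives) responses instead of the toy CMFs is roughly where this connects back to Sean's notebook.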



From above:

This behavior of the Gaussian model at longer wavelengths can also account for the breakdown of the Gaussian predictions for the Abney effect at longer wavelengths, as noted by Mizokami et al. [16]. Again they found that the predicted hue shifts were much larger than observed at these wavelengths.

From above:

Here we have shown that a similar principle can potentially explain how the hue of a stimulus varies with purity. Paradoxically, the interactions between hue and saturation revealed by the Abney effect may actually reflect compensatory adjustments designed to maintain the hue of a stimulus when its purity is reduced by broadening the spectrum.


From above:

A central assumption of this account of the Abney effect is that the visual system can somehow compensate for its own filtering characteristics, so that changes in cone signals with variations in purity can be interpreted accordingly.

There has been some discussion in the meeting about the desirability of hue linearity in the secondary and subsequent phases. The following paper shows the various discrepancies between the current bodies of data. There is not a great deal of agreement, but Pridmore’s work in its totality is extremely relevant here.

In particular, Pridmore identifies uniquely parallel hue-invariant wavelength lines, and the intervals between illuminants:


@SeanCooper: The feature/idt branch has quite a few things for computing IDTs, notably the colour.idt_matrix definition.




Thanks. I’ll need to take a look.
