Notebook - Sketches of Hue

Hello,

Just throwing this into the mix. I’m sharing a Jupyter Notebook where I started to explore the idea of camera-referred “hue lines”. This is specific to post-IDT gamut mapping, and since the conversation has moved on since I first created it, it is now somewhat out of scope, but I’ll share it anyway.

The basic idea I was starting to explore was how we might handle non-colorimetric out-of-gamut values (i.e. those arising from a camera that fails the Luther-Ives condition). The notion being: since these values are non-physical and are clearly distortions/perturbations of a platonic colorimetric ideal, should we not keep that distortion in mind while mapping values back into a “sensible” range?

The notebook introduces a simple physical correlate to perceptual hue, and follows that through the IDT process.

I’m also totally acknowledging this is a bit hand-wavy, but wanted to share it to stir up the pot a bit :stuck_out_tongue:


Cool stuff @SeanCooper! I will take a better look at your notebook (hopefully before the next meeting).

Skimming it quickly, I realise that maybe we should add an argument to generate the pulse waves in a way that is better suited to representing the hues, in that definition: https://github.com/colour-science/colour/blob/c8acc31a8507dae1af81d7958490ac3428dc2728/colour/volume/spectrum.py#L57

It essentially does a similar job to your optimal_colour_stimuli definition, but because generate_pulse_waves is built to generate the whole surface at once, its triangular base prevents generating hue lines like yours.
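To make the idea concrete, here is a minimal NumPy sketch of what such an alternative mode might look like: instead of sweeping the pulse start position (which builds the whole optimal-colour surface), keep the pulse centre fixed and only vary the width, tracing a single hue-line family. The function name and its exact behaviour are illustrative assumptions, not the library's actual API:

```python
import numpy as np

def pulse_waves_centred(bins, centre):
    """Binary pulse spectra sharing a common centre bin, one row per width.

    Sketch of a hue-line mode: row w - 1 is a cyclic pulse of width w
    centred (as nearly as parity allows) on the given bin.
    """
    pulses = np.zeros((bins, bins))
    for width in range(1, bins + 1):
        start = centre - (width - 1) // 2
        idx = np.arange(start, start + width) % bins  # cyclic wrap-around
        pulses[width - 1, idx] = 1
    return pulses

# One "hue line" family: 16 spectral bins, pulses centred on bin 5,
# from a monochromatic spike up to the flat (equal-energy) spectrum.
waves = pulse_waves_centred(16, centre=5)
```

Each row would then be multiplied by the CMFs (or camera sensitivities) and summed to get one point on the hue line.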

Typical flow for CIE xyY:

It gets very pretty with CIE XYZ! I will try to see if I can get an alternative flow in.

Cheers,

Thomas

Great stuff, Sean.

I am not so sure we can infer psychophysical dimensions purely from spectral observation.

Especially the assumption that a box spectrum (or Gaussian) that gets wider or thinner will elicit a constant hue perception. I see no reason to believe this happens in nature, so why would we tune in on that dimension? In nature, when we see the same object as more or less pure in colour, it is because surface attributes change, for example wetting the surface; or dust on an object (a mixture with unselective particles) will make it appear less pure in colour.

But the spectral toolbox is useful to elicit some physical fundamentals, for example, to define the boundary between matte reflective colours and light-emitting colours. Here is a video where I use a similar approach to yours to determine the boundaries of “natural colours”:

Skip over to 7:15min.

But your work shows nicely how things go through the roof if you throw in another observer :slight_smile:

@SeanCooper: I have Colabified your notebook for those willing to try without installing Colour & co:

https://colab.research.google.com/drive/1LTZEVQWsSJTcKll4VqYnY93RoNpyEh1u

I wanted to parameterise the camera selection, but I saw that you are using a hardcoded IDT matrix from raw to ACES, so I would probably need to implement that cleanly first, once and for all.

Cheers,

Thomas

Excellent stuff @SeanCooper, @daniele and @Thomas_Mansencal

This is exactly what some people postulate: Hamidreza Mirzaei, Brian Funt: A Robust Hue Descriptor. 21st Color and Imaging Conference


@SeanCooper: The feature/idt branch has quite a few things to compute IDTs, notably the colour.idt_matrix definition.

Cheers,

Thomas


Thanks. I’ll need to take a look.


Better late than never, using the built-in colour.volume.XYZ_outer_surface/solid_RoschMacAdam definition: https://colab.research.google.com/drive/1NRcdXSCshivkwoU2nieCvC3y14fx1X4X#scrollTo=c5yjRGGGwtHP&line=1&uniqifier=1
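For readers who want the gist without opening the notebook, the outer surface rests on enumerating every cyclic binary pulse, all widths crossed with all start positions, plus the black and equal-energy white spectra. A self-contained NumPy sketch of that enumeration (a simplified stand-in, not the library's actual implementation):

```python
import numpy as np

def all_pulse_waves(bins):
    """Every cyclic binary pulse over the given number of spectral bins.

    Rows are candidate optimal (Rosch-MacAdam) stimuli: widths 1..bins-1
    at every start position, plus black and the equal-energy spectrum.
    """
    rows = []
    for width in range(1, bins):          # width == bins is the flat spectrum
        for start in range(bins):
            pulse = np.zeros(bins)
            pulse[np.arange(start, start + width) % bins] = 1
            rows.append(pulse)
    rows.append(np.zeros(bins))           # black
    rows.append(np.ones(bins))            # equal-energy white
    return np.array(rows)

# 8 bins -> 8 * 7 + 2 = 58 candidate spectra.
waves = all_pulse_waves(8)
```

Projecting each row through the CMFs yields the XYZ points whose hull is the optimal-colour solid.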
