Just throwing this into the mix. I’m sharing a Jupyter Notebook where I started to explore the idea of camera-referred “hue lines”. This is specific to post-IDT gamut mapping, and the conversation has evolved since I created it, so it is now somewhat out of scope, but I’ll share it anyway.
The basic idea I was starting to explore was how we could handle non-colorimetric out-of-gamut values, i.e. those arising from a camera failing the Luther-Ives condition. The notion was: since these values are non-physical, and are clearly distortions/perturbations of a platonic colorimetric ideal, should we not keep that distortion in mind when mapping values back into a “sensible” range?
The notebook introduces a simple physical correlate to perceptual hue, and follows that through the IDT process.
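To make the idea concrete, here is a minimal sketch (not the notebook’s actual code) of what “following a hue correlate through the IDT” could look like. The 3x3 IDT matrix and the `hue_angle` correlate below are purely illustrative assumptions: the matrix is made-up, and the correlate is just the angle of an opponent-style chroma vector, a crude stand-in for perceptual hue.

```python
import numpy as np

# Hypothetical 3x3 IDT matrix (illustrative values only, not any real camera's IDT).
IDT = np.array([
    [ 1.06, -0.02, -0.04],
    [-0.10,  1.20, -0.10],
    [ 0.02, -0.22,  1.20],
])

def hue_angle(rgb):
    """Crude hue correlate: angle (degrees) of an opponent-style chroma
    vector, computed in the plane orthogonal to the achromatic axis."""
    r, g, b = rgb
    a = r - g                  # red-green opponent axis
    c = 0.5 * (r + g) - b      # yellow-blue opponent axis
    return np.degrees(np.arctan2(c, a)) % 360.0

camera_rgb = np.array([0.8, 0.3, 0.1])  # camera-referred sample
scene_rgb = IDT @ camera_rgb            # same sample after the IDT

# Compare the correlate before and after the IDT to see how the
# (non-colorimetric) transform perturbs it.
print(hue_angle(camera_rgb), hue_angle(scene_rgb))
```

The interesting part is exactly this before/after comparison: tracking how far the correlate drifts through the IDT gives a handle on the distortion that a gamut-mapping step might want to respect.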
I’m also totally acknowledging this is a bit hand-wavy, but I wanted to share it to stir up the pot a bit.