I don’t have a proper HDR and SDR suite to compare side by side, but in theory, in the “real world,” that would only be true in the upper exposure ranges, right? In most cases the mid-grey point and even the overall picture level should be similar between SDR and HDR (this is what HLG was designed to preserve), in which case the “colorfulness” should not be much different either (color gamut/range aside).
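To put rough numbers on that, here’s a quick Python sketch using the BT.2100 HLG inverse OETF and OOTF with the BT.2408 reference signal levels (38% for an 18% grey card, 75% for reference/graphics white); treating the display as a nominal 1000 cd/m² panel with the corresponding system gamma of 1.2 is an assumption on my part:

```python
# Sketch: display luminance of BT.2408 reference levels on a nominal
# 1000 cd/m^2 HLG display (BT.2100 HLG inverse OETF + OOTF).
import math

A = 0.17883277
B = 1.0 - 4.0 * A
C = 0.5 - A * math.log(4.0 * A)

def hlg_inverse_oetf(signal):
    """HLG non-linear signal [0,1] -> relative scene linear light [0,1]."""
    if signal <= 0.5:
        return (signal ** 2) / 3.0
    return (math.exp((signal - C) / A) + B) / 12.0

def hlg_display_luminance(signal, peak_nits=1000.0):
    """Achromatic display luminance via the HLG OOTF (system gamma from peak)."""
    gamma = 1.2 + 0.42 * math.log10(peak_nits / 1000.0)
    scene = hlg_inverse_oetf(signal)
    return peak_nits * (scene ** gamma)

# BT.2408 reference levels on a 1000-nit display:
print(hlg_display_luminance(0.38))  # 18% grey card               -> ~26 nits
print(hlg_display_luminance(0.75))  # reference ("graphics") white -> ~203 nits
print(hlg_display_luminance(1.00))  # nominal peak                 -> 1000 nits
```

Everything above the 75% signal level is highlight headroom, which is consistent with the differences showing up mainly in the upper exposure ranges.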
Unfortunately this may not actually be the case in proper grading suites, where the SDR spec is 100 nits peak brightness and HDR diffuse white is around 200 nits (a figure based on consumer/end-user studies). I haven’t read the papers, but ST.2084 PQ was originally built around 100 nits diffuse white (undoubtedly to coincide with SDR); the reference was later revised to 203 nits, likely to align with HLG and accepted consumer use. It’s possible, then, that you may see more of a difference between SDR and HDR in a proper grading suite than you would “in the real world.”
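For context, here’s a small Python sketch of the ST.2084 PQ inverse EOTF (standard constants) showing where those two diffuse-white levels land on the PQ signal scale:

```python
# Sketch: ST.2084 (PQ) inverse EOTF -- absolute luminance (cd/m^2) to
# normalized non-linear signal [0,1], referenced to a 10,000 cd/m^2 peak.
M1 = 2610.0 / 16384.0          # 0.1593017578125
M2 = 2523.0 / 4096.0 * 128.0   # 78.84375
C1 = 3424.0 / 4096.0           # 0.8359375
C2 = 2413.0 / 4096.0 * 32.0    # 18.8515625
C3 = 2392.0 / 4096.0 * 32.0    # 18.6875

def pq_inverse_eotf(nits):
    """Display luminance in cd/m^2 -> PQ signal in [0,1]."""
    y = nits / 10000.0
    return ((C1 + C2 * y ** M1) / (1.0 + C3 * y ** M1)) ** M2

print(pq_inverse_eotf(100.0))  # ~0.51 -- SDR-style diffuse white
print(pq_inverse_eotf(203.0))  # ~0.58 -- BT.2408 diffuse/graphics white
```

So the shift from 100 to 203 nits moves diffuse white from roughly 51% to about 58% of the PQ signal range, which matches the BT.2408 reference-white level.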
As a counterexample, I would also say that DCI is not less colorful (perceptually, anyway) than Rec.709, despite having roughly half the peak brightness (48 nits vs 100).
Just so I understand better, are you suggesting a user-accessible parameter for “maintaining intent”? What would change, or be different, between the two options?