I just reviewed the meeting recording where there was discussion around matching SDR middle grey and HDR middle grey. I understand the rationale for having middle grey change based on max luminance, and it sounds like the change is not too drastic, even when comparing a 100 nit max luminance to a 10,000 nit max luminance. I agree that detailing the expected behaviour with some example graphs would be helpful. The comments from @alexfry about why this is the desired behaviour were good to hear and would be good to have written down.
During the meeting, I believe @alexfry asked if there were any users who were actively finding this behaviour to be a problem. I actually made a post on a similar topic: using an externally controlled reference luminance to dynamically match the apparent brightness of the host operating system and its SDR content. I felt it was worth linking that thread here because it touches on some of the real-world cases where it is important to match HDR apparent mid-grey brightness to SDR apparent mid-grey brightness on an HDR display.
Common use cases would be games that are played in windowed mode on a desktop computer with an HDR monitor, or mobile games where the user is switching back and forth between an HDR game and the operating system/SDR apps, all of which should use the same reference luminance reported by the operating system. Using the same reference luminance should mean that middle grey ends up at approximately the same apparent brightness, as in the rough sketch below. (I don't mean to say that it should be exactly the same, for the reasons discussed in the meeting.)
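To make the anchoring idea concrete, here is a minimal sketch. It is purely illustrative and not the actual output transform (which, as discussed, intentionally shifts middle grey with max luminance); the 0.18 scene value and the 203 nit OS-reported reference luminance are just assumed example numbers.

```python
# Sketch only: if the SDR desktop and the HDR game both anchor middle grey
# to the same OS-reported reference luminance, mid grey lands at roughly
# the same output luminance in both contexts.

MID_GREY_SCENE = 0.18  # assumed scene-referred middle grey value


def mid_grey_output_nits(reference_luminance_nits: float) -> float:
    # Output luminance of middle grey when scaled by the reference luminance.
    return MID_GREY_SCENE * reference_luminance_nits


# Hypothetical value the OS might report for SDR reference white.
os_reference_nits = 203.0

sdr_mid_grey = mid_grey_output_nits(os_reference_nits)  # SDR apps on the desktop
hdr_mid_grey = mid_grey_output_nits(os_reference_nits)  # HDR game using the same anchor

print(f"SDR mid grey: {sdr_mid_grey:.1f} nits, HDR mid grey: {hdr_mid_grey:.1f} nits")
# In this toy model the two are identical; with the real transform the HDR value
# would only be approximately equal, since mid grey also depends on max luminance.
```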