Yes, I get that. I use a color-managed workflow for my photography hobby: calibrated screens, printer-paper profiles, 4700K halogen lamps, etc.
On the mini-LED display, HDR videos look different depending on whether I select the default preset, "Apple XDR Display (P3-1600)", or the "HDR Video (P3 ST-2084)" preset. The default allows adjustments of brightness (luminance), and a high setting clearly results in unnatural, oversaturated colors; a low setting can look wrong, too. The "HDR Video" preset ensures that the color and luminance range parameters are correct, assuming the content creator followed the standard. So the preset choices are indeed useful for consumers, at least in this case.
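For anyone curious why that preset locks the brightness control: ST 2084 defines the PQ (perceptual quantizer) transfer curve, which maps the encoded signal to an *absolute* luminance in nits, so there is no legitimate brightness knob to turn. A minimal sketch of the PQ EOTF, using the constants published in the standard (the function name and structure here are my own, not anything from macOS):

```python
# SMPTE ST 2084 (PQ) EOTF constants, taken from the published standard.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Decode a PQ-encoded signal value in [0, 1] to luminance in cd/m^2 (nits)."""
    e = max(signal, 0.0) ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

# Code value 1.0 decodes to the standard's 10,000-nit ceiling;
# a code value of 0.75 decodes to roughly 1000 nits, a common mastering peak.
```

Because the curve pins signal values to absolute nits, a preset that honors it has to pin the panel's luminance too, which is exactly what "HDR Video (P3 ST-2084)" does.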
My questions are about the other presets. I don't understand why the sRGB preset is so dark. And I don't know which presets are appropriate for viewing other media accurately, or whether that's even possible. I do realize that the visible differences might be slight, and that I can always adjust to taste. I'm not obsessive about this, just curious.
For example, if I stream a movie on the MacBook Pro display, which preset would match the content? Is it possible to know (or guess) which standard was used in production?
When I connect the MacBook Pro to our Sony A1E OLED TV for streaming, the Displays preference pane shows another group of presets for that device. One of them is simply labeled "Sony TV", and the others are various industry standards, some of which are not available for the computer's own screen. There is also a checkbox to enable HDR on the TV. And, in my tests so far, it looks incredible.