I own the following:
- iMac 2017, 27", 5K Retina
- BenQ PD2700U 4K HDR monitor
- USB-C to DisplayPort 1.4 cable
I am trying to understand more about HDR on Mac as I am not sure I am getting it all right.
I selected the HDR checkbox in Display Settings for the second monitor. I get no Saturation or Refresh Rate controls, just Resolution. The monitor is set to HDR mode and I can't change it from the OSD (I think because it knows it's on an HDR source). Using the monitor's "Display Pilot" software I can force it into sRGB or Rec. 709, although (I think) macOS keeps sending an HDR signal. If I want, I can uncheck HDR in Display Settings, set the monitor to sRGB, and get saturation exactly matching the iMac's built-in screen.
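Side note: you can ask macOS directly what it thinks each display can do. AppKit exposes an EDR (extended dynamic range) headroom per screen; a value above 1.0 means macOS will render brighter-than-SDR content on that display. A minimal Swift sketch (these are real AppKit properties, available since Catalina):

```swift
import AppKit

// Print the EDR headroom macOS reports for each attached display.
// A value above 1.0 means macOS can render brighter-than-SDR content on it.
for screen in NSScreen.screens {
    let potential = screen.maximumPotentialExtendedDynamicRangeColorComponentValue
    let current = screen.maximumExtendedDynamicRangeColorComponentValue
    print("\(screen.localizedName): potential EDR \(potential), current EDR \(current)")
}
```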
Now... my questions:
1) Apple says the iMac 2017 is NOT HDR compatible, but I believe they say so only because the internal 5K panel isn't HDR, not because HDR can't be achieved through external monitors.
2) Apple sells a $70 dongle that is said to be compatible with the iMac 2017. I contacted Apple and they told me that with it I will NOT get HDR: 4K at 60 Hz, yes, but not HDR. I think they are wrong because, I believe, since Catalina any Mac can output HDR to external displays through proper cabling.
3) HDR has (almost) nothing to do with the graphics card. Sure, it needs to be powerful enough, but 10-bit output, the Rec. 2020 color space, etc. have little to do with the GPU itself; it is more about the available bandwidth on the output ports and correct cabling. Am I right here? So Thunderbolt 3 can do HDR10, since that only requires DisplayPort Alt Mode, which TB3 provides with the right cable (see the back-of-the-envelope bandwidth math after this list).
4) An HDR display receiving a Rec. 709 signal will probably show washed-out colours with little contrast. Am I right?
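To put numbers on the bandwidth point in (3), here is a rough Swift sketch. The timing figures are approximate CVT-R2 reduced-blanking values and the link rates are effective DisplayPort payload rates, so treat it all as ballpark:

```swift
import Foundation

// Back-of-the-envelope: does 4K60 10-bit RGB fit in a DisplayPort link?
let hTotal = 4000.0, vTotal = 2222.0   // approx. CVT-R2 totals for 3840x2160
let refresh = 60.0
let bitsPerPixel = 30.0                // 10 bits per channel, RGB

let requiredGbps = hTotal * vTotal * refresh * bitsPerPixel / 1e9  // ≈ 16.0 Gbps

let hbr2 = 17.28   // DP 1.2 (HBR2) effective payload, Gbps
let hbr3 = 25.92   // DP 1.4 (HBR3) effective payload, Gbps

print(String(format: "4K60 10-bit RGB needs ~%.1f Gbps", requiredGbps))
print("Fits DP 1.2 (HBR2): \(requiredGbps < hbr2)")   // true, just barely
print("Fits DP 1.4 (HBR3): \(requiredGbps < hbr3)")   // true, with room to spare
```

If those numbers are right, 4K60 10-bit squeaks into DP 1.2 (HBR2), which I believe is what the iMac 2017 actually puts out over its TB3 ports, so raw bandwidth shouldn't be the blocker.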
So here's what happens to me.
When I'm just on the desktop (no particular apps open), the picture is washed out, but the monitor reports "HDR" (not "HDR emulated mode", just "HDR"). What is going on here? I believe macOS is sending an HDR-formatted signal (which is why the monitor correctly switches to HDR), but the signal Big Sur is producing is wrong: the content is really sRGB, and it looks washed out.
If I open VLC and play some HDR file downloaded from the web, it plays fine: colours are great and full of contrast. So what is going on here? Is the app overriding the system settings? Other apps have similar problems. DaVinci Resolve shows me a washed-out image on the second monitor (used in full-screen direct mode). That is somewhat expected, although in Rec. 2020 color-space mode I should be able to reach full saturation on the second monitor (with, in that case, totally off-scale saturation on the iMac's main screen). Adobe applications (again, in full-screen mode) will NOT open if the second monitor is on; I need to open the apps first and only then switch the monitor on.
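My best guess on the VLC behaviour: on macOS, HDR/EDR is opted into per app (per drawing surface), not just system-wide. An app that wants pixel values brighter than SDR white asks for extended dynamic range on its Metal layer; everything that doesn't, including the desktop itself, gets composited as SDR inside the HDR signal, which would explain the washed-out look. A minimal sketch of that opt-in; these are real Core Animation/Metal APIs, but it's just an illustration of the mechanism, not a claim about what VLC literally does:

```swift
import QuartzCore
import Metal

// Opt a layer into extended dynamic range (EDR) content.
let layer = CAMetalLayer()
layer.wantsExtendedDynamicRangeContent = true   // ask the compositor for EDR
layer.pixelFormat = .rgba16Float                // float pixels, so values can exceed 1.0 (SDR white)
layer.colorspace = CGColorSpace(name: CGColorSpace.extendedLinearDisplayP3)
```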
So... does Big Sur really support HDR?
Or is it just putting out an HDR-formatted signal whose content is really sRGB or Rec. 709?
Am I missing something here?
Will anything change when my new M1 Mac mini gets here?
Thanks