I'm not sure the 2019 iMacs support HDR either...

I can't get an HDR signal to an external LG 27UK850 HDR monitor from my 2019 27" iMac 5K Retina (i9 + Vega 48 GPU), and I think it's related to the comments in this thread. I've tried both USB-C and HDMI (with the new A2119 USB-C Digital AV Multiport Adapter). I get 4K @ 60Hz, but it's not an HDR signal. It didn't work with a beta of Catalina either.

I'm 99% sure it's the iMac—if I use the same cables/adapter/monitor with my iPad Pro, I do get an HDR signal.

Apple says this should work: https://support.apple.com/en-us/HT207806

Model A2119 is the newest model of this adapter. It supports HDMI 2.0 at these resolutions and refresh rates when used with macOS Mojave 10.14.6 or later or iOS 12.4 or later. It also supports HDR video in HDR10 and Dolby Vision formats when connected to a compatible playback device and display, TV, or projector.

  • Up to 1080p at 60Hz
  • 3840 x 2160 at 30Hz
  • 3840 x 2160 at 60Hz with these devices:
    • 15-inch MacBook Pro introduced in 2017 or later
    • Retina iMac introduced in 2017 or later
    • iMac Pro
    • iPad Pro
If I can't get this resolved, I'll probably return my 2019 iMac.

Why?
 

Two updates on this issue:

1) I bought a USB-C to DisplayPort cable. Still no HDR signal from macOS.

2) I installed Windows 10 via Boot Camp on my 2019 iMac, and HDR worked on my external LG monitor. I successfully played 4K HDR YouTube videos. So now I'm guessing it has something to do with the drivers Apple is using with the Vega 48 GPU.
 
So at the end of the day, the iMac display brings the bare minimum to HDR video from streaming content providers. It is an improvement over SDR but leaves a lot to be desired.
I see it exactly the same way: the current iMac display is not capable of delivering any HDR output. As an LCD, its top brightness can at best be classified as "bright SDR," and its black-level performance cannot reach that of OLED.
PS: as cynics said at the very beginning, HDR has nothing to do with wider color spaces. It is only about dynamic range (the difference between the darkest and brightest pixels on screen).
Someone just thought it would make a lot of sense to combine the rollout of wider dynamic range with another improvement: wide color gamut.
The two together result in a bigger color volume for the viewer.
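
To put rough numbers on the distinction (a minimal sketch; the panel figures below are illustrative assumptions, with the 1000-nit/0.05-nit LCD and 540-nit/0.0005-nit OLED points taken from the usual UHD Premium targets):

```swift
import Foundation

// Dynamic range as a contrast ratio and in photographic stops.
// Panel numbers are illustrative assumptions, not measurements.
func stops(peakNits: Double, blackNits: Double) -> Double {
    log2(peakNits / blackNits)
}

let panels = [
    ("SDR desktop LCD", 350.0, 0.35),          // typical office monitor
    ("HDR LCD (UHD Premium)", 1000.0, 0.05),
    ("HDR OLED (UHD Premium)", 540.0, 0.0005),
]

for (name, peak, black) in panels {
    let s = stops(peakNits: peak, blackNits: black)
    print("\(name): \(Int(peak / black)):1 contrast, \(String(format: "%.1f", s)) stops")
}
// Color gamut never enters this calculation: dynamic range and
// gamut are independent axes, and only together give color volume.
```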
 


Well, I was going to mention this but refrained from doing so... While DisplayHDR 400 is labeled as "significantly better than SDR" at displayhdr.org, there is a knowledgeable minority of users who have a disdain for that particular level and don't feel it should even be included in the DisplayHDR certifications. It's like Samsung's old HDR+...

My only problem with HDR displays is their implementation. Currently there isn't a way for the iMac to be good at what it's designed to be good at and also have a high level of support for HDR video, without compromises that I don't feel are warranted.

I think until the technology matures it's not worth investing a lot of effort/money into (for computer displays) without a very good reason. The BEST consumer-level HDR TVs on the market (displays designed for HDR without compromises for productivity workloads) don't max out the specification.

To further that statement: when you compare a budget HDR10 TV to a flagship brand-name HDR10 TV playing the same content, it's easy to see the difference in quality. And while it's apples to oranges, the iMac not only looks better than many HDR10 displays, it will technically be better even at handling HDR10 content (with the right software).

We just need to be patient and keep in mind Microsoft will push this stuff out before Apple. How well or poorly the playback hardware performs doesn't carry anywhere near the weight for Microsoft that it does for Apple, which has to carry the burden of being the software AND hardware company. Although there is an obvious advantage there, since the hardware is a known quantity...
 
And while it's apples to oranges, the iMac not only looks better than many HDR10 displays, it will technically be better even at handling HDR10 content (with the right software).
I know I've been spoilt by the HDR output of the Sony ZD9 TV, but I am pretty sure all video decoding is being handled in hardware there.
I may be wrong, but I think the CPU is only responsible for Android TV UI drawing and ancillary functions (like HDMI-CEC for example).
 

1. "HDR signal" just an HDR videos metadata as it applies to the specific HDR standard (HDR10/Dolby Vision/HLG/etc). Currently A/V players in Mojave (IINA, VLC, Quicktime) are using different techniques to narrow the color gamut to match the displays calibrated color gamut, white and black levels and luminance. The display itself isn't seeing the metadata so it wouldn't tell you its an HDR image (which it technically isn't) but its well beyond SDR rec709.

To maintain a working color space, macOS isn't going to emulate HDR10 system-wide, nor is switching between SDR and HDR a solution. Apple's recommendation for HDR grading is an external HDR monitor calibrated for HDR content's color space (Rec.2020). At big production levels you are doing that anyway with a reference monitor, so this isn't anything new.

Apple will implement HDR support for external displays; it will just take time. OS-wide tone-mapping APIs are being implemented in Catalina. Tone mapping is a technique for displaying HDR content on an SDR screen (reducing quality to match the display); however, if implemented properly, color can be mapped to the display's best specs, providing the best image the display is capable of (not under- or overshooting the mark).
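
The core of the idea is a curve that compresses out-of-range highlights instead of clipping them. A minimal sketch using the extended Reinhard operator (an assumed, general-purpose technique shown for illustration; it is not Apple's actual Catalina API):

```swift
import Foundation

// A minimal tone-mapping sketch using the extended Reinhard operator.
// Illustrates the concept only; NOT Apple's tone-mapping implementation.
// Luminances are in nits; `contentPeak` is the HDR grade's peak,
// `displayPeak` is what the attached panel can actually show.
func toneMap(_ nits: Double, contentPeak: Double, displayPeak: Double) -> Double {
    let l = nits / displayPeak              // normalize to display range
    let w = contentPeak / displayPeak       // where content "peak white" should land
    let mapped = l * (1 + l / (w * w)) / (1 + l)
    return mapped * displayPeak             // back to nits
}

// 1000-nit content on a 500-nit iMac-class panel:
for nits in [1.0, 100, 400, 500, 800, 1000] {
    let out = toneMap(nits, contentPeak: 1000, displayPeak: 500)
    print("\(nits) nits in -> \(String(format: "%.0f", out)) nits out")
}
// Highlights roll off smoothly toward 500 nits instead of clipping,
// which is the "not under- or overshooting" behavior described above.
```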

2. YouTube in Safari uses a codec (VP9) that Apple does not natively support. And I don't think Google supports HDR in Chrome on macOS (because Macs don't have HDR monitors). However, Firefox does: open Firefox, type about:config in the address bar, and set media.mediasource.webm.enabled to true.

On an external HDR monitor it looks pretty good; you can tell it's not just been down-converted to SDR (maybe P3 space?). On the Mac's display it's "OK".

Software on macOS does a decent job of playing videos with HDR specs accurately. VLC has tone mapping built in; adjust the desaturation a little and you can get a really good picture on an external HDR monitor in the correct color space. Just don't expect the monitor to say "HDR" yet.

Random HDR video I downloaded.
 
I know I've been spoilt with the HDR output of Sony ZD9 TV, but I am pretty sure all video decoding is being handled in hardware there.
I may be wrong, but I think the CPU is only responsible for Android TV UI drawing and ancillary functions (like HDMI-CEC for example).

I meant the Mac. And I meant the software process for tone mapping: technically the GPU is doing the heavy lifting, but there is much in the way of acceleration to help it.
 
I am testing the macOS Catalina beta 7 on a 2019 iMac 5K, and I can report that 4K HDR material is handled without any problem by QuickTime, and this time with tone mapping!

And it is definitely hardware decoded.

I think the Intel Quick Sync hardware included in the Intel CPU is put to very good use here.

The main difference from macOS Mojave is that tone mapping is handled directly by QuickTime. Colors pop more and are beautiful, at the maximum a 500-nit LCD display can deliver... Colors were washed out in QuickTime on macOS Mojave by comparison.

What is interesting is that my 2017 MacBook (despite its Kaby Lake CPU) seems to need software decoding to handle this 4K HDR material, whereas on the 2019 iMac 5K it seems to be hardware decoded only. The CPU doesn't even blink when you throw 4K HDR material at it...
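
If anyone wants to check rather than infer from CPU load: VideoToolbox will report whether the OS has a hardware decoder for a given codec (4K HDR files are typically HEVC). Note this only says a hardware decoder exists, not that QuickTime actually used it for a particular file:

```swift
import CoreMedia
import VideoToolbox

// Ask VideoToolbox whether a hardware HEVC decoder is available.
// Most 4K HDR material is HEVC. Requires macOS 10.13 or later.
let hevcInHardware = VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC)
print("Hardware HEVC decode available: \(hevcInHardware)")
```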
 
What is interesting is that my 2017 MacBook (despite its Kaby Lake CPU) seems to need software decoding to handle this 4K HDR material, whereas on the 2019 iMac 5K it seems to be hardware decoded only. The CPU doesn't even blink when you throw 4K HDR material at it...
I am not surprised at all. Contrary to what was posted earlier in this thread (and corrected afterwards), the new 2019 iMacs are equipped with an iGPU named Intel HD Graphics CFL, and I have seen what it can do in video decoding and encoding. It is very impressive, and it makes a huge difference in anything relying on it. Its activity is also displayed in iStat Menus (App Store version) and Intel Power Gadget.
 
Colors were washed out in QuickTime on macOS Mojave by comparison.
That is because QT on Mojave does not know to use the PQ EOTF that HDR requires. It still decodes with the 2.2 gamma EOTF used by Rec.709. The effect is similar to looking at log footage on a Rec.709 display.
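
The mismatch is easy to see numerically: PQ (SMPTE ST 2084) maps a code value to an absolute luminance on a 0 to 10,000-nit scale, while a 2.2 gamma curve maps it to a relative 0 to 1 signal (scaled here to a 100-nit SDR display for comparison). A sketch of the two transfer functions:

```swift
import Foundation

// SMPTE ST 2084 (PQ) EOTF: nonlinear code value in 0...1 -> absolute nits.
func pqEOTF(_ e: Double) -> Double {
    let m1 = 2610.0 / 16384.0
    let m2 = 2523.0 / 4096.0 * 128.0
    let c1 = 3424.0 / 4096.0
    let c2 = 2413.0 / 4096.0 * 32.0
    let c3 = 2392.0 / 4096.0 * 32.0
    let p = pow(e, 1.0 / m2)
    return 10_000.0 * pow(max(p - c1, 0) / (c2 - c3 * p), 1.0 / m1)
}

// Simple 2.2 gamma EOTF, scaled to a 100-nit SDR display for comparison.
func gamma22EOTF(_ e: Double) -> Double { 100.0 * pow(e, 2.2) }

// A PQ code value of ~0.58 is roughly 203 nits (common HDR reference white).
for code in [0.25, 0.50, 0.58, 0.75, 1.0] {
    print(String(format: "code %.2f -> PQ %7.1f nits vs gamma-2.2 %5.1f nits",
                 code, pqEOTF(code), gamma22EOTF(code)))
}
```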
 
This will be my last entry in this thread. What is amazing is that almost everyone has been correct almost all the time. I mean this as high praise for all the effort and work on this "puzzle" that is missing so many pieces.

I saw where "Priitv8" posted their CPU specs on what has to be a 2019 27" iMac, which totally blew up my theory that ALL 2019 27" iMac i9s had zero onboard graphics. I am guessing again: if they run out of the "KF" processor with no graphics, they switch to the "K" with graphics. That, though, is one BIG crap shoot. The one I just returned was a "KF"; "KF" is listed as the processor Apple uses in this machine, and I also saw that HP, Dell, and others are using the "KF" processors.
Now the part that is out of my depth for me. Even if you have the "K" series, the Macs we use still do not have HDR displays. Sure, they are 5K and 4K Retina, eye-watering displays, but they still have no HDR, that is, High Dynamic Range of colors. Also, and I could be very wrong, but unless you can "trick" your 2019 i9 iMac into thinking it is running on battery, the computer will default to the Vega 48. The MacBook Pro switches to the Intel GPU on battery for obvious reasons, and I am not sure if that can be overridden. Last is the FACT that 4K UHD DRM (Digital Rights Management) only works as a whole if every component is 4K UHD DRM certified. That means the cables too. So the first thing is to get an external monitor that fits all the specs. And so on...
Back to Apple. I am sure of one thing, and that is they are not trying to mess up our ability to play 4K UHD DRM media; the studios already made this crappy mess. Microsoft OS computers all have the same compatibility problems. What they use is Intel SGX tech, and Apple will not go near it. It is Intel's version of the "T2" chip: it handles security and handshake functions that help unlock the DRM side of the problem. It has been riddled with flaws, and that might be a reason Apple decided not to risk security just to play a 4K movie. I myself would rather have the "T2" chip. I am sure they are working on it. It can be fixed through the GPU and, as I said, an external monitor.
I do not care what anybody claims about what Windows 10 can do. IF any component connected to their Windows 10 computer system is not 4K HDR DRM compliant, it will not work. When we run Boot Camp / Parallels / Windows 10, we may be seeing a movie or TV show; we just are not seeing it properly.
Now I am going to just have fun doing all the things it can do, wait a few months, and see how Catalina affects these obstacles.

The one thing I can say about Apple is that I no longer have an extra bedroom full of all the crap that was "Windows compatible" that I could never get working!


Cheers!
 
The mystery deepens?

Today I updated my iMac Pro and my 2014 Mac mini. In the screenshot on the left, you'll see my iMac Pro has a new category in the TV app for "4K & HDR", while no such category exists for the mini (as expected). Interestingly, the mini does show applicable individual titles as being 4K/HDR.

My 2018 MacBook Pro displays the same "4K & HDR" category as the iMac Pro, but my 2017 MacBook behaves like the mini, i.e. no category.
 

Attachments

  • HDR.png
Today I updated my iMac Pro and my 2014 Mac mini. In the screenshot on the left, you'll see my iMac Pro has a new category in the TV app for "4K & HDR"
Does this mean that 4K movies could be played on a recent iMac?
 
My iMac Pro and 13" MacBook Pro also show the Apple TV icons. I have not opened them yet. I have a week of tasks in front of that.
But... remember... beware.
4K digital rights media (HDCP 2.2, i.e. copy protection) plus HDR, and this is the BIG ONE, is why it is not at the top of my list. For these things to work in ANY format (unless Apple found a way to strip the protection or made a way for our hardware to play 4K commercial content), HDR only works on HDR-enabled displays. None of the Apple displays are HDR capable. This is true of 95-plus percent of all computer displays. Apple will be able to control the copy protection on the shows and movies they make themselves, but not on the shows they have contracted to show from outside studios.

NOTE: Macs and 1080p Blu-ray content work great.

My concern, and this will take time, is that nobody (next to nobody) can play a commercial 4K disc on any computer on any OS. The Blu-ray drives have all been sabotaged by firmware not to read these discs. NOTE: your standard LG / ASUS Blu-ray drive can read 4K discs if you can get the firmware back to its original release. Read, not record (i.e., write). By the way, the "4K Ultra BD" drives are firmware locked too.
Then there is the problem that 4K Ultra discs (and most commercial media) are encoded to only work on systems where every component is compatible with their strict software/hardware design. On my MacBook Pro, because of the Intel onboard UHD graphics, I was able to play 4K Netflix. The problem was I got a good picture, but it was not 4K or HDR, because my screen does not support that. I do not have an external monitor that is 4K HDR, HDCP 2.2, HDMI 2.0, Dolby Vision, HDR10 compliant. If even a cable or any connection is not listed as compliant, the system will not work. With streaming, I think it defaults to the 1080p version.

I have not tried it on the iMac Pro. All this is very time consuming, and I have spent way too much time on it. I just wanted to make backup copies stored on hard drives and edit "best of" TV episodes. One day I will.
Also, why bother playing 4K on these little screens? I have a Samsung 65" 4K HDR (HDMI 2.0, HDCP 2.2, etc.) with a matching 4K player, and it can be very spectacular. Many times the studios are just putting the 1080p video on a 4K disc and charging $10 more for the same content. That means they NEVER went back to the negative or master copy (however good that is) to create a superior product. If it was shot in 2K, you cannot make it 4K after the fact (except in name only)... a sloppy example.
Do not forget about bandwidth. 1080p content looks pretty good, and it uses half or less of the bandwidth before the evil (cable / phone / satellite) internet providers start to throttle down our speed, OR raise the price. Cord cutting is just a survival tactic for all the added utility fees we face now in our brave new world.

The machine that will be able to do everything we want (with lots of upgrades) is the new Mac Pro. The basic model will not do anything more than our iMac Pros; it will need Vega II graphics and who knows what else. I, for one, am not ever spending that kind of money on anything except a kidney transplant or world peace.

Cheers !!!
 
Does this mean that 4K movies could be played on a recent iMac?

I don't really know. I played back a few 4K + HDR movies in iTunes on my iMac Pro. I'm tempted to say it's only 1080p, but the dynamic range from HDR did seem fairly bright.

I'm not in the habit of watching iTunes movies on my iMac Pro, so I can't give a comparison of the playback quality before vs. after Catalina.
 
The iMac Pro has a 5K monitor but ZERO HDR capabilities. HDR is a high dynamic range of colors; it makes colors more "pronounced," for lack of better words. There is HDR10, supported by Samsung, and Dolby Vision HDR, supported by Sony, LG, and more. Another tech pissing match. The MacBook Pro models are not even 4K. They have great screens, but not 4K or HDR of any "flavor".

My best guess is that the streaming service can tell what equipment it is streaming to and sends a compatible signal. If the video processor and/or screen is good quality (and Apple uses the best hardware that will work in their strict environment), the picture will look great. There are utilities that will tell you exactly what your machine is doing with the software / streaming video.

I want to point out one thing about Apple. They are FAR FROM PERFECT. BUT... they really do care about privacy and security. Windows also cares, but they try to be all things to all people, and there is no way to plug all the "holes" that thrive in that OS environment. Look at Ubuntu Linux: it is even more strict/safe than Apple's OS. The "T2" chip is the beginning of Apple's way of expanding what it can safely offer without compromising security. ID theft will ruin a person's life. I became an Apple customer after years of Windows going from bad to horrible. Windows XP was as good as it ever got, and it needed a ton of security software and money to be "safe": constant compatibility problems / crashes / bloatware / pirateware sneaking through, but all in all I was content. THEN WINDOWS 2000, VISTA... need I say more? I paid hundreds of dollars for failed OSes, and Microsoft never even offered a discount to all of us who were loyal and got burned.

Our machines and OSes are in most cases very safe; the hardware has $$$ value as a trade-in, or is at least recyclable so as not to choke our planet with more plastic. They (in my case) last for years, and I CAN TURN IT ON AND EVERYTHING JUST WORKS!

Cheers!!!

P.S. I am not some kind of Apple fanatic. Far from it. I am just the sum of my experiences.
 
NOTE: your standard LG / ASUS Blu-ray drive can read 4K discs if you can get the firmware back to its original release.

There are instructions on how to do this on avsforum.com. You can also find individuals who will do it for you.
 
The iMac Pro has a 5K monitor but ZERO HDR capabilities. HDR is a high dynamic range of colors; it makes colors more "pronounced," for lack of better words. There is HDR10, supported by Samsung, and Dolby Vision HDR, supported by Sony, LG, and more. Another tech pissing match. The MacBook Pro models are not even 4K. They have great screens, but not 4K or HDR of any "flavor".

You are incorrect. The iMac Pro can indeed display the DCI-P3 color gamut, which is what is used to master HDR content.


^^ In the red "sRGB vs. P3" test image, you can easily see the WebKit logo on the iMac Pro, whereas on non-HDR displays (like my 2017 MacBook) it looks like an equally flat red background.
 
You are incorrect. The iMac Pro can indeed display the DCI-P3 color gamut, which is what is used to master HDR content.
To really be an HDR display, there is more to it than color gamut. An enlarged color gamut actually has nothing to do with dynamic range.
To classify as an HDR display, an LCD panel also needs a minimum of 1000 nits brightness; OLED can get away with 540 nits.
This is what the iMac 5K display is lacking.
It was just decided to market Wide Color Gamut and High Dynamic Range together, so that the resulting images are improved in both directions (brightness × gamut = color volume).
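
That brightness × gamut point can be roughed out numerically: take the area of the gamut triangle in the CIE 1931 xy plane for Rec.709 vs DCI-P3 primaries, multiply by the luminance range in stops, and compare. This is a crude illustration of the idea, not a standardized color-volume metric:

```swift
import Foundation

// Area of a chromaticity triangle in the CIE 1931 xy plane (shoelace formula).
func gamutArea(_ r: (Double, Double), _ g: (Double, Double), _ b: (Double, Double)) -> Double {
    abs(r.0 * (g.1 - b.1) + g.0 * (b.1 - r.1) + b.0 * (r.1 - g.1)) / 2
}

let rec709 = gamutArea((0.640, 0.330), (0.300, 0.600), (0.150, 0.060))
let dciP3  = gamutArea((0.680, 0.320), (0.265, 0.690), (0.150, 0.060))

// In the xy plane the P3 triangle works out roughly a third larger than Rec.709.
print(String(format: "P3 gamut is %.0f%% larger than Rec.709 (in xy)", (dciP3 / rec709 - 1) * 100))

// Crude "color volume" figure of merit: gamut area x luminance range in stops.
// An illustration of the concept only, not a standardized metric.
func volume(area: Double, peakNits: Double, blackNits: Double) -> Double {
    area * log2(peakNits / blackNits)
}
let sdr = volume(area: rec709, peakNits: 100, blackNits: 0.1)   // SDR reference
let hdr = volume(area: dciP3, peakNits: 1000, blackNits: 0.05)  // UHD Premium LCD
print(String(format: "HDR/WCG 'volume' is %.1fx the SDR one", hdr / sdr))
```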
 

There is a difference between something classified as an official "HDR display" and a monitor that can display HDR content. The iMac Pro (and the MacBook Pros) qualify as the latter.
 
I can confirm there is no 4K HDR streaming support on my 2017 iMac in Catalina. :(

To classify as an HDR display, an LCD panel also needs a minimum of 1000 nits brightness; OLED can get away with 540 nits.
Technically that’s true but practically it’s not. It doesn’t have to reach the full 1000 nits to be effective for HDR.
 
I can confirm there is no 4K HDR streaming support on my 2017 iMac in Catalina. :(
Anyone with a 2019 iMac 5K? I would do the testing myself, but I do not plan on updating to Catalina soon.
 