Been down this hellhole of a route before, annoyingly.

My journey...

So any 100% pixel scaling looks like garbage to me today on anything I can find. It's difficult going back and forth to that from any smartphone, for example. On the PC front, you can get a 27" 2560x1440 monitor. The pixels are too damn big on anything and it looks like crap on macOS. Fortunately that monitor blew up, so I bought a 24" Dell 2560x1440. That works OK-ish at 125% on Windows, i.e. enough pixel density not to kill your eyes too badly. My daughter runs that on her desktop PC now.

Then I bought a 27" 4K monitor, firstly a cheap Iiyama one just to see. It works OK on Windows at 150% but has sharpness control issues, so it got sent back. I bought a Dell P2723DE 27" 4K instead. The sharpness issue was gone, but 150% scaling still has a bunch of issues on Windows: sometimes things turn up on the screen at the wrong size thanks to the whole hack job that is high DPI on Win32 apps. On the Mac it's just a little blurry.

At that point I was using an M1 Mini with that monitor. Then I bought an M1 Pro MacBook and went, "oh, OK, that's what it's supposed to look like." The Studio Display came out, I winced, and I reluctantly just blew the cash on one.

So your options on the Mac are really the following, if you actually want something that isn't blurry or a piece of junk:

1. Get a 24" iMac. Seriously, that's probably the cheapest option: it's essentially a Mac mini with a 24" 4.5K display attached, so it's spot on.
2. Get the Mini and a Studio Display.
3. If you want to cheap out, get the Mini and an LG 5K display.
4. Just use a MacBook and be done with it.

Everything else is a horrible, horrible compromise that I'm no longer willing to make.

I slowly evolved to a 16GB/512GB M4 Mini and the Studio Display. It's not cheap, but it's good. Really damn good.
While reading through this thread I was thinking to myself how grateful I am to have my Studio Display and M2 Pro Mini, and that I did not have to replicate your journey. Lesson learned years ago: if you're going to do Apple, keep it all Apple and save yourself the aggravation. I had that journey too, but in the Windows world, building high-end gaming PCs as a hobby. My daily drivers have been Macs since 2004.

Starting with a 2012 Mini and then a 2014 Mini, those worked fine with 1080p monitors, but then Apple went Retina. When I upgraded to a 2018 Mini along with an LG 24" 4K UltraFine, the sad Intel UHD 630 graphics did not have a chance at driving that display satisfactorily. I looked at cheaper eGPU options but bought the Blackmagic eGPU (Radeon Pro 580) because it was Apple supported: drivers, OS support, etc. It ran the latest OS right up until the Intel Mini was replaced with an M4 Mini, which now drives the 24" 4K monitor with full Apple support and functionality. That setup is my bride's desktop now.

I have an M2 Pro Mini with a Studio Display and have enjoyed the combo since release.
 
Good for you! It’s likely because your 4K screens support HDMI 2.1 or another recent revision capable of delivering excellent performance. However, with non-retina displays using older HDMI circuitry, the difference in quality compared to DisplayPort can be quite noticeable.

Since switching to DisplayPort, I now enjoy a perfectly acceptable image. There’s ample information available highlighting DisplayPort’s superiority for computer displays, particularly with earlier iterations of both standards.

A key distinction lies in how they handle data: DisplayPort transmits data in packets, much like Ethernet, whereas HDMI sends data continuously. While the difference may be negligible for moving images, for still images —such as small fonts, menus, or UI elements— older HDMI connections often lack the clarity and stability provided by DisplayPort, which was specifically designed to support high-resolution computer monitors and demanding display applications.
That’s just not true. Yes, I believe you when you say that using DP solved your problems, but you don't understand why that happened, and it can be helpful to understand what's going on.

There is zero image quality difference between HDMI and DisplayPort. Period. It doesn’t matter whether you transmit uncompressed video in packets, or what the packet header size is or any of that. Besides that, in HDMI 2.1, the data is transmitted in packets. Transmitting in packets is great when you're sending a signal which is interleaved with, say, USB packets, right? But baseband video needs to be transmitted continuously. Even if you're sending a still image, you're transmitting that still image to the display 60 times a second, over and over. That's just how it works over a cable to a monitor.

What's really causing the difference can be many things. As some people said, Apple sometimes transmits to HDMI displays in YCbCr mode instead of RGB. This is often better for transmitting video to a television, particularly because digital video like H.264 compresses and decompresses into YCbCr, so you avoid converting to RGB and back again. And it's way better in a product like Apple TV, because it can mix the decompressed video with the graphics overlay all in YCbCr. But YCbCr can be inferior at displaying black text against a white background; you can get artifacts around the text, and you really don't want to use YCbCr for computer productivity.
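
One part of this is easy to see with a toy model. Here's a rough sketch (entirely my own illustration, using NumPy; the "text stroke with colored subpixel fringes" image and the 2x2 chroma averaging are assumptions, not any real pipeline): a plain RGB<->YCbCr round trip is essentially lossless, but once the link also throws away chroma resolution the way 4:2:0 does, color gets smeared around fine edges like text.

Code:
import numpy as np

def rgb_to_ycbcr(rgb):
    """Full-range BT.709 RGB (0..1) -> YCbCr (Y in 0..1, Cb/Cr centered on 0.5)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556 + 0.5
    cr = (r - y) / 1.5748 + 0.5
    return np.stack([y, cb, cr], axis=-1)

def ycbcr_to_rgb(ycc):
    y, cb, cr = ycc[..., 0], ycc[..., 1] - 0.5, ycc[..., 2] - 0.5
    r = y + 1.5748 * cr
    b = y + 1.8556 * cb
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)

def chroma_420(ycc):
    """Average Cb/Cr over 2x2 blocks and repeat them back: a crude 4:2:0."""
    out = ycc.copy()
    h, w = ycc.shape[:2]
    for c in (1, 2):
        blocks = ycc[..., c].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        out[..., c] = np.repeat(np.repeat(blocks, 2, axis=0), 2, axis=1)
    return out

# White background with a black 1-px "text" stroke and colored subpixel fringes.
img = np.ones((8, 8, 3))
img[:, 4, :] = 0.0                # the stroke itself
img[:, 3, :] = (1.0, 0.6, 0.6)    # reddish fringe on one side
img[:, 5, :] = (0.6, 0.6, 1.0)    # bluish fringe on the other

full = ycbcr_to_rgb(rgb_to_ycbcr(img))               # 4:4:4 round trip
sub = ycbcr_to_rgb(chroma_420(rgb_to_ycbcr(img)))    # 4:2:0 round trip
print("4:4:4 max error:", np.abs(full - img).max())  # ~0: the conversion alone costs nothing
print("4:2:0 max error:", np.abs(sub - img).max())   # clearly nonzero: color smeared around the stroke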

Another problem is that there are TONS of products on the market that purport to be "converters" supporting HDMI 4K60, but what they don't say is that they only support HDMI 4K60 4:2:0... so you get either 4K30 4:4:4 or really bad 4K60 4:2:0. And another problem is that most older and some current 4K60 monitors support DP 1.2 and HDMI 2.0, which means the DP link can support 4K60 10-bit, but HDMI only reaches 4K60 8-bit.
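
For that last point, the arithmetic is easy to check. A rough sketch (my own numbers, pulled from the commonly quoted specs: about 14.4 Gbit/s effective for HDMI 2.0 and 17.28 Gbit/s effective for DP 1.2 HBR2 after 8b/10b coding; a 594 MHz pixel clock for 4K60 with standard CTA-861 blanking, or about 533 MHz with the reduced blanking typically used over DisplayPort):

Code:
# Required link rate = pixel clock x bits per pixel (ignoring small protocol overheads).
def needed_gbps(pixel_clock_hz: float, bits_per_pixel: int) -> float:
    return pixel_clock_hz * bits_per_pixel / 1e9

for name, clock in [("CTA-861 blanking", 594e6), ("reduced blanking", 533.25e6)]:
    for bpp in (24, 30):  # 8-bit vs 10-bit RGB / 4:4:4
        need = needed_gbps(clock, bpp)
        print(f"4K60 {bpp} bpp, {name}: {need:.2f} Gbit/s "
              f"(fits HDMI 2.0: {need <= 14.4}, fits DP 1.2: {need <= 17.28})")

# 24 bpp fits both links; 30 bpp doesn't fit HDMI 2.0 at all, and only fits
# DP 1.2 when reduced blanking is used -- which matches the 10-bit vs 8-bit split above.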
 
I used to force RGB mode on my M1 Mini, first using that old .plist trick... then later with the more reliable EDID override feature of BetterDisplay.

I now have an M4 Mac Mini, and none of these techniques work. I don't understand why Apple keeps making their HDMI output worse.
It seems there may no longer be any way to force RGB on the HDMI port. I guess I'll have to waste money and a TB4 port on a USB-C to DP adapter + DP to HDMI adapter just to get output that doesn't look like total washed out garbage.
 
I used to force RGB mode on my M1 Mini, first using that old .plist trick... then later with the more reliable EDID override feature of BetterDisplay.

I now have an M4 Mac Mini, and none of these techniques work. I don't understand why Apple keeps making their HDMI output worse.
It seems there may no longer be any way to force RGB on the HDMI port. I guess I'll have to waste money and a TB4 port on a USB-C to DP adapter + DP to HDMI adapter just to get output that doesn't look like total washed out garbage.
I agree; Apple should make it easy to force RGB. All HDMI displays support RGB.

Which display are you using?
 
That’s just not true. Yes, I believe you when you say that using DP solved your problems, but you don't understand why that happened, and it can be helpful to understand what's going on.

For many older monitors, a DisplayPort connection is less problematic and consistently delivers better image quality on Apple Silicon Macs, even though HDMI theoretically offers the same potential. For example, my monitor defaults to 8-bit YPbPr over HDMI unless I use BetterDisplay to force RGB—yet even then, 10-bit remains unavailable. In contrast, DisplayPort provides RGB with 10-bit support by default.

While HDMI and DisplayPort can theoretically deliver the same image quality, this doesn’t hold true in every scenario.
 
In several threads, people have said the HDMI chipset is the same on the 2018 Mini and the M1 Mini, maybe others. There was lots of discussion of HDMI problems on the 2018 Mini, so I used USB-C to DisplayPort when I set up mine. Later, I wanted one more USB-C port free, so I tried an HDMI cable with my monitor (BenQ DesignView 32" 2560x1440). To my eyes it looked exactly the same; maybe there would be a noticeable difference on a 4K screen? But there's a one-pixel-wide green line at the right edge of the screen with HDMI. I didn't even notice it at first and it doesn't bother me; I could probably crop it out in the settings.

But the second thing is weird: the 2018 Mini boots much faster with the display plugged into HDMI. And I mean dramatically faster, like 15 to 30 seconds IIRC. It may be because I also have three USB-C SSDs connected?
 
Omg... so the EDID method doesn't work anymore, but I didn't realize BetterDisplay now has a direct way to select color mode. Not sure why EDID override no longer works... but it doesn't matter. Just select RGB Full. Problem solved!

[Screenshot: BetterDisplay's color mode options for the display]
 
I used to force RGB mode on my M1 Mini, first using that old .plist trick... then later with the more reliable EDID override feature of BetterDisplay.

I now have an M4 Mac Mini, and none of these techniques work. I don't understand why Apple keeps making their HDMI output worse.
It seems there may no longer be any way to force RGB on the HDMI port. I guess I'll have to waste money and a TB4 port on a USB-C to DP adapter + DP to HDMI adapter just to get output that doesn't look like total washed out garbage.
I just set up a new M4 Mac Mini by restoring everything from an M2 Mac Mini backup. The plist hack that I configured on the M2 works on the M4 as well, and I didn't have to re-do anything.
 
Using my Dell U2713H monitor with my Mac Mini M4, it's clear that the USB-C to DisplayPort connection offers better color depth and avoids the drawbacks of using YPbPr, such as excessive saturation and sharpening.


[Screenshot: display info over USB-C to DisplayPort]

[Screenshot: display info over HDMI]
 
For many older monitors, a DisplayPort connection is less problematic and consistently delivers better image quality on Apple Silicon Macs, even though HDMI theoretically offers the same potential. For example, my monitor defaults to 8-bit YPbPr over HDMI unless I use BetterDisplay to force RGB—yet even then, 10-bit remains unavailable. In contrast, DisplayPort provides RGB with 10-bit support by default.

While HDMI and DisplayPort can theoretically deliver the same image quality, this doesn’t hold true in every scenario.
I realize that you don’t want to be wrong, but what I said is that the differences people are seeing between the two connections has nothing to do with the design or architecture of the two standards. DisplayPort does not and cannot provide an intrinsically better video signal.

At the same time, my monitor is an LG 4K60 with HDMI and DisplayPort, and I definitely get better results with DisplayPort.

Many of the primary contributors to the current DisplayPort and HDMI specifications are the same people.
 
A 1440p monitor displayed at its native resolution is perfect UI-wise but obviously not as crisp due to the lower PPI. 27" is probably fine. 32" I imagine doesn't look particularly good.

Yeah, I've got a 27" Philips PHL 272B8Q (2560x1440) at work connected via HDMI to my 16" M2 MBP. Looks and works totally OK. (It costs one tenth of an Apple Studio Display here in Sweden.)
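
If anyone wants to put numbers on the crispness point, pixel density is just the diagonal pixel count over the diagonal size. A quick sketch (my own helper, with sizes mentioned in this thread):

Code:
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from resolution and panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

for label, w, h, d in [
    ("27in 1440p", 2560, 1440, 27),
    ("32in 1440p", 2560, 1440, 32),
    ("27in 4K",    3840, 2160, 27),
    ("27in 5K",    5120, 2880, 27),
]:
    print(f"{label}: {ppi(w, h, d):.0f} PPI")

# ~109, ~92, ~163 and ~218 PPI respectively; macOS's 2x "Retina" rendering
# is designed around roughly that last figure.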
 
I realize that you don’t want to be wrong, but what I said is that the differences people are seeing between the two connections has nothing to do with the design or architecture of the two standards. DisplayPort does not and cannot provide an intrinsically better video signal.
I understand your perspective and appreciate the information you’re relying on. My view comes from direct personal experience, which sometimes reveals nuances that purely theoretical or generalized information might miss. It’s always interesting to see how theory and reality align—or don’t.
 
I understand your perspective and appreciate the information you’re relying on. My view comes from direct personal experience, which sometimes reveals nuances that purely theoretical or generalized information might miss. It’s always interesting to see how theory and reality align—or don’t.
You stated that HDMI video quality is inferior because it does not send packets. That is not correct.

Also, YPbPr is an analog video standard. HDMI and DisplayPort don't send analog video, they send digital video, as either RGB or YCbCr.
 
You stated that HDMI video quality is inferior because it does not send packets. That is not correct.

Also, YPbPr is an analog video standard. HDMI and DisplayPort don't send analog video, they send digital video, as either RGB or YCbCr.
You're absolutely right on both counts.

When I mentioned DP packets, my intent was to highlight the differences between HDMI and DisplayPort, not necessarily to attribute those differences to that aspect.
As for HDMI and DisplayPort both being digital, I’m fully aware of that. My mistake was confusing YPbPr with YCbCr—even after displaying the image with the options in BetterDisplay. :)
 
@Alameda "There is zero image quality difference between HDMI and DisplayPort. Period."
"I realize that you don’t want to be wrong, but what I said is that the differences people are seeing between the two connections has nothing to do with the design or architecture of the two standards. DisplayPort does not and cannot provide an intrinsically better video signal."


However, outside of the theoretical sphere, in the real world of computer monitors available for sale, there are huge differences in
a) the signal path architecture of the video controller and T-con chips, often sourced from mass-producers like RealTek etc, and
b) the quality of the firmware which enables the differing signal paths for different video protocols. This firmware originates from the chip manufacturer reference design and is then reworked by the monitor manufacturer as they think fit...

This is further compounded by the continued reuse of older design paradigms which date from the era when DP 1.3/1.4 had an advantage over HDMI 1.4/2.0.
Mass market monitor chips that originated in the 2015-20 era were primarily designed for the DP protocol, and the HDMI capabilities came as more of an afterthought by tweaking the firmware.
And cheaper monitors are still using these controllers.

Or more expensive ones like the LG 27" TB 5K Mac-centric one.
Realtek did the basic reference design in 2014 when it announced the controller chip, rated for 4K/144Hz at the time.
LG turned it into a 5K TB monitor, with spin-off expertise from their iMac 5K screen panel collaboration with Apple.
In 2019 LG added HDMI 2.0 4K as an afterthought, once Realtek had developed the firmware needed, and it's still languishing today, more or less unchanged for a decade.

So to get "zero difference... Period." means you have to stick to the brands that are committed to Display Quality (with higher cost), rather than the readily available mass market products.

By the by, Apple are also a guilty party to all this afterthought mindset IMHO...
 
@Alameda "There is zero image quality difference between HDMI and DisplayPort. Period."
"I realize that you don’t want to be wrong, but what I said is that the differences people are seeing between the two connections has nothing to do with the design or architecture of the two standards. DisplayPort does not and cannot provide an intrinsically better video signal."


However, outside of the theoretical sphere, in the real world of computer monitors available for sale, there are huge differences in
a) the signal path architecture of the video controller and T-con chips, often sourced from mass-producers like RealTek etc, and
b) the quality of the firmware which enables the differing signal paths for different video protocols. This firmware originates from the chip manufacturer reference design and is then reworked by the monitor manufacturer as they think fit...

This is further compounded by the continued reuse of older design paradigms which date from the era when DP 1.3/1.4 had an advantage over HDMI 1.4/2.0.
Mass market monitor chips that originated in the 2015-20 era were primarily designed for the DP protocol, and the HDMI capabilities came as more of an afterthought by tweaking the firmware.
And cheaper monitors are still using these controllers.

So to get "zero difference... Period." means you have to stick to the brands that are committed to Display Quality (with higher cost), rather than the readily available mass market products.

By the by, Apple are also a guilty party to all this afterthought mindset IMHO...
Yes, there are monitors which support higher resolution or bandwidth over their DP interface, but I don't think the signal path architecture causes differences because these two interfaces are so similar; they just receive uncompressed RGB or YCbCr and it's passed to the TCON without any buffering. If computer monitors buffered the video (as most televisions do), there would be too much latency. This is why HDMI added the ALLM mode, so that the transmitter can signal that it's a game and wants low latency.
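
For a sense of scale on the buffering point (simple arithmetic, not measured numbers): every whole frame a display holds before showing it costs one refresh period.

Code:
def buffering_latency_ms(frames_buffered: int, refresh_hz: float) -> float:
    """Latency added by holding whole frames before display."""
    return frames_buffered * 1000.0 / refresh_hz

print(buffering_latency_ms(1, 60))  # ~16.7 ms for a single buffered frame at 60 Hz
print(buffering_latency_ms(2, 60))  # ~33.3 ms -- enough to make the mouse feel laggy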

But it is important to understand that there are no quality differences between the two approaches; their payloads are identical.

Just as there are individual cases where a display delivers better performance with DisplayPort, there are many cases where HDMI performs better. In particular, HDMI products are typically far more robust at implementing audio and HDCP interoperability, implementing repeaters, and, of course, the audio return feature, which DisplayPort doesn't have at all. Even so, you're right that a lot of Mac display problems can be solved by using the DisplayPort cable instead of HDMI.

Apple now has HDMI 2.1 technology in the M4 and M4 Pro chips, and this will likely migrate into the other products, as time progresses. Whether Apple continues to prefer YCbCr over RGB is anyone's guess. I'm surprised they don't give users a way to force RGB.
 
@Alameda "There is zero image quality difference between HDMI and DisplayPort. Period."
"I realize that you don’t want to be wrong, but what I said is that the differences people are seeing between the two connections has nothing to do with the design or architecture of the two standards. DisplayPort does not and cannot provide an intrinsically better video signal."


However, outside of the theoretical sphere, in the real world of computer monitors available for sale, there are huge differences in
a) the signal path architecture of the video controller and T-con chips, often sourced from mass-producers like RealTek etc, and
b) the quality of the firmware which enables the differing signal paths for different video protocols. This firmware originates from the chip manufacturer reference design and is then reworked by the monitor manufacturer as they think fit...

This is further compounded by the continued reuse of older design paradigms which date from the era when DP 1.3/1.4 had an advantage over HDMI 1.4/2.0.
Mass market monitor chips that originated in the 2015-20 era were primarily designed for the DP protocol, and the HDMI capabilities came as more of an afterthought by tweaking the firmware.
And cheaper monitors are still using these controllers.

Or more expensive ones like the LG 27" TB 5K Mac-centric one.
Realtek did the basic reference design in 2014 when it announced the controller chip, rated for 4K/144Hz at the time.
LG turned it into a 5K TB monitor, with spin-off expertise from their iMac 5K screen panel collaboration with Apple.
In 2019 LG added HDMI 2.0 4K as an afterthought, once Realtek had developed the firmware needed, and it's still languishing today, more or less unchanged for a decade.

So to get "zero difference... Period." means you have to stick to the brands that are committed to Display Quality (with higher cost), rather than the readily available mass market products.

By the by, Apple are also a guilty party to all this afterthought mindset IMHO...
While there are companies that cheap out by using, e.g., lower-bandwidth HDMI 2.1 chips, those often still have bandwidth equal to or higher than DP 1.4, so in practice the display capabilities over both port types are identical, and there is no perceived difference in image quality.

I can see discrepancies in behavior between HDMI and DisplayPort on macOS, but they are not in the actual image quality; they're in what options macOS supports. For example, I recently ran into an issue where, over the native HDMI port, I cannot do 2560x2160 (a 32:27 aspect ratio) scaled to "looks like 1920x1620" HiDPI on my Samsung G95NC super-ultrawide set to Picture-by-Picture mode. But the same setup works just fine over a USB-C to DisplayPort adapter.

To add insult to injury, if I change the display's PbP split to 2x 16:9 (3840x2160), the HiDPI scaling works as expected over both HDMI and DP. This points to macOS having bugs or some pre-defined behavior when handling different resolutions, when it should have a generic solution that works regardless of resolution. It basically does not know what to do with the oddball 2560x2160 because it does not match any standard display resolution.
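
For what it's worth, the arithmetic behind those "looks like" modes is simple, going by my understanding that macOS renders HiDPI modes into a 2x backing store and then scales that to the panel region (the function and names below are just my own illustration):

Code:
def hidpi_mode(panel_w, panel_h, looks_like_w, looks_like_h):
    backing_w, backing_h = 2 * looks_like_w, 2 * looks_like_h  # rendered size
    downscale = backing_w / panel_w                            # GPU scaling factor to the panel
    px_per_point = panel_w / looks_like_w                      # device pixels per UI point
    return backing_w, backing_h, downscale, px_per_point

# The PbP half that fails over HDMI: 2560x2160 region, "looks like 1920x1620"
print(hidpi_mode(2560, 2160, 1920, 1620))  # (3840, 3240, 1.5, 1.333...) -- non-integer scaling
# The 16:9 split that works: 3840x2160 region, "looks like 1920x1080"
print(hidpi_mode(3840, 2160, 1920, 1080))  # (3840, 2160, 1.0, 2.0) -- a clean 2x mode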

Windows 11 just seems to be way, way more robust when it comes to handling both port types. I have never experienced any performance differences there.

I thought when the M2 series added the HDMI 2.1 port that all this ******** would be solved.
 
Using my Dell U2713H monitor with my Mac Mini M4, it's clear that the USB-C to DisplayPort connection offers better color depth and avoids the drawbacks of using YPbPr, such as excessive saturation and sharpening.


If you are experiencing excessive saturation and sharpening, then the more likely cause is that over HDMI your display goes into a different picture setting. So check the display's own settings menu.

The only noticeable difference you might see at 10 vs 8-bit color is that gradients are smoother at 10-bit. But even that requires your display to actually have a 10-bit panel, which might not be the case as many use 8-bit+FRC.
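
The gradient difference is easy to quantify. A quick sketch (purely illustrative; real panels add dithering, and 8-bit+FRC blurs the distinction further): count how many distinct levels survive across a full-width black-to-white ramp.

Code:
def distinct_levels(width_px: int, bits: int) -> int:
    """How many distinct quantized values a full-width 0->1 ramp produces."""
    levels = 2 ** bits
    return len({round(x / (width_px - 1) * (levels - 1)) for x in range(width_px)})

print(distinct_levels(3840, 8))   # 256  -> bands ~15 px wide on a 4K-wide ramp, visible
print(distinct_levels(3840, 10))  # 1024 -> bands ~4 px wide, much harder to see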
 
If you are experiencing excessive saturation and sharpening, then the more likely cause is that over HDMI your display goes into a different picture setting. So check the display's own settings menu.

The only noticeable difference you might see at 10 vs 8-bit color is that gradients are smoother at 10-bit. But even that requires your display to actually have a 10-bit panel, which might not be the case as many use 8-bit+FRC.

Thanks for your interest. I checked the settings, and over HDMI my display defaults to YCbCr. The image quality appears over-sharpened, and the colors are overly saturated. When I switch to RGB via the on-screen menu, the colors shift to green and pink, though the over-sharpened edges disappear. Using BetterDisplay, I can force RGB, but the image still looks oversaturated and remains 8-bit; however, the over-sharpening is gone. Adjusting the color profile on the monitor itself doesn't seem to make a difference.
 
Omg... so the EDID method doesn't work anymore, but I didn't realize BetterDisplay now has a direct way to select color mode. Not sure why EDID override no longer works... but it doesn't matter. Just select RGB Full. Problem solved!

[Screenshot: BetterDisplay's color mode options for the display]
This is really bad. The software either isn’t reading the EDID correctly, or it just makes bad assumptions.

For starters, it will only send the YCbCr in limited range. That's going to make computer graphics look awful. It's not a monitor limitation.

Then, the only 10-bit color mode drops the chroma down to 4:2:0. Maybe fine if you're sending video only, but the EDID rules are very clear that if the device accepts a 4:4:4 signal in a given mode, and if it indicates YCbCr support, then you can transmit 4:2:2 as 8-, 10-, or 12-bit color. But none of those modes are offered.

So given what this software is offering, only the 8-bit RGB is going to be usable, unless you are playing pure video, like a Blu-ray.
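
To make the limited-range point concrete, here's a tiny sketch (my own illustration): "video" range squeezes luma codes into 16-235, so if anything downstream treats those codes as full range, black and white both get pulled toward gray and everything looks washed out.

Code:
def full_to_limited(code: int) -> int:
    """Map a full-range 0-255 code into the limited 16-235 'video' range."""
    return round(16 + code * 219 / 255)

black, white = full_to_limited(0), full_to_limited(255)
print(black, white)              # 16 235
print(black / 255, white / 255)  # ~0.06 and ~0.92 -- a grayish black and a dim white
                                 # if the display expects full range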
 
Thanks for your interest. I checked the settings, and over HDMI my display defaults to YCbCr. The image quality appears over-sharpened, and the colors are overly saturated. When I switch to RGB via the on-screen menu, the colors shift to green and pink, though the over-sharpened edges disappear. Using BetterDisplay, I can force RGB, but the image still looks oversaturated and remains 8-bit; however, the over-sharpening is gone. Adjusting the color profile on the monitor itself doesn't seem to make a difference.
That's not normal at all. I'd look to see if that's a common issue with this display.
 
That's not normal at all. I'd look to see if that's a common issue with this display.
It’s not the display that defaults to YCbCr, it’s the Mac.

In an HDMI EDID, the device indicates whether it supports YCbCr, which is optional. RGB is mandatory.

Then the Mac decides whether to send RGB or YCbCr. The display doesn’t tell the Mac what to do, and RGB is always available in HDMI.

Same with this “Limited Color Range” feature. I don’t think the EDID even has an indicator for that mode, and for sure the EDID doesn’t limit the transmitting device from sending full range.

So I think 99% of these problems need to be chalked up to Apple, for sending limited range YCbCr video instead of full range RGB, and for failing to even give users a way to force RGB. Better yet, they could read the device’s ID over the EDID and determine whether it’s a TV or not, but at the least just provide a switch to choose between RGB and YCbCr.
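
For the curious, the YCbCr capability bits really are just a couple of flags in the EDID's CTA-861 extension block. A hedged sketch (my own code; the byte offsets follow the EDID/CTA-861 layout as I understand it, and however you actually pull the raw EDID bytes off the Mac is left out):

Code:
def ycbcr_support(edid: bytes) -> dict:
    """Report which YCbCr encodings the sink's CTA-861 extension advertises.
    RGB 4:4:4 isn't flagged anywhere because it's mandatory."""
    support = {"ycbcr_444": False, "ycbcr_422": False}
    if len(edid) < 128 or edid[126] == 0:             # byte 126 = number of extension blocks
        return support
    for i in range(1, edid[126] + 1):
        block = edid[128 * i: 128 * (i + 1)]
        if len(block) == 128 and block[0] == 0x02:    # 0x02 = CTA-861 extension tag
            support["ycbcr_444"] = bool(block[3] & 0x20)  # bit 5 of byte 3
            support["ycbcr_422"] = bool(block[3] & 0x10)  # bit 4 of byte 3
            break
    return support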
 