
Larabee119

Suspended
Sep 16, 2014
225
386
PS. I have no horse in this race. I have an M1 Pro and still only use one external monitor as a professional software developer. I wonder how many M1/M2 owners actually want support for two external monitors. Would M1/M2 owners sacrifice CPU or GPU performance to get it?

I'm running an M1 Max MacBook Pro 16in with three LG 27GP950s and also use the built-in display. I never feel like I have enough screen. I wish the M1/M2 could support more monitors. That's the reason I let my M1 Mac mini go. It's a beast of a machine for very little money.

My ideal setup is 6 monitors or 2 ultrawides, but for now no Apple Silicon chip can support that many, so I have to use two M1 Max MacBook Pros to drive them. That's the cost of working 3 jobs.
 
  • Like
Reactions: WP31

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,626
5,482
I'm running an M1 Max MacBook Pro 16in with three LG 27GP950s and also use the built-in display. I never feel like I have enough screen. I wish the M1/M2 could support more monitors. That's the reason I let my M1 Mac mini go. It's a beast of a machine for very little money.

My ideal setup is 6 monitors or 2 ultrawides, but for now no Apple Silicon chip can support that many, so I have to use two M1 Max MacBook Pros to drive them. That's the cost of working 3 jobs.
What do you do that requires 3 external monitors?
 

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
I think the external display units rely on having 4 GPU cores to reliably drive one 6K 60Hz monitor (i.e.: one pro display XDR, which is apple's benchmark) and 3 minimum for the internal display controller or HDMI 2.0 interface. On machines that don't have a display built in, the hdmi 2.0 port gets tied to the internal display unit.

So M1: 1 ext disp controller and 1 internal controller & 7/8 GPU cores:
4 to external display and 3/4 for the internal/hdmi 2.0 display out.
That's not how any of this works. GPU cores don't directly interact with displays. They're just computation engines.

GPUs (and CPUs, you can use either or both for this) draw into frame buffers. A frame buffer is simply a region of RAM containing pixel data. It is often associated with a physical display, but doesn't have to be - you can draw into a buffer which is never sent to a display at all. In fact, in macOS there is generally a "frame buffer" per window, and then the contents of all these window buffers are composited into a display-sized frame buffer.
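As a toy illustration of that model (my own sketch, nothing Apple-specific): each window owns its own pixel buffer, and a compositor copies the visible windows into one display-sized frame buffer, which is what scanout hardware would then read.

```python
# Toy compositor sketch: a frame buffer is just memory holding pixels.
# Each "window" draws into its own buffer; none of them is a display.

WIDTH, HEIGHT = 8, 4  # tiny "display" for illustration

def make_buffer(w, h, fill=0):
    """Allocate a w x h frame buffer (a plain region of memory)."""
    return [[fill] * w for _ in range(h)]

def composite(display, window, x, y):
    """Copy a window's buffer into the display buffer at (x, y),
    clipping to the display bounds. Later copies land "on top"."""
    for row in range(len(window)):
        for col in range(len(window[0])):
            dy, dx = y + row, x + col
            if 0 <= dy < len(display) and 0 <= dx < len(display[0]):
                display[dy][dx] = window[row][col]

display = make_buffer(WIDTH, HEIGHT)
win_a = make_buffer(3, 2, fill=1)   # one window's private buffer
win_b = make_buffer(3, 2, fill=2)   # another window's buffer

composite(display, win_a, 0, 0)
composite(display, win_b, 2, 1)     # drawn later, so it overlaps win_a
```

Only the final `display` buffer would ever be handed to scanout; the per-window buffers never touch a physical display.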

In Apple Silicon, the block responsible for reading a frame buffer and sending that data to a physical display is called the DCP. This is what Hector Martin tweeted about, and labeled as "DISP EXT" in his annotation of the die photo.
 
  • Like
Reactions: wyrdness and leman

leman

macrumors Core
Oct 14, 2008
19,521
19,675
As an example of what we're talking about: people have connected three LG UltraFine 5K displays and a HDMI display to a M1 Max. That's 8 DisplayPort signals. The DisplayPort ports can be seen in the ioreg separated from the displays.

But the two tiles (each a DisplayPort signal) of a dual tile display are mostly independent. Why can't they behave totally independently? As far as I know, each tile uses the same DisplayPort SST signal as a normal single tile display. Maybe there's some timing issue where they need to have the same resolution/timing and need to be in sync.

Now, if you take two displays and modify their EDID so they report themselves as two halves of a single display then they could probably work in macOS as a single display. Two independent displays won't care about having the same timing or being in sync with the other display as long as they accept the timing.

I have to admit that display controllers have always been black magic to me. I have no idea how they work or what they actually do. I mean, scanout is easy enough to understand, but the interaction between the framebuffer, the scaling engine, the display controller, the output bus... too many questions.

If I understand you correctly, what you are suggesting would be easy if all the display engine did was scan a framebuffer region and output the video signal. Then one could arbitrarily split up its capabilities to drive display tiles. But are we sure that's all it does? There might be some subtle frame synchronisation pitfalls to consider. Also, how would it work with framebuffer compression, and what about retina scaling? Could the display controller be responsible for resampling? Apple takes micro-optimizations seriously, that's one of the secrets behind their energy efficiency, so I wouldn't be surprised if their display controllers are much more complex than we might guess. And maybe there is something in there that makes it impossible to treat tiles independently.
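A small aside on the EDID-editing idea quoted above: any such edit also has to keep the EDID block checksum valid, since every 128-byte EDID block must sum to 0 mod 256 or the OS will reject it. A hypothetical sketch (the edited byte and its meaning are made up for illustration):

```python
# Sketch: after editing EDID bytes so a display reports itself as one
# tile of a larger display, recompute the block checksum. Rule: all
# 128 bytes of an EDID block must sum to 0 modulo 256.

def fix_edid_checksum(block: bytes) -> bytes:
    """Return the 128-byte EDID block with its final checksum byte
    recomputed so the whole block sums to 0 mod 256."""
    assert len(block) == 128
    body = block[:127]
    checksum = (-sum(body)) % 256
    return body + bytes([checksum])

# Start from a blank block carrying the standard 8-byte EDID header,
# then "edit" one byte, as one would when changing reported identity.
block = bytearray(128)
block[0:8] = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])
block[54] = 0x42          # hypothetical descriptor edit
patched = fix_edid_checksum(bytes(block))
```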
 

Pressure

macrumors 603
May 30, 2006
5,180
1,544
Denmark
M1/M2 Macs don't support two daisy-chained 4K Thunderbolt displays because they can only drive one display over Thunderbolt.
That's why the M1 and M2 only have USB4 and TB3 support.

USB4 only requires a computer to support one display, and TB3 requires that one display to be 4K.

TB4 requires the computer to support two 4K displays.
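Some back-of-envelope arithmetic on why "one 4K display" is a meaningful bar (standard DisplayPort lane rates; the efficiency figure and blanking omission are my approximations):

```python
# Rough sketch: active pixel data for 4K60 at 8 bits per channel,
# ignoring blanking intervals, vs. the usable payload of a 4-lane
# HBR2 DisplayPort link (5.4 Gbit/s per lane, 8b/10b line coding).

def active_video_gbps(width, height, hz, bits_per_pixel):
    """Raw active-video data rate in Gbit/s (no blanking)."""
    return width * height * hz * bits_per_pixel / 1e9

def dp_payload_gbps(lanes, lane_rate_gbps, coding_efficiency=0.8):
    """Usable link payload; 8b/10b coding leaves ~80% for pixel data."""
    return lanes * lane_rate_gbps * coding_efficiency

uhd60 = active_video_gbps(3840, 2160, 60, 24)   # ~11.9 Gbit/s
hbr2_x4 = dp_payload_gbps(4, 5.4)               # ~17.28 Gbit/s

fits = uhd60 < hbr2_x4   # one 4K60 stream fits on one HBR2 x4 link
```

A second 4K60 stream would push past that budget, which is roughly why "two 4K displays" is a distinct, harder requirement.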
 

MajorFubar

macrumors 68020
Oct 27, 2021
2,174
3,825
Lancashire UK
I watch The Everyday Dad's YT channel and while he's overall very complimentary of the M2 Air, he's pretty scathing that it still can't drive more than one external display and considers that a critical failing. While I agree with nearly all the conclusions he comes to in his often well-reasoned verdicts on new hardware, I've never been able to agree that being unable to support two external monitors is a critical failure on what is Apple's cheapest laptop line (Air).

This seems to be the same as those people complaining that it struggles under the weight of a browser with 200 tabs open and half their apps running at once: this computer is not for you; either change your expectations or don't buy the cheapest machine.

Thing is, the laptop can obviously already drive two monitors: the laptop screen + one external display. Maybe Apple could have placated the needs of many complainers by allowing it to support either two external monitors or the laptop screen + one external monitor. Disclaimer: I have zero understanding of the technicalities to even know if this would be possible, but two displays is two displays in my simplistic brain.
 
Last edited:
  • Like
Reactions: Tagbert

theluggage

macrumors G3
Jul 29, 2011
8,011
8,444
That's not how any of this works. GPU cores don't directly interact with displays. They're just computation engines.
OTOH there may be no direct correspondence between GPU cores and the number of displays, but more displays mean more pixels to render into more frame buffers. Even running an extra 4K screen adds a lot of extra pixels, and many people run 4K screens in scaled mode (so everything is rendered internally at 5K and then downscaled by the GPU) - so if you want to run multiple high-res screens smoothly, and maybe render 3D or high-def video on them, you should maybe be looking at a SoC with a larger GPU anyway.
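To put my own numbers on that scaled-mode cost (standard resolutions, my arithmetic):

```python
# A 4K display run at a "looks like 2560x1440" scaled mode is rendered
# internally at 5120x2880, then downscaled to the panel's 3840x2160.

def megapixels(w, h):
    return w * h / 1e6

native_4k = megapixels(3840, 2160)       # ~8.3 MP actually sent out
scaled_render = megapixels(5120, 2880)   # ~14.7 MP rendered per frame

# The GPU renders ~78% more pixels than the display ever shows.
extra_factor = scaled_render / native_4k
```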

I'm not necessarily suggesting that Apple disabled existing extra display support to protect us from laggy displays but - as several people pointed out - this decision was probably made when the M1/M2 was on the drawing board, and how a GPU would perform with that many displays could/should have been a factor in that decision.

If you just need lots of mainly-static text displays, there's always DisplayLink etc.

Maybe Apple could have placated the needs of many complainers by allowing it to support either two external monitors or the laptop screen + one external monitor. Disclaimer: I have zero understanding of the technicalities to even know if this would be possible, but two displays is two displays in my simplistic brain.
I don't think you need to get too bogged down in technicalities to guess that "more features" = "more transistors, more space, more power consumption, more heat". There's a saying that "a good designer knows when to reject good ideas" - the M1/M2 design process probably involved 101 decisions along the lines of "is this feature worth adding x more transistors and extra connections between modules?"

The underlying problem here is that the M1/M2 "overperforms" for its primary target market of passively cooled ultraportables and tablets, but doesn't really have the connectivity or RAM capacity for some of the more demanding workflows that the raw processing grunt enables.
 
  • Like
Reactions: wyrdness

joevt

macrumors 604
Jun 21, 2012
6,966
4,259
Also, how would it work with framebuffer compression, and what about retina scaling?
framebuffer compression - you mean DSC? That would be handled by whatever produces the individual DisplayPort signals.

retina and scaling are two different things. Retina is just drawing everything twice as wide and tall. It is the difference between drawing a 5 pixel line and a 10 pixel line. No scaling there unless you're drawing an image but in that case there can be scaling of the image whether retina is used or not.

Scaling is something that happens when the framebuffer size differs from the output signal size. For example, a 1440p HiDPI mode is 5K and is scaled to 4K output for a 4K display. I don't know if it scales every frame or if it's scaled to a separate 4K frame buffer only when the original 5K framebuffer changes - I'm guessing the former.
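A toy sketch of that scaling step (my illustration; nearest-neighbor for brevity, where real hardware would filter): each output pixel is sampled from the larger source framebuffer.

```python
# The framebuffer size (e.g. 5120x2880 for a 1440p HiDPI mode) differs
# from the output timing (3840x2160), so scaling maps output pixels
# back to source pixels. Same 4:3 shrink here, on a tiny grid.

def downscale(src, out_w, out_h):
    """Nearest-neighbor resample of a 2-D pixel grid to out_w x out_h."""
    src_h, src_w = len(src), len(src[0])
    return [
        [src[y * src_h // out_h][x * src_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

# 8x8 "framebuffer" with distinct pixel values, scaled to 6x6 "output"
src = [[x + y * 8 for x in range(8)] for y in range(8)]
out = downscale(src, 6, 6)
```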

Could the display controller be responsible for resampling? Apple takes micro-optimizations seriously, that's one of the secrets behind their energy efficiency, so I wouldn't be surprised if their display controllers are much more complex than we might guess. And maybe there is something in there that makes it impossible to treat tiles independently.
Something on the chip has to do the scaling.

That's why the M1 and M2 only have USB4 and TB3 support.

USB4 only requires a computer to support one display, and TB3 requires that one display to be 4K.

TB4 requires the computer to support two 4K displays.
Right. The M1 and M2 Thunderbolt ports do everything required by TB4 except two 4K displays. Their Thunderbolt ports can support two DisplayPort connections, but the second DisplayPort connection is only ever used by dual tile displays. It might be interesting to create a fake dual tile display 7680x2160 using two 4K displays, but macOS maybe doesn't handle arbitrary dual tile displays - the list of supported dual tile displays may be hard coded and adding a new tiled display may require adding a file to the System Displays Overrides folder that is not easily modifiable by the user.
 

Larabee119

Suspended
Sep 16, 2014
225
386
What do you do that requires 3 external monitors?
Nothing truly requires 3 monitors, it's just me trying to make my life easier.
Job 1: Lightroom / Capture One: one monitor for thumbnails, one for full-screen editing, one to display Slack, chat, emails, and the color books.
Job 2: stock trader; 3 monitors aren't even enough.
Job 3: logistics. Displaying tons of information: spreadsheets, routes, CRM, etc. Not enough screen real estate even with 3 external monitors and the laptop display.

And I do all 3 jobs during the day and night.
 
  • Like
Reactions: WP31

leman

macrumors Core
Oct 14, 2008
19,521
19,675
framebuffer compression - you mean DSC? That would be handled by whatever produces the individual DisplayPort signals.

No I mean the compression that the GPU/memory controller does when rendering content.

retina and scaling are two different things. Retina is just drawing everything twice as wide and tall. It is the difference between drawing a 5 pixel line and a 10 pixel line. No scaling there unless you're drawing an image but in that case there can be scaling of the image whether retina is used or not.

Retina rendering usually involves resampling of the rendered 2x2 image to match the native resolution of the screen. They don't always match, e.g. you could be using a scaled resolution. You can of course use the GPU to resample the texture, but knowing Apple they probably have a dedicated hardware solution for that.
 

joevt

macrumors 604
Jun 21, 2012
6,966
4,259
No I mean the compression that the GPU/memory controller does when rendering content.
Do you mean video compression / decompression?
compression is for creating a video file.
decompression is for outputting a video file to display.

Retina rendering usually involves resampling of the rendered 2x2 image to match the native resolution of the screen. They don't always match, e.g. you could be using a scaled resolution.
That's the scaling step. It happens for retina and non-retina modes.

The step before that (drawing to framebuffer) is where retina happens.

You can of course use the GPU to resample the texture, but knowing Apple they probably have a dedicated hardware solution for that.
GPU is hardware but I get what you mean. In either case, software needs to set up the scaling/resampling.
 

Tyler O'Bannon

macrumors 6502a
Nov 23, 2019
886
1,497
As an example of what we're talking about: people have connected three LG UltraFine 5K displays and a HDMI display to a M1 Max. That's 8 DisplayPort signals. The DisplayPort ports can be seen in the ioreg separated from the displays.

But the two tiles (each a DisplayPort signal) of a dual tile display are mostly independent. Why can't they behave totally independently? As far as I know, each tile uses the same DisplayPort SST signal as a normal single tile display. Maybe there's some timing issue where they need to have the same resolution/timing and need to be in sync.

Now, if you take two displays and modify their EDID so they report themselves as two halves of a single display then they could probably work in macOS as a single display. Two independent displays won't care about having the same timing or being in sync with the other display as long as they accept the timing.

You could maybe do the same in Linux, but add another layer so the two halves appear as separate displays again.


A Thunderbolt port of an M1/M2 Mac can output two DisplayPort signals but the second DisplayPort signal can only be used by a dual tiled display such as the LG UltraFine 5K or the Dell UP2715K or the LG 5K2K display.
The Dell UP3218K is also a dual tile display but I don't think Apple allows 8K for Apple Silicon. Apple added support for 8K60 for the Dell UP3218K in macOS Ventura but only for Mac Pro 2019 (unless you use a patch for other Intel Macs).

M1/M2 Macs don't support two daisy-chained 4K Thunderbolt displays because they can only support one display over Thunderbolt. You need an M1 Pro or M1 Max for more displays, or use DisplayLink.

macOS doesn't support MST for multiple displays but does support MST for old 4K60 dual tile displays that used a stream for each 1920x2160 half of the display. Apple Silicon Macs don't support those old 4K60 dual tile displays.

macOS supports other features of MST though:
- convert fast and narrow DisplayPort to slow and wide DisplayPort. For example, the CalDigit SOHO can convert HBR3 x2 with DSC to HBR2 x4, with or without DSC decompression. Except the CalDigit SOHO can't do 10bpc with DSC (which makes it unable to do 4K60 10bpc RGB), and macOS usually disables DSC, except in Catalina where it was enabled by default, or for Apple's displays that support DSC (Apple Studio Display and Apple Pro Display XDR). Apple has a USB-C to HDMI 2.0 dongle that can also convert HBR3 x2 with DSC to HBR2 x4 with HDMI 2.0, but I think DSC is not enabled by default after Catalina.
- MST can mirror a DisplayPort signal to multiple displays.
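The "fast and narrow" vs "slow and wide" trade quoted above can be put in numbers (standard DisplayPort lane rates; the coding-efficiency figure is my approximation):

```python
# HBR3 runs 8.1 Gbit/s per lane, HBR2 runs 5.4 Gbit/s per lane; both
# use 8b/10b coding, so roughly 80% of the raw rate carries payload.

def dp_payload_gbps(lanes, lane_rate_gbps, efficiency=0.8):
    """Approximate usable payload of a DisplayPort link in Gbit/s."""
    return lanes * lane_rate_gbps * efficiency

hbr3_x2 = dp_payload_gbps(2, 8.1)   # ~12.96 Gbit/s: fast but narrow
hbr2_x4 = dp_payload_gbps(4, 5.4)   # ~17.28 Gbit/s: slow but wide

# The wide link actually carries MORE payload, which is why a branch
# device can decompress a DSC stream from the narrow link onto it.
wide_wins = hbr2_x4 > hbr3_x2
```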
Ahhhhh, thank you for the explanation.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
I have the impression we are talking past each other :D

Do you mean video compression / decompression?
compression is for creating a video file.
decompression is for outputting a video file to display.

Apple GPUs seamlessly compress framebuffer data to optimise memory bandwidth. When you are drawing anything using the GPU, the resulting image is compressed in the background. They also compress data buffers used for GPU compute etc.

That's the scaling step. It happens for retina and non-retina modes.

The step before that (drawing to framebuffer) is where retina happens.

Yeah, I am talking about matching the (retina) framebuffer data to the output resolution. There has to be some resampling, but is it done by the GPU or by the display controller?

GPU is hardware but I get what you mean. In either case, software needs to set up the scaling/resampling.

Why? And what exactly do you mean by "software"? They could have a shader that resamples the texture using the GPU texturing unit. Or they could have a dedicated hardware unit doing linear sampling in the display controller. I have no idea. But there are arguments for doing resampling in the display controller as it frees up the bandwidth and the expensive (in terms of power consumption) GPU time.
 

JPack

macrumors G5
Mar 27, 2017
13,542
26,164
Snapdragon 8c supports dual 4K external monitors. Does Qualcomm not care about power consumption?

It also has half the transistors compared to M2.

These guys are just working backwards to come up with an excuse for Apple.
 

jdb8167

macrumors 601
Nov 17, 2008
4,859
4,599
Snapdragon 8c supports dual 4K external monitors. Does Qualcomm not care about power consumption?

It also has half the transistors compared to M2.

These guys are just working backwards to come up with an excuse for Apple.
Everything is a trade off. If they are using the transistor budget for 2 display controllers then they don’t have those transistors for something else. I personally have no use for ProRes but clearly Apple marketing thinks it’s a selling point.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
Snapdragon 8c supports dual 4K external monitors. Does Qualcomm not care about power consumption?
The interesting thing is that the brand new Snapdragon 8 Gen 1 only mentions one external 4K display.
It also has half the transistors compared to M2.

And is what, 4x slower and only has a fraction of cache? That’s a weird comparison to make.
 
  • Like
Reactions: jdb8167

JPack

macrumors G5
Mar 27, 2017
13,542
26,164
The interesting thing is that the brand new Snapdragon 8 Gen 1 only mentions one external 4K display.


And is what, 4x slower and only has a fraction of cache? That’s a weird comparison to make.

Not really. Snapdragon 8 Gen 1 is a smartphone processor so it doesn't make sense to support more than 1 display. Snapdragon 8c and 8cx are for computing and support dual external displays.

Those processors have half the transistor count and offer about 65% the GB5 score compared to M2.
 

JPack

macrumors G5
Mar 27, 2017
13,542
26,164
Everything is a trade off. If they are using the transistor budget for 2 display controllers then they don’t have those transistors for something else. I personally have no use for ProRes but clearly Apple marketing thinks it’s a selling point.

Sure, but I think the question is, "Did Apple make the right trade off?"

With M2 vs. M1, there are 25% more transistors and the floor area is 30% larger. Are there more people editing in ProRes than driving two FHD monitors? The regular iPhone 13 can't even record ProRes.
 

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,626
5,482
Nothing truly requires 3 monitors, it's just me trying to make my life easier.
Job 1: Lightroom / Capture One: one monitor for thumbnails, one for full-screen editing, one to display Slack, chat, emails, and the color books.
Job 2: stock trader; 3 monitors aren't even enough.
Job 3: logistics. Displaying tons of information: spreadsheets, routes, CRM, etc. Not enough screen real estate even with 3 external monitors and the laptop display.

And I do all 3 jobs during the day and night.
Wouldn't a MacBook Pro 14" be better for you?
 

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,626
5,482
Not really. Snapdragon 8 Gen 1 is a smartphone processor so it doesn't make sense to support more than 1 display. Snapdragon 8c and 8cx are for computing and support dual external displays.

Those processors have half the transistor count and offer about 65% the GB5 score compared to M2.
One can argue that the M2 is an iPad Air SoC so it doesn't make sense to support more than one display. 🌝
 

JPack

macrumors G5
Mar 27, 2017
13,542
26,164
One can argue that the M2 is an iPad Air SoC so it doesn't make sense to support more than one display. 🌝

It would be like saying A12 is an Apple TV SoC so it doesn’t make sense to support more than 8GB RAM. 😐

Johny Srouji literally said M2 was “Second generation Apple Silicon designed specifically for the Mac.”
 

satcomer

Suspended
Feb 19, 2008
9,115
1,977
The Finger Lakes Region
This limitation is only on the MacBook Air, the lowest Mac laptop in the lineup, and you want to run multiple displays with a base Mac that was made for basic users, a kid's first Mac laptop? No, it's not a Pro machine and never was a Pro laptop! So please stop complaining that a base Mac can't support more than one display!
 

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,626
5,482
Johny Srouji literally said M2 was “Second generation Apple Silicon designed specifically for the Mac.”
That's just marketing. It's literally a SoC that is designed for low-end Macs and iPads.

I think we can clearly piece together the reasons why the M2 only supports one external monitor:

  • Apple dedicates more transistors to display controllers than Intel and AMD do, for efficiency reasons
  • Apple found that very few people used more than one external monitor on low-end Macs. (My guess is less than 1%)
  • Apple decided to use the die space for more powerful CPU/GPU/Accelerators than to support two external monitors, which very few people use on low-end Macs and no one uses on iPads

 

JPack

macrumors G5
Mar 27, 2017
13,542
26,164
That's just marketing. It's literally a SoC that is designed for low-end Macs and iPads.

I think we can clearly piece together the reasons why the M2 only supports one external monitor:

  • Apple dedicates more transistors to display controllers than Intel and AMD do, for efficiency reasons
  • Apple found that very few people used more than one external monitor on low-end Macs. (My guess is less than 1%)
  • Apple decided to use the die space for more powerful CPU/GPU/Accelerators than to support two external monitors, which very few people use on low-end Macs and no one uses on iPads

That doesn't make any sense because MacBook Air 2018-2020 supports 2 external displays.

MacBook Air 2017 and older supported one external display even though the Intel CPU supports two. Clearly, Apple thought it was important to add the supporting hardware for dual display in 2018.

Nothing will ever get past the fact that Apple supported one external display, then two, then back to one.
 

joevt

macrumors 604
Jun 21, 2012
6,966
4,259
Apple GPUs seamlessly compress framebuffer data to optimise memory bandwidth. When you are drawing anything using the GPU, the resulting image is compressed in the background. They also compress data buffers used for GPU compute etc.
That's amazing.

Yeah, I am talking about matching the (retina) framebuffer data to the output resolution. There has to be some resampling, but is it done by the GPU or by the display controller?
I think GPU. There's probably a transformation matrix to describe scaling and also rotation. A texture from the source framebuffer is transformed by the matrix into an output framebuffer. Or maybe there's a more efficient way to do this. The output framebuffer also needs to be set up to include black bars or pillars when the aspect ratio of the source doesn't match the aspect ratio of the output.
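A hypothetical sketch of that aspect-ratio fitting (names and approach are mine, not macOS's): scale the source to fit the output while preserving aspect ratio, and compute where the black bars go.

```python
# Fit a source framebuffer into an output framebuffer, centered, with
# letterbox (horizontal bars) or pillarbox (vertical bars) as needed.

def fit_rect(src_w, src_h, out_w, out_h):
    """Return (x, y, w, h) of the scaled image inside the output;
    the remaining area would be filled with black bars."""
    scale = min(out_w / src_w, out_h / src_h)
    w, h = round(src_w * scale), round(src_h * scale)
    return ((out_w - w) // 2, (out_h - h) // 2, w, h)

# 16:9 source on a 16:10 panel: bars appear above and below
rect = fit_rect(1920, 1080, 1920, 1200)
```

Whether the GPU's texture units or a dedicated block in the display controller applies this transform is exactly the open question in the thread; the geometry is the same either way.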

Why? And what exactly do you mean by "software"?
Software needs to set up the transformation matrix based on the selected display mode and rotation setting. The GPU/whatever does the work.

They could have a shader that resamples the texture using the GPU texturing unit. Or they could have a dedicated hardware unit doing linear sampling in the display controller. I have no idea. But there are arguments for doing resampling in the display controller as it frees up the bandwidth and the expensive (in terms of power consumption) GPU time.
Yes.
 