
GMShadow

macrumors 68020
Jun 8, 2021
2,122
8,655
Snapdragon 8c and 8cx support dual 4K external. X Elite will support dual 5K.

Current Intel mobile chips support single 8K60 or quad 4K60.

Let's say Qualcomm's and Intel's display controllers use more power and the M3's is extremely low. This feels like a butterfly keyboard situation, where the M3 is unreasonably skewed toward power consumption. The feature mix feels wrong. Most MacBook Air consumers aren't going to spend $3,000 for a single 6K monitor or use it to run games with ray tracing.
If we’re being honest, most MBA customers aren’t connecting it to an external monitor at all.
 

leman

macrumors Core
Oct 14, 2008
19,517
19,664
Why should power consumption be less relevant for external displays? Saving power is saving power. E.g., my M2 MBP drives three 4K external displays plus the internal display whenever I am working. Every bit of added efficiency is, IMO, a good thing.

Because you are usually connected to power when running external displays, especially since most displays nowadays are also power supplies. Note that the power savings we are talking about are merely 0.5-1.5 watts: a huge difference when running on battery, but not a lot compared to peak system power draw.
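
To put those watts in perspective, here's a rough sketch. The 52.6 Wh figure is the M2 Air's battery spec; the ~7 W average light-use system draw is my assumption, not a measurement:

Code:
# Battery-life impact of an extra 0.5-1.5 W of display-engine draw.
# Assumptions: 52.6 Wh pack (M2 Air spec), ~7 W average light-use draw.
battery_wh = 52.6
base_draw_w = 7.0

for extra_w in (0.5, 1.5):
    hours_base = battery_wh / base_draw_w
    hours_extra = battery_wh / (base_draw_w + extra_w)
    print(f"+{extra_w} W: {hours_base:.1f} h -> {hours_extra:.1f} h "
          f"(~{hours_base - hours_extra:.1f} h lost)")

On those assumptions, 1.5 W costs over an hour of runtime on battery, while being noise next to a plugged-in system's peak draw.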
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,059
If we’re being honest, most MBA customers aren’t connecting it to an external monitor at all.
True. But once you become someone whose work does require an external (e.g., someone who uses spreadsheets), it's not uncommon to need more than one. And that fraction of external-display users is precluded from getting the Air, even if it otherwise meets their needs.
 
  • Like
Reactions: MRMSFC

Chancha

macrumors 68020
Mar 19, 2014
2,307
2,134
True. But once you become someone whose work does require an external (e.g., someone who uses spreadsheets), it's not uncommon to need more than one. And that fraction of external-display users is precluded from getting the Air, even if it otherwise meets their needs.
The keyword being "fraction". Also remember that the same chip goes into the iPad, where you are not going to be able to attach a second external display without a TB dock even if the chip allowed it. Apple may be guilty of many things, but limiting the base M chip to two display buffers is about the closest to a right call, marketing-wise.
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,059
The keyword being "fraction". Also remember that the same chip goes into the iPad, where you are not going to be able to attach a second external display without a TB dock even if the chip allowed it. Apple may be guilty of many things, but limiting the base M chip to two display buffers is about the closest to a right call, marketing-wise.
Well, fractions can be large ;).

I'm wondering about business use. I recall one poster who said his boss was thinking of converting to Macs when AS was released, but their budget only allowed for Airs, which would have worked great, except all their workstations use dual monitors. And dual monitors are pretty common in many offices these days.

Note also that the M-chips only go into the iPad Air and iPad Pro. The iPad and iPad Mini use A-chips.
 
Last edited:

Pressure

macrumors 603
May 30, 2006
5,178
1,544
Denmark
Looking more closely at the die shots, there is another mystery in them: The base M3 only has one display controller, despite having to support two displays (one internal + one external). Furthermore, the M3 Pro only has two controllers and can support three displays (internal + two external), and the M3 Max has four display controllers for five displays (four external + internal).

Assuming these are annotated correctly (and they might not be), it appears that some of these display controllers might in fact be able to drive more than one display, but not all of them are. It's unclear whether there are actually differences between the display controllers, or whether this is an artificial limitation.

We can't necessarily jump to the conclusion that it's an artificial limitation either, since Apple is still adding one additional display controller for each additional display beyond the standard two (one internal + one external) handled by the first display controller. It's more likely that the internal display itself is driven by a separate display controller somewhere on the die that isn't annotated.
That's not true though.

He clearly mentions in the video that the area contains two display engines despite only labelling it as a (singular) display engine.

The revised annotated shot of the base M3 looks like this from his video.

[Attached image: revised annotated M3 die shot]


 

leman

macrumors Core
Oct 14, 2008
19,517
19,664
By the way, what really shocks me is the size of the Thunderbolt controllers. They are crazy large.
 

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
How much RAM can they realistically have included? Could it be embedded DRAM instead of SRAM to save space? A 6K frame buffer is around 60MB. If they have 1MB they'd have to wake the system RAM 60*60 times per second. Is that worth it? Though if you're running a 6K display you're probably not on battery power.
It should be possible to make a rough estimate based on comparing memory in the display engines with other memories of known size (such as CPU caches). (Not me, though; I don't have the time, and I also know how error-prone simplistic area estimates can be. In many cases, designers choose different SRAM cells for different memory arrays, based on other requirements, and this means density (bits/mm^2) can change quite a bit between two different memories on the same chip.)

It might be more than 1MB. Those things are huge - each display controller uses the same die area as two P CPU cores.

I think it's unlikely to be eDRAM. That has mostly been an IBM technology. There are probably eDRAM macros available for some TSMC process nodes, but eDRAM is costly. You always need extra process steps, and sometimes they're a bit exotic. IBM's eDRAM is based on etching deep trench capacitors below the "base layer" (where the transistors live), while other kinds of eDRAM build the capacitor in extra metal layers.
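
For reference, the arithmetic behind the quoted 60 MB and "60*60 wakes" figures works out like this. The panel geometry is the 6K XDR's; the bytes-per-pixel values are generic packed formats, not necessarily what the display engine uses internally:

Code:
# Frame-buffer size for a 6016x3384 (6K) panel, and how often a
# hypothetical 1 MB on-chip buffer would need refilling from DRAM
# during 60 Hz scan-out.
width, height, refresh_hz = 6016, 3384, 60
pixels = width * height

for label, bytes_per_px in (("8 bpc packed", 3), ("10 bpc (10:10:10:2)", 4)):
    print(f"{label}: {pixels * bytes_per_px / 1e6:.0f} MB per frame")

buffer_mb = 1.0
frame_mb = pixels * 3 / 1e6
refills_per_frame = frame_mb / buffer_mb
print(f"~{refills_per_frame:.0f} refills per frame, "
      f"~{refills_per_frame * refresh_hz:.0f} per second")

That's roughly 61-81 MB per frame and, with a 1 MB buffer, on the order of 3,600 DRAM refills per second.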
 
  • Like
Reactions: Analog Kid

EmotionalSnow

macrumors 6502
Nov 1, 2019
369
1,351
Linz, Austria
Because you are usually connected to power when running external displays, especially since most displays nowadays are also power supplies. Note that the power savings we are talking about are merely 0.5-1.5 watts: a huge difference when running on battery, but not a lot compared to peak system power draw.
1) less power draw while charging means faster charging
2) less power draw in general means less $$$ wasted and less carbon spit into the environment

and honestly, I'd even challenge the assumption that you're usually connected to power when connected to an external display — lots of people aren't. nobody wants their MacBook to waste 40% of battery during a presentation, or maybe even die; and you don't always have the ability to charge during one.
 

Nugat Trailers

macrumors 6502
Dec 23, 2021
297
576
Well, it's 0.5 to 1.5 watts; given that Apple boasts about the best performance per watt, they'd probably want to shave off everything that's not needed.
 

leman

macrumors Core
Oct 14, 2008
19,517
19,664
All fair points. But I think the bottom line is that most of this stuff should probably be split off onto a separate N5 die; it would reduce costs and/or free up the space for more interesting stuff, such as more CPU and GPU cores.
 
  • Like
  • Sad
Reactions: altaic and Gudi

falainber

macrumors 68040
Mar 16, 2016
3,539
4,136
Wild West
Well, it's 0.5 to 1.5 watts; given that Apple boasts about the best performance per watt, they'd probably want to shave off everything that's not needed.
Nobody else does it. Why? Because nobody else uses the same design for phones and computers. This extra power consumption, when connected to multiple external monitors, is just a rounding error compared to the monitors' power consumption. Besides, people who want to be pedantic about the goodness of lower power consumption in general then have to account for the extra power spent on the extra silicon during manufacturing of the larger chips.
 

dgdosen

macrumors 68030
Dec 13, 2003
2,817
1,463
Seattle
That's not true though.

He clearly mentions in the video that the area contains two display engines despite only labelling it as a (singular) display engine.

The revised annotated shot of the base M3 looks like this from his video.

[Attached image: revised annotated M3 die shot]

The thing that I'm left wondering about from this video is the mention of "dark silicon" or the "seemingly unused" portions of silicon on the chip. That seems expensive in a rising-wafer-cost world.

Does anyone have an idea there? In my layman's thinking, if a company ever started stacking dies (vs. using chiplets), they'd need some kind of design for an elevator shaft between floors - not for moving people between floors, but bits.
 

ArkSingularity

macrumors 6502a
Original poster
Mar 5, 2022
928
1,130
The thing that I'm left wondering about from this video is the mention of "dark silicon" or the "seemingly unused" portions of silicon on the chip. That seems expensive in a rising-wafer-cost world.

Does anyone have an idea there? In my layman's thinking, if a company ever started stacking dies (vs. using chiplets), they'd need some kind of design for an elevator shaft between floors - not for moving people between floors, but bits.
It appears that there's a decent chunk of SRAM in the unlabeled portions of silicon. There are probably all kinds of things in there.
 

uller6

macrumors 65816
May 14, 2010
1,072
1,777
The thing that I'm left wondering about from this video is the mention of "dark silicon" or the "seemingly unused" portions of silicon on the chip. That seems expensive in a rising-wafer-cost world.

Does anyone have an idea there? In my layman's thinking, if a company ever started stacking dies (vs. using chiplets), they'd need some kind of design for an elevator shaft between floors - not for moving people between floors, but bits.
These “elevators” are called through-silicon vias, or TSVs. You are correct that these are used for stacking dies: flash, DRAM, and I think the iPhone already mates a DRAM die on top of the CPU die.

Dark silicon has been a major issue for nearly 20 years already; we hit the wall of Dennard scaling in the mid-2000s. But that's what brought us heterogeneous computing accelerators on-die, like dedicated ProRes or H.264 decode hardware.

 

Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
The very large display engines on Apple Silicon seem to explain why the base level M1/M2/M3 chips only support one external monitor, but I'm curious as to why the display engines are so much larger on Apple Silicon.
3840×2160 = 8,294,400 px (100%) 4K UHD
4480×2520 = 11,289,600 px (136.1%) 4.5K iMac
5120×2880 = 14,745,600 px (177.8%) 5K ASD
6016×3384 = 20,358,144 px (245.4%) 6K XDR
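
(The percentages are easy to reproduce, taking 4K UHD as the 100% baseline:)

Code:
# Pixel counts relative to 4K UHD (3840x2160).
panels = {
    "4K UHD": (3840, 2160),
    "4.5K iMac": (4480, 2520),
    "5K ASD": (5120, 2880),
    "6K XDR": (6016, 3384),
}
base = 3840 * 2160
for name, (w, h) in panels.items():
    px = w * h
    print(f"{name}: {px:,} px ({px / base:.1%})")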

Do those Intel display engines support a 6K display at 60 Hz?
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,059
That's not true though.

He clearly mentions in the video that the area contains two display engines despite only labelling it as a (singular) display engine.

The revised annotated shot of the base M3 looks like this from his video.

[Attached image: revised annotated M3 die shot]

Except what he clearly mentions seems wrong. He's saying the Base, Pro, and Max contain 2, 4, and 8 display engines, respectively.

Yet how can the Pro have 4 display engines, when it can only drive 3 monitors? And how can the Max have 8 display engines, when it can only drive 5 monitors?

The more logical explanation is that each of the areas he has labeled is an individual large display engine (the kind that can drive the 6K externals), and the small display engine on each chip is unlabeled. Only then do the numbers work. I.e.:

Base: 1 labeled engine; can drive 1 x 6k and 1 x 5k
Pro: 2 labeled engines; can drive 2 x 6k and 1 x 4k
Max: 4 labeled engines; can drive (3 x 6k + 1 x 4k)* and 1 x 4k
*The reason the M3 Max MBP can't do 4 x 6k is that it has only 3 TB ports. But if you look at the M2 Studio, which has 4 TB ports, you'll see it can do 4 x 6k and 1 x 4k
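
A quick way to see that the numbers line up under this reading (the engine counts are my reading of the die shot; the display limits are Apple's published specs):

Code:
# Hypothesis: labeled large engines map 1:1 to 6K-capable external
# pipes, plus one small unlabeled engine for the internal panel.
chips = {  # chip: (labeled_large_engines, max_displays_incl_internal)
    "M3": (1, 2),
    "M3 Pro": (2, 3),
    "M3 Max": (4, 5),
}
for chip, (large, total) in chips.items():
    assert large + 1 == total  # +1 = the small internal-display engine
    print(f"{chip}: {large} large + 1 small = {total} displays")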

[Attached image]
 
Last edited:

theorist9

macrumors 68040
May 28, 2015
3,880
3,059
and honestly, I'd even challenge the assumption that you're usually connected to power when connected to an external display — lots of people aren't. nobody wants their MacBook to waste 40% of battery during a presentation, or maybe even die; and you don't always have the ability to charge during one.
No one argues that efficiency becomes unimportant when you're plugged in. But you're missing the point here, which is that efficiency becomes much more important when you're not (because then you have the additional concern of battery life, which is not present when plugged in).

and honestly, I'd even challenge the assumption that you're usually connected to power when connected to an external display — lots of people aren't. nobody wants their MacBook to waste 40% of battery during a presentation, or maybe even die; and you don't always have the ability to charge during one.
This doesn't make sense. When you are giving presentations you **are** connected to an external display (the room's main display) and, as you say, you would also likely be connected to power (or at least can easily plug in if needed). Thus you are **agreeing** with the idea that if you're connected to an external you are probably also connected to power (or have access to it), not challenging it.
 
Last edited:

Eric_WVGG

macrumors 6502
Oct 25, 2016
389
747
gentrification fallout zone
My first real "holy **** I'm in the future" moment with Apple Silicon was when I started futzing with the resolution scaling, and then plugged my laptop into an external display.

None of the bouncing around or flickering like on Intel and even PowerPC Macs, just seamless changes. Dang.

If any of that has to do with Apple's oversized display controllers — and again, this is a thing we've been living with for decades — that is absolutely a valid trade-off against running four displays on a flippin' $999 MacBook Air.
 

Confused-User

macrumors 6502a
Oct 14, 2014
850
984
I'm relatively unfamiliar with display hardware at a low level, so I have some questions...

At the moment, it would be prohibitively expensive to buffer an entire frame on-chip. A 6k monitor would require minimally 60MB (more likely 80MB, allowing for 10bpc). This isn't going to change in the next 5 years due to process technology; SRAM scaling is effectively dead, and eDRAM is not likely to be used. However, the industry is moving towards chiplets and complex packaging, so that leads me to wonder if it might not be feasible over time, even for the base Mx chips, to stack enough RAM on top of the base chip to hold entire frame buffers, much like AMD is stacking cache RAM on their X3D CPUs.

Is this likely to be a practical possibility in a few years? ISTM that this would dramatically lower DRAM utilization in the idle case, and would be a not-insubstantial reduction in bandwidth use in all cases (~5 GB/s for a 6K60 monitor).
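
That ~5 GB/s figure is just scan-out arithmetic, assuming 4 bytes/pixel for 10 bpc; Apple's actual internal format and any framebuffer compression are unknown to me:

Code:
# DRAM read bandwidth to scan out one 6K display at 60 Hz, uncompressed.
w, h, hz, bytes_per_px = 6016, 3384, 60, 4
print(f"~{w * h * bytes_per_px * hz / 1e9:.1f} GB/s per 6K60 stream")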

Of course if you're going to do this you might want to move the entire controller off the base chip anyway, which might open up other possibilities.

ISTM that Apple has a big advantage over other architectures if they wanted to do this, because they'd have an easier time knowing when they'd need to copy from system RAM into the display buffer. Assuming they don't make the stacked-chiplet buffer the only buffer - I'm not clear on whether double-buffering would be better or worse here. (I mean, if the CPU or GPU is writing to the display, why even bother writing to main memory, if you have this buffer handy on-chip?)

As you can see I'm well out of my comfort zone here, so if this is utterly stupid feel free to say so, though I'd prefer to know why.
 
  • Like
Reactions: ArkSingularity

pgolik

macrumors member
Sep 13, 2011
67
49
Because you are usually connected to power when running external displays.
Unless you’re giving presentations using a projector. When I finally upgraded from Intel to an M3 Pro, that was the first thing I noticed: how incredibly efficient AS is with an external display. After 6 h of lectures the battery was at 65%; on Intel I had to look for a power socket before 2 hours were up.
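
For what it's worth, that implies a very low average draw. The pack size here is my guess at a 14-inch MBP-class battery; the rest comes from the numbers above:

Code:
# Average draw implied by 6 h of presenting that ends at 65% battery.
battery_wh = 72.4     # assumed 14" MacBook Pro pack size
used_fraction = 0.35  # 100% -> 65%
hours = 6.0
print(f"~{battery_wh * used_fraction / hours:.1f} W average system draw")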
 

AlastorKatriona

Suspended
Nov 3, 2023
559
1,029
Snapdragon 8c and 8cx support dual 4K external. X Elite will support dual 5K.

Current Intel mobile chips support single 8K60 or quad 4K60.

Let's say Qualcomm's and Intel's display controllers use more power and the M3's is extremely low. This feels like a butterfly keyboard situation, where the M3 is unreasonably skewed toward power consumption. The feature mix feels wrong. Most MacBook Air consumers aren't going to spend $3,000 for a single 6K monitor or use it to run games with ray tracing.
The feature mix seems exactly right to me. I highly doubt people buying base M-series chips need external monitors at all, let alone more than one. Battery life, on the other hand, is exactly what they need, because the machine is not at a desk plugged in to monitors (and power).
 
  • Like
Reactions: the future and Gudi

ArkSingularity

macrumors 6502a
Original poster
Mar 5, 2022
928
1,130
This doesn't make sense. When you are giving presentations you **are** connected to an external display (the room's main display) and, as you say, you would also likely be connected to power. Thus you are **agreeing** with the idea that if you're connected to an external you are probably also connected to power, not challenging it.
I mean, it's always nice to be able to skip bringing the charger if you're bringing a laptop around a lot. Constantly having to charge was definitely an added inconvenience for me in college.
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,059
Unless you’re giving presentations using a projector. When I finally upgraded from Intel to an M3 Pro, that was the first thing I noticed: how incredibly efficient AS is with an external display. After 6 h of lectures the battery was at 65%; on Intel I had to look for a power socket before 2 hours were up.
I mean, it's always nice to be able to skip bringing the charger if you're bringing a laptop around a lot. Constantly having to charge was definitely an added inconvenience for me in college.
The key point is not whether or not you are connected to power when using externals, but whether you have easy access to it if needed. And you nearly always do. For instance, whenever I'm giving a lecture using a projector, I can always easily plug in my laptop. That's why the efficiency of the external controller(s) is not as critical as that of the internal.
 