It's common knowledge for anyone who has used a PC in the past two decades.

If you've worked in an office of any kind that uses Dell desktops, those systems at minimum supported two displays. The Intel chip itself supported at least three. MacBook Pros of that era also used Intel chips. Here's a $42 Celeron launched in 2013 that supports three displays.


Transistor count? Ivy Bridge is 1.4B. M3 is 25B.

Well… it is also common knowledge that the resolution and the refresh rate supported by a Celeron processor are a fraction of those supported by a modern processor.
 
It's an assumption based on the estimated transistor counts for the 8th-gen lineup of U-series Intel CPUs (most of which are estimated to have under one billion transistors). Die shots of Coffee Lake era chips can be found on this page, which show the relative size of the system agent (which contains the display controllers and I/O bus, among other things).

I am using the 8th gen CPUs primarily because this is what I am familiar with from the research I had done previously. If anyone knows where more precise transistor count information can be found, it would be very useful for making better comparisons on this front (Intel doesn't usually share official information for these, so we have to rely on estimations).

Fair enough, but what is the actual/estimated number of transistors used in the controller of the M3 vs the number of transistors used in the controller of the Intel chip?
Just looking at the overall transistor count for the entire CPU is probably misleading.
For starters, the number of GPUs, CPU cores, and dedicated engines is different between the two CPUs. I say this in general terms as I have neither the time nor the knowledge to make an accurate comparison of the two.

The point I am making is that people in this thread seem to be comparing the “size” of CPU components based on their relative area on the chip compared to the processing cores, and on the overall estimated transistor counts of the CPUs, without taking into account all the additional functions and components that one CPU has compared to the other.

I cannot stress enough that this is just the impression I am getting. I am not suggesting I am right and you are wrong; far from it. I am just asking for further explanations.
In any case, as per your original post, if the “sizes” are indeed significantly different it would be good to know why.
 
Fair enough, but what is the actual/estimated number of transistors used in the controller of the M3 vs the number of transistors used in the controller of the Intel chip?
Just looking at the overall transistor count for the entire CPU is probably misleading.
For starters, the number of GPUs, CPU cores, and dedicated engines is different between the two CPUs. I say this in general terms as I have neither the time nor the knowledge to make an accurate comparison of the two.

The point I am making is that people in this thread seem to be comparing the “size” of CPU components based on their relative area on the chip compared to the processing cores, and on the overall estimated transistor counts of the CPUs, without taking into account all the additional functions and components that one CPU has compared to the other.

I cannot stress enough that this is just the impression I am getting. I am not suggesting I am right and you are wrong; far from it. I am just asking for further explanations.
In any case, as per your original post, if the “sizes” are indeed significantly different it would be good to know why.
I mean, these are completely fair points. This whole thread relies on assumptions to some extent because it's difficult to get exact numbers out of Intel. The size difference is assumed from the fact that each display engine individually takes up nearly the area of two P-cores on Apple Silicon, whereas all display engines combined on Intel CPUs appear to take up a similar area to a single Intel P-core (so even without the numbers, it does at least strongly appear that there is a size difference).

This whole thread is really more of a "let's discuss and deep dive to learn about the chip" kind of discussion; we just don't have the exact numbers to really go beyond that. We'd need to talk to the engineers who designed these chips to do that.
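
For anyone who wants to put rough numbers on it, here's a minimal back-of-the-envelope sketch (Python) of the kind of estimate being discussed: scale the chip's total transistor count by a block's share of die area, assuming roughly uniform transistor density. The area figures in the example are made-up placeholders, not measurements, so treat the output as an order-of-magnitude guess at best.

Code:
# Back-of-the-envelope estimate of a block's transistor budget from die-shot
# area measurements, assuming roughly uniform transistor density.
# The example inputs below are hypothetical placeholders, not measured values.

def block_transistors(total_transistors: float, block_area_mm2: float,
                      die_area_mm2: float) -> float:
    """Scale the chip's transistor count by the block's share of die area.

    Uniform density is wrong in detail (SRAM, logic, and analog/IO blocks
    differ a lot), so this only gives an order-of-magnitude figure.
    """
    return total_transistors * (block_area_mm2 / die_area_mm2)

# Hypothetical example: M3 is ~25B transistors. If one display engine were
# measured at, say, 4 mm^2 on a ~150 mm^2 die (made-up numbers):
estimate = block_transistors(25e9, 4.0, 150.0)
print(f"~{estimate / 1e9:.2f}B transistors")  # ~0.67B with these placeholder inputs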
 
Well… it is also common knowledge that the resolution and the refresh rate supported by a Celeron processor are a fraction of those supported by a modern processor.
While that’s true, if the M-series machines want to compete in the office environment, they should support at least as many displays as their competitors.

(Not that I think Apple is particularly interested in that market segment.)
 
While that’s true, if the M-series machines want to compete in the office environment, they should support at least as many displays as their competitors.

(Not that I think Apple is particularly interested in that market segment.)
I imagine Apple probably expects people to just use an iMac for that use case (not sure how practical that is either).
 
If nobody is connecting an external display, then Apple wouldn't need to emphasize support for a 6K monitor.

Not sure where you've been living for the past three years. Everyone is doing remote work or hybrid work. If you're at a desk, why wouldn't you have a monitor?
That's just not how it works. At all. A tiny fraction of people who buy laptops use them as pseudo desktops. Not sure where you've been for the entire history of computing, but that's how it breaks down.
 
Actually, it was an excellent argument. You simply didn't understand it. I'm talking about business use as it is now, where dual office externals are common, and where businesses would like to issue employees laptops instead of desktops so they can move easily between office and home. Yet for some bizarre reason you're referencing 1995, when people used single VGAs and didn't commonly work from home.

The fact that the best you could do was to say "poor, poor, poor", rather than actually give a substantive response, shows how vapid your counterargument is. Seriously, do you somehow think the strength of your argument is in proportion to how many times you use that adjective?
The fact that you think I'm referencing 1995 specifically by referencing Windows 95 in office spaces tells me how little you know about the topic to begin with.
 
No? It's really common for businesses to deploy laptops and docks so employees can take them to meetings or home if need be. Something like a M1 MBA should be able to chew through basic office tasks.
That's not common AT ALL in commercial spaces. I don't understand why people post known false information as if it were true, just to make themselves feel like they have something to contribute.
 
That's not common AT ALL in commercial spaces. I don't understand why people post known false information as if it were true, just to make themselves feel like they have something to contribute.
Ever since 2020, every time the external display limit of the M1 (then M2, M3) is discussed, I feel like exactly what you described has been happening, every single time.

Of course nobody in a thread here has the concrete marketing data that Apple has, but a lot of us work in the real world and have a pretty close handle on this. In fact, even the single-external-display capability is wasted on these MacBook Airs probably 95% of the time (again, that's pulled out of my semi-informed ass).

I mean, it would be nice if Apple were to spend hardware budget on features that not everybody needs, but perhaps some users do, and I happen to be that user. I can see the same case being made for the SD card slot and ray-tracing cores; these are also not needed for 90%+ of targeted uses of MacBooks, but they exist on the base 14" now. So why be picky only about the display engine?
 
Well… it is also common knowledge that the resolution and the refresh rate supported by a Celeron processor are a fraction of those supported by a modern processor.

We already gave you the hint that Celeron can support multiple external monitors.

The last Intel-based MacBook Air is powered by an Ice Lake processor, which is around 7 billion transistors per Intel. That MBA supports one 6K@60 or two 4K@60. Compare that with the 25 billion of M3.
 
Ever since 2020, every time the external display limit of the M1 (then M2, M3) is discussed, I feel like exactly what you described has been happening, every single time.

Of course nobody in a thread here has the concrete marketing data that Apple has, but a lot of us work in the real world and have a pretty close handle on this. In fact, even the single-external-display capability is wasted on these MacBook Airs probably 95% of the time (again, that's pulled out of my semi-informed ass).

I mean, it would be nice if Apple were to spend hardware budget on features that not everybody needs, but perhaps some users do, and I happen to be that user. I can see the same case being made for the SD card slot and ray-tracing cores; these are also not needed for 90%+ of targeted uses of MacBooks, but they exist on the base 14" now. So why be picky only about the display engine?
It's really about cheap IT departments. It costs almost nothing to put 2 Full-HD 1920x1080 displays on people's desks, thereby giving the illusion of a huge desktop space. But compare 2 FHD displays at 4.15M pixels to a single 5K display with 14.75M pixels. Even with Apple's standard 2x Retina scaling, that's still almost double the number of pixels on the 2 FHD displays. Even with a 4K display using Apple's 2x standard you get the same number of pixels, and 4K displays are cheap and almost always significantly better displays than the 1080p displays most are talking about being standard on office desks.

With MacBooks it is better to use a good high-density display rather than two low-density but cheap displays.

Edit: My math is totally messed up with 4K and 5K Retina. Sorry for the misinformation.
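
For reference, here's a quick Python sanity check of the numbers involved, assuming macOS's default 2x Retina scaling (where the "looks like" workspace is halved in each dimension):

Code:
# Raw pixel counts vs. the effective "looks like" workspace under 2x scaling.

def pixels(w: int, h: int) -> int:
    return w * h

def looks_like(w: int, h: int, scale: int = 2) -> int:
    # With 2x Retina scaling the usable workspace is (w/scale) x (h/scale) points.
    return (w // scale) * (h // scale)

two_fhd = 2 * pixels(1920, 1080)  # 4,147,200 raw pixels
uhd_4k = pixels(3840, 2160)       # 8,294,400 raw pixels
five_k = pixels(5120, 2880)       # 14,745,600 raw pixels

print(f"2x FHD raw: {two_fhd:,}")
print(f"4K raw:     {uhd_4k:,}  looks like {looks_like(3840, 2160):,} points")
print(f"5K raw:     {five_k:,}  looks like {looks_like(5120, 2880):,} points")

# In raw pixels the 5K panel is ~3.5x two FHD screens, but in 2x-scaled
# workspace it offers fewer points (3.69M vs 4.15M), which is roughly where
# the comparison above goes sideways.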
 
We already gave you the hint that Celeron can support multiple external monitors.

The last Intel-based MacBook Air is powered by an Ice Lake processor, which is around 7 billion transistors per Intel. That MBA supports one 6K@60 or two 4K@60. Compare that with the 25 billion of M3.

Thanks for "the hint".
You mentioned the Celeron as a relevant example, and I replied only to note that it might not be that relevant after all.

As for the transistor numbers, I am not going to repeat here what I already said in my other posts about how they should be seen in the context of what else the CPU is doing with them, such as the number of CPU and GPU cores, the number of the GPU's EUs and ALUs, neural engine cores, hardware decoders, media engines, caches, etc.

Basically I think we should be comparing apples with apples, and even better Apple with Apple.
If Apple has used more transistors in their video controller, there is a valid reason for it, and if there is a purpose to this thread it should be to understand it.
 
With MacBooks it is better to use a good high-density display rather than two low-density but cheap displays.
I feel this is the key takeaway, if we look at Apple's decision as more about marketing than design. They would rather cater to creatives; when those users plug in a display, chances are it is a modern one with 4K / HDR or even Apple's Studio / XDR displays. The business “productivity” setup crowd simply isn't what they are going for, despite the fact that a Mini or an Air could be a pretty adequate deployment for that role, if so desired. That doesn't mean a realistic market is there, though.

Here is my personal anecdote: in my small music / publishing related studio, most seats are deployed with a single 27-28" 4K main display connected to a Mac, plus a second 19" or smaller sub-FHD display connected to a Dell OptiPlex mini PC. The Macs are for actual computing, while the Dells are for accessing a company accounting system. No machine connects to more than one external display at any time.
 
I feel this is the key takeaway, if we look at Apple's decision as more about marketing than design. They would rather cater to creatives; when those users plug in a display, chances are it is a modern one with 4K / HDR or even Apple's Studio / XDR displays. The business “productivity” setup crowd simply isn't what they are going for, despite the fact that a Mini or an Air could be a pretty adequate deployment for that role, if so desired. That doesn't mean a realistic market is there, though.
Speaking anecdotally as well, I suspect the reason Apple is less interested in business productivity is that the big players such as Dell, HP, Lenovo, etc. offer big discounts and incentives to buy bulk pallets of el cheapo laptops and monitors as part of larger workstation orders.

They’re not likely to compete in the specific area that the workstation serves, and the people in charge of procurement are more likely to just take the cheap junk laptops over placing a separate order with Apple.

So, taking that into account at the design phase, it makes more sense to play to their strength, which is a single, high-resolution display (and thus a large but power-efficient display engine).
 
Speaking anecdotally as well, I suspect the reason Apple is less interested in business productivity is that the big players such as Dell, HP, Lenovo, etc. offer big discounts and incentives to buy bulk pallets of el cheapo laptops and monitors as part of larger workstation orders.

They’re not likely to compete in the specific area that the workstation serves, and the people in charge of procurement are more likely to just take the cheap junk laptops over placing a separate order with Apple.

So, taking that into account at the design phase, it makes more sense to play to their strength, which is a single, high-resolution display (and thus a large but power-efficient display engine).
The Apple Business rep for my company talked about this a bit. The traditional, Dell-like business suppliers have so much more room for bulk discounting that Apple simply doesn't (want to) compete. The same-day support, abundance of spare parts, etc. are also not something Apple can offer (only the iPad has express service where they send you a blank replacement before getting your broken one). So the rep said they would rather cater to studios, or institutions where there probably isn't even dedicated IT personnel handling things.

I feel this must have factored into their design decisions. Also, chip design lead times are probably too long for changes to happen soon enough; for instance, the surge of demand for dual external displays during work-from-home arrived fast and ended just as fast, so even if Apple had wanted to react they just couldn't have done it fast enough. We saw the same thing with the Face ID mask-unlock feature: by the time it finally launched, the mandates had already ended in most parts of the world.
 
I feel this must have factored into their design decisions. Also, chip design lead times are probably too long for changes to happen soon enough; for instance, the surge of demand for dual external displays during work-from-home arrived fast and ended just as fast, so even if Apple had wanted to react they just couldn't have done it fast enough. We saw the same thing with the Face ID mask-unlock feature: by the time it finally launched, the mandates had already ended in most parts of the world.
Generally agree with your sentiment, but on the WFH front my intuition would be that it decreases the number of displays in use. When I work from home I just use my iMac. When I am in the office I use my MBP and an external. When my girlfriend is in the office she uses three displays counting the internal one. When at home, just the laptop. My other colleagues in my office are a mixed bag, but it's either the same quantity at home or fewer, one going from five to two. Again, all counting built-in displays where applicable.
 
With MacBooks it is better to use a good high-density display rather than two low-density but cheap displays.
If by "better" you mean marvel at the great colors and high resolution, sure. But if you have a few tabs of reference information open for the document or project you are currently working on and need some screen space for a messenger because you are discussing stuff with your colleagues, then no, I'd take the two "low-density" displays any day.
 
These are not office computers. They're not replacements for the Windows 95 boxes plugged into the VGA monitors that populate office spaces. That's a poor, poor, poor argument and comparison.
That's just not how it works. At all. A tiny fraction of people who buy laptops use them as pseudo desktops. Not sure where you've been for the entire history of computing, but that's how it breaks down.
The fact that you think I'm referencing 1995 specifically by referencing Windows 95 in office spaces tells me how little you know about the topic to begin with.
That's not common AT ALL in commercial spaces. I don't understand why people post known false information as if it were true, just to make themselves feel like they have something to contribute.
What a pleasant person. Posts absurdly false things, insults anyone who points the falsehoods out.

Why did you start talking about Windows 95? What does it have to do with anything in 2023?

IT departments love to issue employees laptops as their only computer because they can take the whole thing to meetings. Some take notes with them, others present with them, etc. Laptops can also easily be taken home when there's a need to work on something outside normal work hours. The moment laptops got reasonably cheap and capable, corporate IT was all over them.

But most of the time these laptops live on the employee's desk with the lid closed and one or two external displays, mouse, and keyboard: in short, laptop-as-desktop. If you continue trying to pretend this practice isn't common as dirt, all of us with actual recent experience in corporate cube farms will heartily laugh at you. (Well, some of us may have already begun.)
 
What a pleasant person. Posts absurdly false things, insults anyone who points the falsehoods out.

Why did you start talking about Windows 95? What does it have to do with anything in 2023?

IT departments love to issue employees laptops as their only computer because they can take the whole thing to meetings. Some take notes with them, others present with them, etc. Laptops can also easily be taken home when there's a need to work on something outside normal work hours. The moment laptops got reasonably cheap and capable, corporate IT was all over them.

But most of the time these laptops live on the employee's desk with the lid closed and one or two external displays, mouse, and keyboard: in short, laptop-as-desktop. If you continue trying to pretend this practice isn't common as dirt, all of us with actual recent experience in corporate cube farms will heartily laugh at you. (Well, some of us may have already begun.)
The company I work for always sends laptops. Anyone who wanted a desktop had to specially request one.
 
What a pleasant person. Posts absurdly false things, insults anyone who points the falsehoods out.

Why did you start talking about Windows 95? What does it have to do with anything in 2023?

IT departments love to issue employees laptops as their only computer because they can take the whole thing to meetings. Some take notes with them, others present with them, etc. Laptops can also easily be taken home when there's a need to work on something outside normal work hours. The moment laptops got reasonably cheap and capable, corporate IT was all over them.

But most of the time these laptops live on the employee's desk with the lid closed and one or two external displays, mouse, and keyboard: in short, laptop-as-desktop. If you continue trying to pretend this practice isn't common as dirt, all of us with actual recent experience in corporate cube farms will heartily laugh at you. (Well, some of us may have already begun.)
Speaking from personal experience in the cube farm, those reasons, plus the fact that most of the computing power is in servers that employees remote into, make laptops a perfect solution.

The dominance of laptops comes from the flexibility they offer, and the tradeoff in computing power becomes a non-factor when the job involves running software on dedicated servers.
 
Assuming the M3 transistor counts are for the fully enabled bins (M3: 25B, M3 Pro: 37B, M3 Max: 92B), can anyone guess how many transistors are turned off on the 8-CPU/8-GPU regular M3, the 11-CPU/14-GPU Pro, or the 14-CPU/30-GPU Max?
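
Nobody outside Apple has the per-block numbers, but here's a rough Python framework for how such a guess could be structured. The core counts match Apple's published configurations; the per-core transistor budgets are made-up placeholders, so the printed figures are purely illustrative.

Code:
# Guessing at "dark" (fused-off) transistors in the cut-down bins.
# Core counts are Apple's published configs; per-core budgets are HYPOTHETICAL.

CONFIGS = {
    # name: (total transistors, (full CPU, full GPU), (cut CPU, cut GPU))
    "M3":     (25e9, (8, 10), (8, 8)),
    "M3 Pro": (37e9, (12, 18), (11, 14)),
    "M3 Max": (92e9, (16, 40), (14, 30)),
}

CPU_CORE_GUESS = 0.5e9  # made-up transistors per CPU core
GPU_CORE_GUESS = 0.6e9  # made-up transistors per GPU core

for name, (total, (cpu_full, gpu_full), (cpu_cut, gpu_cut)) in CONFIGS.items():
    disabled = ((cpu_full - cpu_cut) * CPU_CORE_GUESS
                + (gpu_full - gpu_cut) * GPU_CORE_GUESS)
    print(f"{name}: ~{disabled / 1e9:.1f}B of {total / 1e9:.0f}B fused off "
          f"({disabled / total:.1%}) with these placeholder budgets")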
 
What a pleasant person. Posts absurdly false things, insults anyone who points the falsehoods out.

Why did you start talking about Windows 95? What does it have to do with anything in 2023?

IT departments love to issue employees laptops as their only computer because they can take the whole thing to meetings. Some take notes with them, others present with them, etc. Laptops can also easily be taken home when there's a need to work on something outside normal work hours. The moment laptops got reasonably cheap and capable, corporate IT was all over them.

But most of the time these laptops live on the employee's desk with the lid closed and one or two external displays, mouse, and keyboard: in short, laptop-as-desktop. If you continue trying to pretend this practice isn't common as dirt, all of us with actual recent experience in corporate cube farms will heartily laugh at you. (Well, some of us may have already begun.)
FFS. A MacBook is not a corporate issue machine. End of discussion.
 