
Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
If these other companies are so good at SoCs, then why aren't they competing with Qualcomm? I know Apple designs their own SoCs, but can't Intel, AMD, and Nvidia compete with Qualcomm for smartphone SoCs?

Because they don't have to. But also because Qualcomm took a share of the market first. It's not a matter of power here, but inertia. Or would you argue that the GPU in a Qualcomm chip is more powerful than Intel's / AMD's iGPUs?
 

257Loner

macrumors 6502
Dec 3, 2022
456
635
Because they don't have to. But also because Qualcomm took a share of the market first. It's not a matter of power here, but inertia. Or would you argue that the GPU in a Qualcomm chip is more powerful than Intel's / AMD's iGPUs?
Just because Tesla popularized EVs first doesn't mean Ford, GM, and Chrysler are waving the white flag. They're making EVs, even though it has taken them a little longer. Ambitious people usually work at these big companies. I'd be surprised if they never tried to compete in the biggest silicon market in the entire world: smartphones.
 
Last edited:

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,626
5,482
Moving this discussion back to the topic.


It seems like a new MacBook Air 13" is set to launch next quarter. This smells like an M3 update to the MacBook Air 13". And I would expect the 15" Air to launch with the M3 in this case.

We might not see the M2 Ultra and the M2 Extreme chips. Rumors are that the M2 Extreme was already canceled. This makes sense because it'd be weird to introduce an M2 Ultra & Extreme when the base M3 has already launched.

Edit correction: It appears that this article was referring to the 15" Air, not the current 13" Air.
 
Last edited:

theorist9

macrumors 68040
May 28, 2015
3,880
3,059
3) The Mac business unit contributed the least in terms of revenue. For 2022 they were estimated to have shipped 27.911 million units worldwide vs 226.4 million iPhones in the same year.

[Image: pie chart of Apple Q1 2023 revenue by segment]
Quarterly results vary widely, and you can thus get misleading data by looking at individual quarters. For instance, if we instead looked at Q4 2022, we'd get 13% for the Mac division (https://tidbits.com/2022/10/27/apple-weathers-stormy-seas-in-q4-2022/):

[Image: chart of Apple Q4 2022 revenue by segment]



At the very least, you should average over one year, which averages out the seasonal variation. Here are the results for 2022 as a whole, which indicate that the Mac division generated 9.8% of Apple's revenue. This is half again as large as the 6.6% figure from Q1 2023 (https://fourweekmba.com/apple-revenue-breakdown/).
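To make the seasonality point concrete, here is a quick sketch in Python. The quarterly figures are hypothetical round numbers chosen to resemble the pattern above, not Apple's actual reported results:

```python
# Illustrative only: hypothetical quarterly figures (in $B) shaped like
# Apple's seasonal pattern -- NOT Apple's actual reported numbers.
quarters = {
    "Q1": {"mac": 7.7, "total": 117.2},  # holiday quarter, iPhone-heavy
    "Q2": {"mac": 10.4, "total": 97.3},
    "Q3": {"mac": 7.4, "total": 83.0},
    "Q4": {"mac": 11.5, "total": 90.1},
}

# Share computed from a single quarter vs. the full year
q1_share = quarters["Q1"]["mac"] / quarters["Q1"]["total"] * 100
annual_mac = sum(q["mac"] for q in quarters.values())
annual_total = sum(q["total"] for q in quarters.values())
annual_share = annual_mac / annual_total * 100

print(f"Q1-only Mac share:   {q1_share:.1f}%")    # ~6.6%
print(f"Full-year Mac share: {annual_share:.1f}%")  # ~9.5%
```

With these made-up numbers the holiday quarter alone shows a Mac share of ~6.6% while the full year shows ~9.5% — exactly the kind of distortion a single-quarter pie chart produces.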

Last time I checked, the Mac division by itself would rank about 100th on the Fortune 500 list.

[Image: Apple FY2022 revenue breakdown]
 
Last edited:

theorist9

macrumors 68040
May 28, 2015
3,880
3,059
I've thought for a while that an annual update cycle for the Macs made perfect sense because (a) it would facilitate a synergy between the iPhone and Mac chip design teams, and (b) consumers like the predictability of the annual update cycle for iPhones, and would likewise like that for Macs.

In fact, it makes so much sense I'm guessing it was Apple's intent to do this, but supply chain and other issues have thrown them off their game. But if things go more smoothly in the future they should be able to get them synced up within a couple of years.
 
Last edited:
  • Love
Reactions: sam_dean

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
Quarterly results vary widely, and you can thus get misleading data by looking at individual quarters. For instance, if we instead looked at Q4 2022, we'd get 13% for the Mac division (https://tidbits.com/2022/10/27/apple-weathers-stormy-seas-in-q4-2022/):

[Image: chart of Apple Q4 2022 revenue by segment]


At the very least, you should average over one year, which averages out the seasonal variation. Here are the results for 2022 as a whole, which indicate that the Mac division generated 9.8% of Apple's revenue. This is half again as large as the 6.6% figure from Q1 2023 (https://fourweekmba.com/apple-revenue-breakdown/).
You are correct. Pardon the pie chart; I was looking for one that comes directly from Apple, or at least as close to Apple as possible.
Last time I checked, the Mac division by itself would rank about 100th on the Fortune 500 list.

[Image: Apple FY2022 revenue breakdown]
What the Mac revenue does not reveal is where it derives >90% of their Mac SoC R&D money from.

It also does not show how the Mac unit shares resources with Apple's other business units and vendors. It benefits from economies of scale in purchasing SSD & RAM parts, hence the peculiar use of Low Power DDR5 on desktops, where traditionally you would use DDR5 DIMMs since a desktop is plugged into the wall rather than running off a battery.

It benefited from the tech & R&D spend of the 2007-2020 iPhone & 2010-2020 iPad. In terms of units shipped annually worldwide, iPad & iPhone are nearly equivalent to all x86 PCs of the last decade or so.

With the inclusion of the last 2 years of Mac SoCs, Apple now ships more Apple SoCs than all x86 chips combined over the past 2 years. This means it has more R&D spend than AMD & Intel combined.

That was how x86 outdid the mainframe companies, and that is how SoC vendors will now outdo AMD/Intel when they offer ARM laptops. Why laptops? Because SoC tech benefits laptops the most, and ~80% of all PCs shipped worldwide annually are laptops. When they have excess capacity, then maybe they'll make desktops.

If the Mac business unit were an independent Fortune ~100 company, it would not have the revenue to sustain its Mac SoC R&D spend on its own.

Their business would be closer to that of a PC OEM like Dell, HP, Lenovo, Asus, etc. than that of a systems vendor such as Apple.
 
Last edited:

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
Your analysis convinces me that SoCs are the next big thing in silicon.

My next question is: When will the other companies jump on the bandwagon?​

When will Intel take iGPUs seriously?
When will AMD combine one of their CPUs and GPUs into an SoC?
When will Nvidia start making CPUs?
When will Qualcomm Snapdragons find their way into laptops and desktops?

But let me go ahead and say this: Apple beat them to the punch. They skated to the "puck's" next destination.​

So, when will the competition catch up?

Since Apple has detached itself from Intel's CPUs and AMD's GPUs, I have not kept up with their actual shipped products, except for those highlighted by MR.

The problem with Intel originates from its near-monopoly over all PC OEMs from 2006-2020. This created a market environment in which Intel stalled at 14nm from 2014-2020, because further die shrinks just increased capex without meaningfully benefiting the bottom line. Yes, Intel provided a multitude of technical excuses for why the largest chip maker could not deliver the next die shrink at the pace of smaller rivals like TSMC. Notice that once Apple Silicon came out, they were able to move to 10nm.

Competition matters. Monopolies are only good for shareholders and the employees who answer to them. This is why Apple's success as an ARM PC systems vendor provides a business case for Qualcomm, MediaTek and other SoC vendors to invest in ARM PC SoCs. They just have to replicate the relevant parts of Apple's transition to their own business.

Look up AMD's APUs, some of which were used in the Sony PlayStation 4 and PlayStation 5 and the Microsoft Xbox One & Xbox Series X|S. I redirect my company's PC purchases to only those that use AMD's 7nm laptop chips, so we can enjoy better performance per watt. Looking forward to future 5nm laptop chips from them.

Qualcomm's Nuvia-based 12-core laptop processor is set to debut in 2024 with a hybrid design and dGPU support. Nuvia was founded by three ex-Apple Silicon engineers, so expect an SoC at or near parity with Apple's once they are on the same process node. This is more dangerous for AMD/Intel than for Apple.

I think Nvidia tried and failed to buy ARM.

Apple's influence on the PC, tablet, smartphone, wearables, etc. ends at their preferred price points, which show them focusing only on the top ~20% of any market, since that segment delivers ~80% of all profits.

The strategy makes sense, as it protects them legally so long as the bottom ~80% of the market can compete competently. If not, they would grow past the top 50% of a market and be accused of monopoly.

Once they hit top ~20% of any given market they start looking for new or established markets that can leverage their IP, tech and business model. This is why Apple would never enter the PC Master Race market as they aren't a parts vendor like Intel/AMD/Nvidia.
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,516
19,664
Your analysis convinces me that SoCs are the next big thing in silicon.

My next question is: When will the other companies jump on the bandwagon?​

When will Intel take iGPUs seriously?
When will AMD combine one of their CPUs and GPUs into an SoC?
When will Nvidia start making CPUs?
When will Qualcomm Snapdragons find their way into laptops and desktops?

But let me go ahead and say this: Apple beat them to the punch. They skated to the "puck's" next destination.​

So, when will the competition catch up?

Intel and AMD have been making SoCs since around 2010. Game consoles have been SoCs combining CPU+large GPU+unified RAM since 2014 or so. Apple is actually fairly late to the party.

The reason why SoCs haven't really been a thing in performance desktop computing is that desktop users expect modularity. They want to mix and match components from different vendors and create a personalised experience that way. And for your average gaming PC this approach works well enough. So the driving force is largely economic.

There are a few things we see happening right now. First, as the laptop market becomes more important, interest in more power-efficient systems is growing. Current integrated solutions from Intel and AMD are already good enough to have made entry-level gaming dGPUs obsolete. I expect this trend to continue, with AMD and Intel introducing higher-end GPUs into their mobile products. Additionally, there are initiatives to develop a common chip-to-chip interface (UCIe), which would allow for products combining CPUs and GPUs from different vendors in a single package.
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
Intel and AMD have been making SoCs since around 2010. Game consoles have been SoCs combining CPU+large GPU+unified RAM since 2014 or so. Apple is actually fairly late to the party.

The reason why SoCs haven't really been a thing in performance desktop computing is that desktop users expect modularity. They want to mix and match components from different vendors and create a personalised experience that way. And for your average gaming PC this approach works well enough. So the driving force is largely economic.

There are a few things we see happening right now. First, as the laptop market becomes more important, interest in more power-efficient systems is growing. Current integrated solutions from Intel and AMD are already good enough to have made entry-level gaming dGPUs obsolete. I expect this trend to continue, with AMD and Intel introducing higher-end GPUs into their mobile products. Additionally, there are initiatives to develop a common chip-to-chip interface (UCIe), which would allow for products combining CPUs and GPUs from different vendors in a single package.
The SoCs that Intel & AMD have been creating are crippled by comparison to any smartphone/tablet SoC because of their business models, and because their parts go into devices that are predominantly laptops, desktops and servers.
 

leman

macrumors Core
Oct 14, 2008
19,516
19,664
The SoCs that Intel & AMD have been creating are crippled by comparison to any smartphone/tablet SoC because of their business models, and because their parts go into devices that are predominantly laptops, desktops and servers.

Crippled in which sense? Can you elaborate?
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
Crippled in which sense? Can you elaborate?

- Raw performance
- Performance per watt
- BTU/hr


The 2020 M1 SoC's iGPU had the raw performance of a dGPU like the 2016 Nvidia GeForce GTX 1050 Ti or the 2017 AMD Radeon RX 560.

What critics did not realize was that the M1 had the highest-performing desktop/laptop iGPU at its time of release.

For PC gamer types this was a joke, because their thinking was that iGPUs are garbage and the Nvidia & AMD parts were 4 & 3 years old respectively.

~80% of all PCs are laptops, and laptop users demand battery life. It is easier for a mobile SoC vendor to scale up a smartphone/tablet part to a laptop/desktop than for an x86 parts vendor to scale down their desktop part to laptop/smartphone/tablet.

PC gamer types who buy RTX 4090 and 24-core Core i9 chips make up ~1% of all PCs shipped.

Then there was talk on MR & other forums about a desktop workstation-class SoC that would deliver RTX dGPU performance.

Almost everyone thought it would never occur because no one has ever done it.

They had a point, as the business model of a parts vendor differs greatly from that of a systems vendor.

That turned out to be the M1 Ultra. Has AMD/Intel/Nvidia ever offered a laptop, much less desktop, chip with a comparable feature set? An SoC that combines something like a Threadripper Pro + RTX 3090.

Yes, Apple's comparison to the RTX 3090 was specific to certain use cases unique to the Mac, which is why performance on other benchmarks did not match up. But in truth, buyers of a Mac should be concerned with their own use case, not a PC use case.

Parts vendors are pressured to sell cheap parts, so they try to make the part's area/size as small as possible. They design for that purpose and catch up in performance via higher power draw, which the user pays for to the utility company. Since the power bill does not come from Intel/AMD/Nvidia, it does not impact their sales.

PC master race types are focused on raw performance. They do not care about power consumption or even heat, as the rich countries they live in have cheap power or a cold climate.

But if you live in a country with cheap power, or you do not pay the power bills yourself, then who cares if your PC requires a 1.5kW PSU?
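As a back-of-the-envelope illustration of why the $/kWh rate matters for a 1.5kW-class build, here is a quick running-cost sketch. The wattages, daily hours, and electricity rate are all assumed round numbers, not measurements:

```python
# Back-of-the-envelope annual electricity cost: a big gaming desktop vs.
# a low-power SoC machine. Wattages, daily hours, and the $/kWh rate
# are assumed round numbers, not measurements.
def annual_cost(avg_watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    kwh_per_year = avg_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

RATE = 0.40  # $/kWh -- assumed high-rate market
gaming_pc = annual_cost(avg_watts=600, hours_per_day=4, usd_per_kwh=RATE)
soc_desktop = annual_cost(avg_watts=60, hours_per_day=4, usd_per_kwh=RATE)

print(f"600W gaming desktop: ${gaming_pc:.0f}/yr")   # $350/yr
print(f"60W SoC desktop:     ${soc_desktop:.0f}/yr")  # $35/yr
```

The cost scales linearly with average draw, so a 10x difference in wattage is a 10x difference on the bill — invisible if you don't pay it, painful if you do.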
 

leman

macrumors Core
Oct 14, 2008
19,516
19,664
- Raw performance
- Performance per watt
- BTU/hr


The 2020 M1 SoC's iGPU had the raw performance of a dGPU like the 2016 Nvidia GeForce GTX 1050 Ti or the 2017 AMD Radeon RX 560.

What critics did not realize was that the M1 had the highest-performing desktop/laptop iGPU at its time of release.

For PC gamer types this was a joke, because their thinking was that iGPUs are garbage and the Nvidia & AMD parts were 4 & 3 years old respectively.

~80% of all PCs are laptops, and laptop users demand battery life. It is easier for a mobile SoC vendor to scale up a smartphone/tablet part to a laptop/desktop than for an x86 parts vendor to scale down their desktop part to laptop/smartphone/tablet.

PC gamer types who buy RTX 4090 and 24-core Core i9 chips make up ~1% of all PCs shipped.

Then there was talk on MR & other forums about a desktop workstation-class SoC that would deliver RTX dGPU performance.

Almost everyone thought it would never occur because no one has ever done it.

They had a point, as the business model of a parts vendor differs greatly from that of a systems vendor.

That turned out to be the M1 Ultra. Has AMD/Intel/Nvidia ever offered a laptop, much less desktop, chip with a comparable feature set? An SoC that combines something like a Threadripper Pro + RTX 3090.

Yes, Apple's comparison to the RTX 3090 was specific to certain use cases unique to the Mac, which is why performance on other benchmarks did not match up. But in truth, buyers of a Mac should be concerned with their own use case, not a PC use case.

Parts vendors are pressured to sell cheap parts, so they try to make the part's area/size as small as possible. They design for that purpose and catch up in performance via higher power draw, which the user pays for to the utility company. Since the power bill does not come from Intel/AMD/Nvidia, it does not impact their sales.

PC master race types are focused on raw performance. They do not care about power consumption or even heat, as the rich countries they live in have cheap power or a cold climate.

But if you live in a country with cheap power, or you do not pay the power bills yourself, then who cares if your PC requires a 1.5kW PSU?

Well, you are talking about x86 laptop SoCs vs Apple laptop SoCs. But this is very far from "any smartphone/tablet SoC". A modern x86 laptop SoC will run circles around an average smartphone/tablet SoC. Not in terms of power usage obviously, but raw performance, sure.
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
Well, you are talking about x86 laptop SoCs vs Apple laptop SoCs. But this is very far from "any smartphone/tablet SoC". A modern x86 laptop SoC will run circles around an average smartphone/tablet SoC. Not in terms of power usage obviously, but raw performance, sure.

Year 2017 SoCs....

[Image: Geekbench chart of the A11 Bionic vs flagship Qualcomm, Samsung and Huawei SoCs]


Source: https://appleinsider.com/articles/1...erates-top-chips-from-qualcomm-samsung-huawei


2013-2021: iPhone chips vs Intel desktop chips

[Image: Apple A-series vs Intel single-core performance trajectory, 2013-2021]


Source: https://www.anandtech.com/show/16226/apple-silicon-m1-a14-deep-dive/4

The reason why Apple has the efficiency edge is mainly its lead in process node (Intel's 14nm/10nm vs Apple's 5nm/3nm), PDN (power delivery network) tech, and packaging.

=======

Aligning the Mac SoC refresh cycle with the iPhone SoC's would further optimize supply chain efficiency. That's why this rumor is so plausible.

It is like the rumors, as early as 2017, of Apple transitioning from Intel to its own chips. By that time Apple had a decade of iPhone SoC R&D under its belt, with unit volumes equivalent to Intel's x86 chips.
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,516
19,664

Wasn't your original argument that Intel SoCs are worse than phone SoCs? Yet the data you show illustrates that x86 is much faster than any non-Apple SoC. This has changed a bit in recent times as Android flagships started integrating fast ARM laptop cores, of course.

The reason why Apple has the efficiency edge is mainly its lead in process node (Intel's 14nm/10nm vs Apple's 5nm/3nm), PDN (power delivery network) tech, and packaging.

Yes, all these things matter, but the main reason is still that Apple specifically optimises for high-performance, low-frequency operation thanks to its extremely wide and deep core design.
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
Wasn't your original argument that Intel SoC's are worse than phone SoCs? Yet the data you show illustrates that x86 is much faster than any non-Apple SoC. This has changed a bit in recent times as Android flagships started integrating fast ARM laptop cores of course.
As far back as 2017, phone SoCs were nearing laptop-class raw performance on a fraction of a laptop battery's mAh.

When you think of it that way, you pause and realize that the tech & R&D put into phone SoCs is far superior to any Intel chip with an iGPU.

There is also another meaning to "crippled by comparison to any smartphone/tablet SoC because of their business models":

Apple is a systems vendor, meaning they sell the finished product, not just the processors. So they can use some parts of the vertical process to subsidize others. Apple can afford to make very good SoCs because they don't sell those chips elsewhere, so they are not as pressured to make them "cheap" in terms of die area, for example; they recoup the profit elsewhere in the product.

In contrast, AMD and Intel sell their processors to OEMs, so they only profit from the processor, not the finished system. They have to prioritize cost by optimizing their designs for area first and power second. This is why both AMD and Intel use smaller cores, which allow for smaller dies but have to be clocked faster to compete on performance; unfortunately, that also increases power.

This is probably the key difference: Apple can afford the larger design that is more power-efficient for the same performance, whereas AMD/Intel have to aim for the smaller design that is less so.

Yes, all these things matter, but the main reason is still that Apple specifically optimises for high-performance, low-frequency operation thanks to its extremely wide and deep core design.

I agree with you.

Apple's ARM cores are actually more complex than their x86 competitors: significantly wider and with larger resources for out-of-order execution and speculation. Most people assume there is some kind of "magic" that makes ARM better than x86, but that is not the case. The ISA has little impact on overall power consumption given the same microarchitectural resources.

Apple uses their larger, more complex cores to their advantage by running them at a slower clock rate while doing more work per clock cycle. This lets them operate at the frequency/power sweet spot for their process. Note that power consumption increases significantly (far faster than linearly) as frequency rises.

That's why, when Apple wants to double the performance of an M1 Max, they make an M1 Ultra.

An M2 Extreme is rumored for the Mac Pro, which would roughly quadruple M2 Max performance.
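That superlinear relationship can be sketched with the textbook dynamic-power model, P ≈ C·V²·f; because voltage must also rise roughly with frequency near the top of the range, power grows roughly as f³. The IPC, frequency, voltage, and capacitance figures below are purely illustrative, not measured silicon:

```python
# Toy model of dynamic CPU power, P ~ C * V^2 * f. Near the top of the
# frequency range, voltage must rise roughly with frequency, so power
# grows roughly as f^3. All figures below are illustrative, not silicon.
def dynamic_power(cap: float, volts: float, freq_ghz: float) -> float:
    return cap * volts**2 * freq_ghz

# "Narrow" core: IPC 4 at 5.0 GHz, needing 1.3 V
narrow_perf = 4 * 5.0
narrow_power = dynamic_power(cap=1.0, volts=1.3, freq_ghz=5.0)

# "Wide" core: IPC 8 at 2.5 GHz, at a lower 0.9 V, with twice the
# switched capacitance to pay for the wider structures
wide_perf = 8 * 2.5
wide_power = dynamic_power(cap=2.0, volts=0.9, freq_ghz=2.5)

print(narrow_perf, wide_perf)     # equal throughput: 20.0 20.0
print(narrow_power / wide_power)  # narrow core burns ~2x the power
```

Same throughput either way, but the wide, slow core pays for its extra transistors with a much better spot on the voltage/frequency curve — which is the tradeoff being described above.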
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,516
19,664
As far back as 2017, phone SoCs were nearing laptop-class raw performance on a fraction of a laptop battery's mAh.

When you think of it that way, you pause and realize that the tech & R&D put into phone SoCs is far superior to any Intel chip with an iGPU.

It's not that simple. Sure, a Cortex-X3 in the current Snapdragon 8 Gen 2 approaches the single-core performance of current entry-level laptop x86 CPUs, but the power consumption has also increased significantly. I mean, Android phone makers are now disabling the "fast" core for regular applications because it draws too much power for a smartphone. So no, I can't say I agree. These are devices designed to solve different problems; I wouldn't claim that any of this technology is inherently superior. Apple of course is in its own category, but they are pretty much the only vendor that can deliver desktop-class performance with smartphone-like power usage.
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
It's not that simple. Sure, a Cortex-X3 in the current Snapdragon 8 Gen 2 approaches the single-core performance of current entry-level laptop x86 CPUs, but the power consumption has also increased significantly. I mean, Android phone makers are now disabling the "fast" core for regular applications because it draws too much power for a smartphone. So no, I can't say I agree. These are devices designed to solve different problems; I wouldn't claim that any of this technology is inherently superior. Apple of course is in its own category, but they are pretty much the only vendor that can deliver desktop-class performance with smartphone-like power usage.
Different business models induce them to behave X, Y, and Z rather than A, B, or C.

We can atomize the argument to the transistor-level but that's just a particle of the whole package.

Any of the CPU or GPU makers could replicate Apple's tech given the R&D money, as they have similarly talented manpower.

Difference is that their business model forces them to do certain things a certain way to be competitive and profitable.

Parts vendor vs System vendor.

Apple haters will say you cannot upgrade CPU, GPU, RAM & SSD by making Macs into an overgrown iPhone/iPad but these customers are largely PC gamers or gear heads who buy PC parts more often than new articles of clothing.

Nothing wrong with that, but it is just a different use case, one that a minority of users have.

The danger of this transition of Windows PCs to ARM is that the economies of scale of x86 PCs will drop to what I think would be ~20% of today's level within two decades.

Why? Because ~80% of all PCs shipped worldwide annually are laptops.

x86 will only appeal to users who need legacy hardware/software support, and maybe those who want to upgrade more often than they see their estranged parents.
 

Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
Apple haters will say you cannot upgrade CPU, GPU, RAM & SSD by making Macs into an overgrown iPhone/iPad but these customers are largely PC gamers or gear heads who buy PC parts more often than new articles of clothing.

This paragraph is very confusing.
The way it is worded, it sounds like you can upgrade CPU, GPU, RAM and SSD by making Macs into overgrown iPhones / iPads.

What does that even mean?
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
This paragraph is very confusing.
The way it is worded, it sounds like you can upgrade CPU, GPU, RAM and SSD by making Macs into overgrown iPhones / iPads.

What does that even mean?

I'll rephrase it.

Apple is making Macs into overgrown iPhones & iPads in the sense that you cannot replace any parts inside for a future upgrade.

These parts that cannot be altered would be

- CPU
- GPU
- RAM
- SSD

Those pointing these out tend to be PC gamers or gear heads who do those parts upgrades more often than they buy a new

- t-shirt
- underwear
- socks
- pants
- shorts
- shoes
- belts
- jackets
 
  • Like
Reactions: AlphaCentauri

salamanderjuice

macrumors 6502a
Feb 28, 2020
580
613
I'll rephrase it.

Apple is making Macs into overgrown iPhones & iPads in the sense that you cannot replace any parts inside for a future upgrade.

These parts that cannot be altered would be

- CPU
- GPU
- RAM
- SSD

Those pointing these out tend to be PC gamers or gear heads who do those parts upgrades more often than they buy a new

- t-shirt
- underwear
- socks
- pants
- shorts
- shoes
- belts
- jackets
It's all about waste reduction. Why throw away perfectly good stuff when the old stuff works just fine with an extra $100 part? Same goes for the clothes. A LOT of energy goes into producing new tech. Most of the energy used over the lifetime of a laptop goes towards its manufacture, for example. If an extra 16GB of RAM or 1TB of storage can keep it in service longer, then great. Way better for the environment to keep using what already works, with small modifications, than to frequently replace devices, even if those devices are marginally more power efficient.
 

Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
I'll rephrase it.

Apple is making Macs into overgrown iPhones & iPads in the sense that you cannot replace any parts inside for a future upgrade.

These parts that cannot be altered would be

- CPU
- GPU
- RAM
- SSD

The advantage of a modular PC is not just for "gamers" who want "the latest specs".
Modular parts increase repairability.

Here's a real-life example: last year, a power failure damaged my motherboard components and made it unstable, so it would reboot randomly. I had to take it to a shop to get the parts tested and buy a new part.

It stung, but what if it were a Mac Studio? I'd probably have to buy a shiny new one because I can't just replace the motherboard (logic board, in Mac terminology) and keep the SSD. Apple probably would refuse to cover damage due to a power failure.

Even if they did, once the warranty expires, you are toast.
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
It's all about waste reduction. Why throw away perfectly good stuff when the old stuff works just fine with an extra $100 part? Same goes for the clothes. A LOT of energy goes into producing new tech. Most of the energy used over the lifetime of a laptop goes towards its manufacture, for example. If an extra 16GB of RAM or 1TB of storage can keep it in service longer, then great. Way better for the environment to keep using what already works, with small modifications, than to frequently replace devices, even if those devices are marginally more power efficient.
Not arguing the positives for the planet or the user but...

The typical user will just replace the whole thing:

- per Apple: four years
- per Intel: 5-6 years
- after final Security Update: ~10 years (like me)
- after OCLP (OpenCore Legacy Patcher): ~20 years (other users who encourage me to be like them)

The clothes comparison is to highlight the spending priorities of PC gamers and gear heads. They prefer a modular desktop PC so they can buy the next RTX 30, 40, 50, 60, 70, 80, 90, 100, etc. dGPU while retaining almost all compatible parts.

But that sort of user is quickly disappearing as demonstrated by

- ~80% of PCs being shipped annually worldwide are laptops
- annual worldwide shipments of PCs have been dropping since 2007 and only recovered during 2020-2022

~80% of that ~80% are ultrabook buyers, who tend to buy x86 laptops that share the design decisions of the MBA.

If the only tool you have is a hammer, it is tempting to treat everything as if it were a nail. So to PC gamers or gear heads, any SoC with an iGPU is a joke, as all PCs to them have a dGPU, even though annual worldwide shipment figures never reflect that.

TL;DR: Not being able to replace parts for the purpose of upgrading isn't an issue for users who are not PC gamers or gear heads
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
The advantage of a modular PC is not just for "gamers" who want "the latest specs".
Modular parts increase repairability.

Here's a real-life example: last year, a power failure damaged my motherboard components and made it unstable, so it would reboot randomly. I had to take it to a shop to get the parts tested and buy a new part.

It stung, but what if it were a Mac Studio? I'd probably have to buy a shiny new one because I can't just replace the motherboard (logic board, in Mac terminology) and keep the SSD. Apple probably would refuse to cover damage due to a power failure.

Even if they did, once the warranty expires, you are toast.
Sadly, for the typical buyer it appears not to be a priority.

I also suspect that Apple has difficulty making repairability economically viable for them.

It could be the cost of repair labor, or repair quality issues that necessitate repeat repairs, or even an outright refurb replacement because the repair person botched the job.

I would not be surprised if there are cases of fraud where a repair person files a bogus parts request and sells the part after the fact.

Modular PCs have been a topic of much discussion among Apple users for a quarter of a century. I think people's purchasing decisions drive Apple & other companies to make specific design choices.
 
  • Like
Reactions: Joe Dohn

salamanderjuice

macrumors 6502a
Feb 28, 2020
580
613
Most x86 laptops still let you upgrade the SSD in the form of an M.2 slot. I'm actually struggling to think of one that has soldered storage like a modern MacBook; even small devices like the Steam Deck don't. Some even have multiple M.2 slots. The WiFi card is also typically upgradable.

Right now soldered RAM is somewhat common, but it's still not hard to find PC laptops with RAM slots. And that may change again with the upcoming CAMM RAM standard, which will allow for things like replaceable LPDDR5 and thinner laptops than SO-DIMM slots allow, while increasing speeds.
 
  • Like
Reactions: sam_dean

bcortens

macrumors 65816
Aug 16, 2007
1,324
1,796
Canada
The advantage of a modular PC is not just for "gamers" who want "the latest specs".
Modular parts increase repairability.

Here's a real-life example: last year, a power failure damaged my motherboard components and made it unstable, so it would reboot randomly. I had to take it to a shop to get the parts tested and buy a new part.

It stung, but what if it were a Mac Studio? I'd probably have to buy a shiny new one because I can't just replace the motherboard (logic board, in Mac terminology) and keep the SSD. Apple probably would refuse to cover damage due to a power failure.

Even if they did, once the warranty expires, you are toast.

This doesn't really require modularity though. If you really want repairability, advocate for the right to desolder chips and for more widespread training in this kind of repair. That motherboard you threw out could probably have been fixed too if the tools were available. Apple should be doing this as well: taking back a whole damaged component, identifying the part that actually failed, and replacing it.
 