
sam_dean

Suspended
Sep 9, 2022
1,262
1,091
Sales figures taken like that prove nothing about the decline of the PC market.


PC sales finally saw big growth in 2020 after years of steady decline

1/15/2021, 1:13 AM

During the Consumer Electronics Show this week, research firm IDC released a report on worldwide traditional PC sales in 2020, and it tells a rosier story than we've been used to in recent years. In the fourth quarter of 2020, PC shipments grew 26.1 percent over the same period last year.


That means 13.1 percent year-over-year growth overall, and the best year and quarter for PC sales in quite some time. In total, 91.6 million traditional PCs were shipped in the fourth quarter of 2020. "Traditional PCs" in IDC's report include systems like desktops, laptops, and workstations. For years, sales of these kinds of computers were declining at worst or growing negligibly at best even as other, newer computing gadget categories like smartphones, smart speakers, and tablets grew relatively rapidly.

IDC notes that the last time the market saw this kind of growth was way back in 2010, when modern multitouch smartphones were still building momentum and Apple's very first iPad had only just launched.

The growth was unsurprisingly largely "centered around work from home and remote learning needs," according to the report. But it also notes that segments unrelated to that, like gaming PCs and monitors, also saw significant growth over the course of the year. The overall growth is also partly due to the fact that "Chrome-based devices are expanding beyond education into the consumer market," according to IDC Vice President Ryan Reith.

=====


Because the iPhone/Android lengthened the PC replacement cycle from GAAP's 3 years to 4 years (per Apple) and 5-6 years (per Intel).

In the first decade of the iPhone/Android, the typical consumer preferred to replace their phone every 2 years for a better camera. The last half decade saw smartphone replacement cycles lengthen from 2 years to 3-4 years or longer because there hasn't been any meaningful reason to upgrade.

Steve Jobs did/said a lot of controversial things

- 1998: legacy-free PC; the 1st iMac came out with USB and no serial/parallel/PS/2/floppy ports
- 2001: digital hub
- 2007: post-PC era, the year the 1st iPhone came out
- 2008: killing FireWire & the 1st popular ultrabook without an optical drive
- 2010: 1st time worldwide PC shipments fell & release of the 1st iPad
- 2012: last Mac with an optical drive, discontinued in 2016
- 2012: Steve Jobs had refused to issue a dividend, but Tim Cook did so this year, after Jobs' passing
 
Last edited:

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
If MS truly cared about Windows on ARM development, they should start on offering a simple product: Official Windows virtualization for Apple Silicon.
I'd love to see something more official, and better compatibility with windows on my M1
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
I'll enjoy seeing them collect dust on the shelves... I'm out of this thread

;-)

[Image: Picture-2-768x461.png]


 

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,626
5,482
;-)

[Image: Picture-2-768x461.png]


I have a feeling that this chart underrates Apple Silicon's ability to grab a lot of market share and Qualcomm's aggressiveness in PC laptops.

Apple needs to release two more laptops:
  • 13" Macbook SE using the old Air design. Updated once every 2 years with newest base M SoC. $899.
  • 15.5" Macbook Air. $1499.
I could see Apple Silicon market share accelerating once these two laptops are released.
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
I have a feeling that this chart underrates Apple Silicon's ability to grab a lot of market share and Qualcomm's aggressiveness in PC laptops.

Apple needs to release two more laptops:
  • 13" Macbook SE using the old Air design. Updated once every 2 years with newest base M SoC. $899.
  • 15.5" Macbook Air. $1499.
I could see Apple Silicon market share accelerating once these two laptops are released.
Read the end of the linked article.

From 2020-2022, COVID forced anyone who needed a new unit to upgrade their laptop or desktop, or add more.

Entering 2023, it will go back to a 4-year, 5-6 year, ~10-year or even >10-year upgrade cycle.

Any meaningful upgrade wave may occur as early as 2024 or as late as 2028.

When ARM PCs are executed & supported properly, I expect x86 PCs to shrink to ~20% of the PC market, catering to legacy software/hardware the way mainframes have for the last 2-3 decades.
 
Last edited:
  • Like
Reactions: Scarrus

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,626
5,482
Read the end of the linked article.

From 2020-2022, COVID forced anyone who needed a new unit to upgrade their laptop or desktop, or add more.

Entering 2023, it will go back to a 4-year, 5-6 year or even ~10-year upgrade cycle.

Any meaningful upgrade wave may occur as early as 2024 or as late as 2028.

When ARM PCs are executed & supported properly, I expect x86 PCs to shrink to ~20% of the PC market, catering to legacy software/hardware the way mainframes have for the last 2-3 decades.
I assume that the chart shows yearly sales, not the total install base. So upgrade cycles don't matter as much (although Macs tend to last longer).
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
I assume that the chart shows yearly sales, not the total install base. So upgrade cycles don't matter as much (although Macs tend to last longer).
The Mac SKUs you propose would show up in future annual worldwide shipments, as they are not yet part of the total install base.

The 4-year Mac replacement cycle is based on Apple's observation of their whole install base. Same with the 5-6 year PC replacement cycle identified by Intel's CEO.

You are correct that there are people like me who stretch a Mac's useful life past the final Security Update, which arrives 8-11 years in, depending on the Mac. I do so because my use case has not changed since 2015.

I'm only upgrading because

- final Security Update was released more than 6 months ago
- preventive maintenance as some parts have failed & may lead to total failure
- halving power consumption and thermal output

I could tear it down, repair said failed parts, and clean out any gunk clogging the heatsink/fan, but it's turning 10 this weekend, and whatever time and money I spend will not keep it useful to me for another 5 years. I should have made those improvements half a decade ago.

I'm preordering the iMac 27" replacement within the hour of its release, if one comes by WWDC 2023.
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,516
19,664
I don't think the RAM limits Apple presents are physical or even technical. It's more a design decision about what ~80% of the expected configurations per SoC line would be.

The limit is technological. There are simply no higher-capacity RAM modules of this type currently in production. A way to overcome this would be either increasing the RAM interface width (and thus the number of modules used), or using a more complex memory topology where a group of channels serves multiple modules (which would probably require some sort of nested memory controllers). None of this is feasible or economical.
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
The limit is technological. There are simply no higher-capacity RAM modules of this type currently in production. A way to overcome this would be either increasing the RAM interface width (and thus the number of modules used), or using a more complex memory topology where a group of channels serves multiple modules (which would probably require some sort of nested memory controllers). None of this is feasible or economical.
My simplistic solution is to have logic gates enable the memory data channels or pull them into high-impedance mode when not in use. One inverter gate tied to the other memory bank's address MSB line would be enough to double the amount of memory ... haha.
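To make the idea concrete, here's a toy Python model of that decode (bank sizes and bit counts are invented for illustration; real LPDDR signaling is far more involved than this):

Code:
# Toy model of the bank-select idea above: the extra address MSB drives one
# bank's chip-select directly and the other through an inverter, so a single
# gate doubles the addressable memory. Sizes are made up for illustration.
BANK_BITS = 4                       # each toy bank decodes 4 address bits
BANK_SIZE = 1 << BANK_BITS          # 16 words per bank
bank0 = [0] * BANK_SIZE             # selected when the MSB is 0
bank1 = [0] * BANK_SIZE             # selected when the MSB is 1 (via inverter)

def select(addr):
    msb = (addr >> BANK_BITS) & 1   # the added address line
    return bank1 if msb else bank0  # the inverter enables exactly one bank

def write(addr, value):
    select(addr)[addr & (BANK_SIZE - 1)] = value

def read(addr):
    return select(addr)[addr & (BANK_SIZE - 1)]

write(3, 0xAA)              # MSB=0 routes this to bank0
write(BANK_SIZE + 3, 0xBB)  # same low bits, MSB=1 routes it to bank1
assert read(3) == 0xAA and read(BANK_SIZE + 3) == 0xBB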
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
The limit is technological. There are simply no higher-capacity RAM modules of this type currently in production. A way to overcome this would be either increasing the RAM interface width (and thus the number of modules used), or using a more complex memory topology where a group of channels serves multiple modules (which would probably require some sort of nested memory controllers). None of this is feasible or economical.

Then how did Apple bump the A12X's RAM from 6GB in the 2020 iPad Pro to 16GB in the 2020 Developer Transition Kit 3 months later?

It is more about making SKUs that >80% will actually buy rather than just <20%.

It is like all those 2019 Mac Pro users demanding 1.5TB of RAM on a future 2023 Mac Pro with a single-die or two-die M2 Ultra.

Is that 1 user per billion people? 1 per 100 million? How about 1 per 10 million? It may not be worth Apple's time to cater to, so let AMD or Intel have their business.

Same reason why no MBA or Mac mini sells brand new through non-edu channels for under $999 or $599 respectively.
 
Last edited:

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
Then how did Apple bump the A12X's RAM from 6GB in the 2020 iPad Pro to 16GB in the 2020 Developer Transition Kit months later?

It is more about making SKUs that >80% will actually buy rather than just <20%.

It is like all those 2019 Mac Pro users demanding 1.5TB of RAM on a 2023 Mac Pro with a single-die or two-die M2 Ultra.

Is that 1 user per billion? 1 per 100 million? It may not be worth Apple's time to cater to, so let AMD/Intel have their business.
Actually, it really depends on how many address lines come out of the SoC. That will limit the max amount of memory it can support. In my very amateur understanding (gosh ... my logic design classes were so long ago ...), if the address lines coming out of the memory controller can address 512GB of memory for the M1, for example, there's no technical reason why it cannot have that much memory in a Mac. It's just that the designer would then have to build multiplexers into the logic board to multiplex the data to the SoC based on the address lines ... and of course take a hit from the latency introduced by the multiplexers.
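Back-of-envelope, the bound from address lines is just a power of two (a sketch with illustrative widths; real memory controllers don't decode one flat byte address like this):

Code:
# How many address lines bound the addressable memory: n lines -> 2^n bytes.
# Widths below are picked for illustration, not taken from any Apple spec.
def max_capacity_bytes(address_lines):
    return 1 << address_lines

for n in (36, 37, 39):
    print(f"{n} address lines -> {max_capacity_bytes(n) // 2**30} GiB")
# 36 address lines -> 64 GiB
# 37 address lines -> 128 GiB
# 39 address lines -> 512 GiB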
 

leman

macrumors Core
Oct 14, 2008
19,516
19,664
Then how did Apple bump the A12X's RAM from 6GB in the 2020 iPad Pro to 16GB in the 2020 Developer Transition Kit months later?

They didn't upgrade the chip. They simply soldered on higher-capacity RAM modules, which were available at the time.

Apple is currently using up to 12GB per RAM module, so that's a maximum of 96GB for the M2 Max. I don't know whether there are 16GB modules on the market (note that Apple's RAM is a custom-design order, so there might be additional constraints). The point is: no matter how much you want 1TB of RAM, it won't magically make 64GB LPDDR5 modules appear on the market.
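For what it's worth, the arithmetic behind that 96GB ceiling works out if you assume a 512-bit memory interface split across 64-bit LPDDR5 packages (my assumptions for the sketch, not an Apple spec sheet):

Code:
# Rough math behind the "96GB max for M2 Max" figure.
interface_width = 512     # bits; assumed total memory bus width for M2 Max
module_width = 64         # bits; assumed width of each LPDDR5 package
module_capacity_gb = 12   # per the post above

modules = interface_width // module_width     # -> 8 packages
print(modules * module_capacity_gb)           # -> 96 GB ceiling
# Raising the ceiling means a wider bus (more packages) or denser packages;
# neither is available off the shelf, which is the point being made.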

Probably the most feasible way of achieving both high capacity and high performance is a tiered memory architecture, like Intel does with the latest Xeons (HBM2 on-package for performance, DDR channels for capacity).
 
  • Like
Reactions: sam_dean

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
They didn't upgrade the chip. They simply soldered on higher-capacity RAM modules, which were available at the time.

Apple is currently using up to 12GB per RAM module, so that's a maximum of 96GB for the M2 Max. I don't know whether there are 16GB modules on the market (note that Apple's RAM is a custom-design order, so there might be additional constraints). The point is: no matter how much you want 1TB of RAM, it won't magically make 64GB LPDDR5 modules appear on the market.

Probably the most feasible way of achieving both high capacity and high performance is a tiered memory architecture, like Intel does with the latest Xeons (HBM2 on-package for performance, DDR channels for capacity).
So what I said is true... it's by design. If there is no demand for a design, it becomes a technical limitation.

Like the two-die M1 Ultra making an M1 Extreme for the Mac Pro. The rumor may have originated from an internal test unit, as UltraFusion was only present on 1 side of the M1 Max chips that were fabbed.

When the rumor mill kept talking about the Jade 4C-die, I wondered on what other side the dies would meet if there were no additional provisions for UltraFusion.

If someone's willing to pay for it, Apple could do a one-off 3nm Apple Silicon SoC that uses an entire 300mm silicon wafer and stick it into a 2019 Mac Pro tower. It would probably fully utilize the 1.5kW PSU's usable output power.
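A quick scale check on the whole-wafer idea (numbers are illustrative; ~858mm2 is the commonly cited reticle limit for current lithography tools):

Code:
import math

# A 300mm wafer versus the largest die a scanner can expose in one shot.
wafer_diameter_mm = 300
wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2   # ~70,686 mm^2
reticle_limit_mm2 = 858                               # assumed max single die

print(f"wafer area: {wafer_area:,.0f} mm^2")
print(f"~{wafer_area / reticle_limit_mm2:.0f}x the reticle limit")
# A wafer-scale part would really be dozens of reticle-sized dies stitched
# together (a la Cerebras), which is why it could only ever be a one-off.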
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,516
19,664
So what I said is true... it's by design. If there is no demand for a design, it becomes a technical limitation.

That’s one way to talk about it I suppose. But don’t forget that some things can be much more difficult and expensive to make. Demand is one thing, but if nobody can build it you still have a problem.

If someone's willing to pay for it, Apple could do a one-off 3nm Apple Silicon SoC that uses an entire 300mm silicon wafer and stick it into a 2019 Mac Pro tower. It would probably fully utilize the 1.5kW PSU's usable output power.

Sure, but would you want to pay like half a million for a computer? If that's what you need, there are better products.
 
  • Like
Reactions: sam_dean

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
That’s one way to talk about it I suppose. But don’t forget that some things can be much more difficult and expensive to make. Demand is one thing, but if nobody can build it you still have a problem.
Reminds me of Intel being stuck at 14nm from 2014-2020 when rivals moved to 10nm (2016), 7nm (2018), 5nm (2020) & 3nm (2023).

Intel became a monopoly among PC OEMs from 2006-2020. Hopefully Qualcomm's NUVIA will induce competition in the Windows PC space and push Intel down to ~20% of PC market share if they refuse to reform.
 
  • Like
Reactions: Scarrus

Mr. Dee

macrumors 603
Dec 4, 2003
5,990
12,840
Jamaica
Apple isn't interested. Neither is Microsoft.

Microsoft is better served with Qualcomm NUVIA
I believe Microsoft is simply not interested because it is not worth the time or the effort. Hardware is so commoditized now that if you need to run mission-critical applications or PC games, you can purchase an extra machine or a gaming console just for that without breaking the bank. The needs are very niche. I have a Windows laptop with 32 GB of RAM and an M1 MacBook Pro; I use each for specific needs.
 
  • Like
Reactions: sam_dean

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
I believe Microsoft is simply not interested because it is not worth the time or the effort. Hardware is so commoditized now that if you need to run mission-critical applications or PC games, you can purchase an extra machine or a gaming console just for that without breaking the bank. The needs are very niche. I have a Windows laptop with 32 GB of RAM and an M1 MacBook Pro; I use each for specific needs.

I agree, I can't imagine Microsoft making much money from Windows 11 on Apple Silicon.

Many on this forum forget that their use case is a unicorn relative to the whole field.

They claim the majority of users use Terminal and the CLI... <20% of all Mac users do, at best.

They insist it isn't true because of emotional attachment.

It is like the PC MasterRace types who think RTX 40 GPUs are standard in all desktops and that gaming laptops are a waste of time.

They forget that they make up ~1% of all users, as ~80% of the worldwide userbase buys laptops that are <$999.
 
Last edited:
  • Haha
  • Like
Reactions: Scarrus and jdb8167

Mr. Dee

macrumors 603
Dec 4, 2003
5,990
12,840
Jamaica
I agree, I can't imagine Microsoft making much money from Windows 11 on Apple Silicon.

Many on this forum forget that their use case is a unicorn relative to the whole field.

They claim the majority of users use Terminal and the CLI... <20% of all Mac users do, at best.

They insist it isn't true because of emotional attachment.

It is like the PC MasterRace types who think RTX 40 GPUs are standard in all desktops and that gaming laptops are a waste of time.

They forget that they make up ~1% of all users, as ~80% of the worldwide userbase buys laptops that are <$999.
20 years ago, when you chose one platform, you were basically stuck with it for the rest of your life. It's not like that anymore; users can use a multitude of operating systems. Users are more operating-system agnostic these days, too. You might have a Samsung phone, but use an iPad for leisure consumption and a Windows laptop for productivity and desktop work.

Back in 2001, owning both a Windows PC and a Mac was rare, partly because of how expensive it was. My worry these days is that I probably have too much!

Mac users needed solutions like Virtual PC partly because of the stranglehold Windows had on the market and applications: LOB apps, Microsoft Office for Windows, ActiveX applications.

Several things cracked this:

- iPod - the first crack, I love Windows, but I want that
- Gmail
- Firefox
- iPhone
- Windows XP DDoS attacks and worms - some users just decided to give up rather than wait for SP2 and Longhorn to fix things

Part 2:

- Google Chrome
- Vista - the disaster
- Web 2.0: Gsuite, high speed networks
- Intel Mac transition - you could run Windows if you needed to (Boot Camp) or use Parallels
- Social Media - decline in MSN Messenger and less interest in Email. Just log in and stalk or see what's going on

Part 3:

- iPad - why do I need a Windows PC to browse the web, look at pictures, send basic email, watch videos? Very easy to use
- Death of Flash
- Even more powerful web apps and services: Spotify, Netflix, YouTube, Google Workspace
- No more being tied up in proprietary file formats
 
Last edited:
  • Like
Reactions: sam_dean

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
20 years ago, when you chose one platform, you were basically stuck with it for the rest of your life. It's not like that anymore; users can use a multitude of operating systems. Users are more operating-system agnostic these days, too. You might have a Samsung phone, but use an iPad for leisure consumption and a Windows laptop for productivity and desktop work.

Back in 2001, owning both a Windows PC and a Mac was rare, partly because of how expensive it was. My worry these days is that I probably have too much!
I think that has more to do with

- improving income
- lowering cost of devices relative to inflation
- work/study requirement

For the past 3 decades, I wish I had owned only 1 phone and 1 computer at any given time. Time to upgrade? Liquidate!

Heck, I wish I had stuck with a 2011 MBP 13" (32nm) and then moved to a 2021 MBP 16" (5nm)

32nm to 5nm! Woahhhhh!

Then 2021 MBP 16" 5nm to 2031 MBP 0.7nm (A7)
 
Last edited:

code-m

macrumors 68040
Apr 13, 2006
3,686
3,460
Many, including myself, did not know what we were looking at.

We were surprised that Apple kept the Intel Mac mini's 150W PSU until now, even when the power consumption of the 2020 Mac mini M1 was <28W. When the M1 logic board was displayed, we wondered why Apple kept the 1.39L enclosure volume.

I then concluded that it would cost more to redesign and switch to appropriate dimensions, that Intel Mac mini use cases are dependent on the current form factor, or that an M Pro chip would eventually come to the Mac mini, as it was missing from the Mac Studio.

Even now I see the same set of people thinking an Ultra in a MBP 16" will not work, even when presented with what Intel is doing. What Intel did sold, despite the madness and the double thickness.

From 2006-2020 we put up with Intel. When a surplus of Ultra chips is available, why not cater to a segment that is willing to put up with the madness, without the double thickness?

5nm and the future 3nm process node will reduce thermal output. A future Ultra chip will become cool enough to not need such a beefy heatsink/fan for silent-PC requirements.
It might very well be possible that with the transition to 3nm we may see:

M3-M3 Pro = MBA, iMac, Mac mini
M3 Max-Ultra = MBP, iMac Pro, Mac Studio

M3 Extreme = Mac Pro
 
  • Angry
  • Love
Reactions: sam_dean and Gudi

MrGunny94

macrumors 65816
Dec 3, 2016
1,148
675
Malaga, Spain
;-)

[Image: Picture-2-768x461.png]


As soon as Microsoft/Windows starts supporting it more and Qualcomm comes out with some laptops, we will definitely get some ARM competition going.

ARM on business laptops is gonna be such a huge win as long as the support is there, especially for folks out in the field.
 
  • Like
Reactions: sam_dean

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
As soon as Microsoft/Windows starts supporting it more and Qualcomm comes out with some laptops, we will definitely get some ARM competition going.

ARM on business laptops is gonna be such a huge win as long as the support is there, especially for folks out in the field.

It hurts x86 more than Apple, as those who demand Windows 11 will stick to the best hardware that can run it.

As one competitive advantage, Qualcomm could leverage their 5G/6G modems and undercut AMD/Intel on cost.

By 2040 I see x86 dropping to near 20% PC market share, because the R&D behind the ~1 billion Android ARM SoCs shipped annually worldwide (excluding any & all Apple devices) will be a drastic differentiator in that market.

By comparison, x86 PCs shipped this many units annually worldwide:

- 2021: 322.2 million units
- 2022: 263.7 million units

x86 will end up being the next mainframe in importance, where its main advantage is native legacy x86 hardware/software support.

What Microsoft and the Qualcomm/Android ARM SoC makers need to work on are future Windows 11 fat binaries that allow programs to run on both x86 & ARM, just like Apple did it.

Heck, Apple already provided the workflow for that transition.

Intel's researched upgrade cycle is every 5-6 years. As the bulk of upgraders bought a new PC between 2020-2022 for remote work/learning, the next opportunity for ARM PCs to sell to those users would be 2025-2028. Between now and then they will make only a minority of their money.

Personally, I wish I had done the following upgrades:

- 2011 MBP 13" 32nm > 2021 MBP 16" 5nm

Or, if I had accepted the hand-me-down 2013 MBA:

- 2011 MBP 13" 32nm > 2013 MBA 13" 22nm > 2023 MBA 15" 5nm

&

- 2012 iMac 27" 22nm > 2023 iMac 27" 5nm

Or, if I had accepted the half-off 2015 iMac and given my 2012 to my uncle:

- 2012 iMac 27" 22nm > 2015 iMac 27" 14nm > 2025 iMac 27" 2nm
 
Last edited:

257Loner

macrumors 6502
Dec 3, 2022
456
635
It's counterintuitive, unless you understand how levels of integration work.

What we are witnessing is the same thing that happened when the microprocessor took over the mainframe/supercomputers.

The perception was that the system that took a whole room and had lots of blinking lights had to be the more powerful one. However, what was happening was that the microprocessor guys were integrating the same functionality that took lots of separate boards on a mainframe down to a few chips.

There were some very specific use cases where the mainframe had the edge, but for 99% of the rest of the applications, we were ending up with systems on our desktops that were faster than a computer that took a whole room. Heck, you can now buy a GPU for $1k that is more powerful than the fastest supercomputer from 2000, which cost millions of dollars, took an entire floor of a datacenter, and used almost 1 megawatt.

The microprocessor vendors also had access to larger economies of scale, which meant they could spend more money on development of their designs/tech, so they were able to outpace the old large-system vendors, who had slower development cycles and smaller revenues.

The same thing is now happening with SoCs. They have larger levels of integration, so they can fit a whole PC into a single chip. Which means that things run faster, with less power and less cost. And they are leveraging the mobile/embedded markets, which are larger and growing faster than the traditional PC/datacenter stuff.

The SoC vendors are the ones with access to the larger economies of scale. So they are developing things faster.

Apple devices outshipped all Intel/AMD PCs combined. Apple only caters to the top ~20% of any market they enter. Apple leveraged iPhone & iPad SoC R&D to create >90% of Apple Silicon; the <10% of R&D for Mac-specific requirements is paid for by Mac revenue.

Which is why you end up with a mobile chip trading blows with a whole PC.

So you will see mobile SoCs getting more and more powerful at a faster rate than desktop microprocessors. And once they pass the inflection point, the desktop processor starts to actually lag in performance and can't catch up.

This has happened several times: Mainframes -> Minicomputers -> Microcomputers -> SoCs... and it's usually correlated with jumps in levels of integration.
Your analysis convinces me that SoCs are the next big thing in silicon.

My next question is: When will the other companies jump on the bandwagon?

When will Intel take iGPUs seriously?
When will AMD combine one of their CPUs and GPUs into an SoC?
When will Nvidia start making CPUs?
When will Qualcomm Snapdragons find their way into laptops and desktops?

But let me go ahead and say this: Apple beat them to the punch. They skated to the "puck's" next destination.

So, when will the competition catch up?
 

Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
Your analysis convinces me that SoCs are the next big thing in silicon.

My next question is: When will the other companies jump on the bandwagon?

When will Intel take iGPUs seriously?
When will AMD combine one of their CPUs and GPUs into an SoC?
When will Nvidia start making CPUs?
When will Qualcomm Snapdragons find their way into laptops and desktops?

But let me go ahead and say this: Apple beat them to the punch. They skated to the "puck's" next destination.

So, when will the competition catch up?

I'm not sure what you mean.
Intel and AMD have been taking iGPUs very seriously. So much so that handhelds like the Steam Deck rely on AMD Ryzen APUs, for example, for running AAA games.

They'll never perform like their dedicated-GPU cousins, but having AAA games running on such small systems would have been unthinkable 15 years ago.
 

257Loner

macrumors 6502
Dec 3, 2022
456
635
I'm not sure what you mean.
Intel and AMD have been taking iGPUs very seriously. So much so that handhelds like the Steam Deck rely on AMD Ryzen APUs, for example, for running AAA games.

They'll never perform like their dedicated-GPU cousins, but having AAA games running on such small systems would have been unthinkable 15 years ago.
If these other companies are so good at SoCs, then why aren't they competing with Qualcomm? I know Apple designs their own SoCs, but can't Intel, AMD, and Nvidia compete with Qualcomm for smartphone SoCs?
 