
Disappointed with Mac Pro 2023?


  • Total voters
    534

Longplays

Suspended
May 30, 2023
1,308
1,158
I think it's fairly obvious that the architecture is designed around 8/16/24GB capacities in the M2, then you double that and have 16/32/48GB in the M2 Pro, then double that again and have 32/64/96GB in the M2 Max, and then double that again to 64/128/192GB in the M2 Ultra.

My guess is that, if they wanted to support more than 192GB of unified memory, they would have needed to do something other than just doubling (or design the base M2 for more than 24GB), and that would have been very costly.
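For illustration, here's that doubling written out as a tiny sketch. The capacities and the doubling idea are taken from the post above; treat them as the poster's framing rather than verified Apple specs.

```python
# Illustrative only: the doubling scheme described in the post above, starting
# from the base M2 capacities. These are the post's numbers, not verified
# Apple specifications.
tiers = {"M2": [8, 16, 24]}  # GB options attributed to the base M2

prev = tiers["M2"]
for name in ("M2 Pro", "M2 Max", "M2 Ultra"):
    prev = [gb * 2 for gb in prev]  # each step doubles the previous tier
    tiers[name] = prev

for chip, options in tiers.items():
    print(f"{chip}: {'/'.join(map(str, options))} GB")
# M2: 8/16/24 GB
# M2 Pro: 16/32/48 GB
# M2 Max: 32/64/96 GB
# M2 Ultra: 64/128/192 GB
```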

Something that allows DIMMs would basically break the whole architecture... and consider another thing: a memory controller that allows DIMMs needs to be able to handle various specs of RAM, various timings, various multichannel modes, etc. Who knows how many mildly different grades of DDR4 might technically be used on, say, a typical Intel memory controller. These memory controllers have gotten more tolerant over time, too - 30 years ago, with the wrong spec or combination of RAM you didn't boot, whereas today, memory controllers will run a lot of less-than-optimal configurations in lower performance modes. Obviously, Intel has decades of experience designing memory controllers and they just accept that that's the requirement. Intel will even give you a single integrated memory controller that, depending on your motherboard vendor's preference, is able to do either DDR4 or DDR5, so who knows how many 'wasted' transistors there are on all those CPUs for the unused memory support. If you solder the RAM like Apple does, you can land on one spec of RAM chips, design the memory controller for that spec, buy only that spec from the RAM manufacturer, and you've removed all that complexity from the memory controller.

I am not an engineer, but one thing I thought they could have done is some kind of proprietary memory module, somewhat like some PowerBooks had in the early 1990s, or like what they do with the "SSDs" on the Mac Studio, where the SSD chips are basically an extension of an on-SoC storage controller. But they didn't do that...
These chips are not made in a vacuum.

Apple hired people to study today's use cases and the replacement cycles they expect going forward.

As such they prioritize what they're good at, what can be carried over, and what's new on the horizon.

If they don't, their competitors will get there first.
 

theluggage

macrumors G3
Jul 29, 2011
8,011
8,444
In Windowsland, other than home-built machines, the best example of this was the built-to-order machines that Dell and Gateway sold in the late 1990s. Intel motherboard with no onboard anything beyond IDE/parallel/serial; you got a discrete graphics card, a discrete sound card, a discrete modem, a discrete network card pre-installed in your PCI slots. Those machines look so weird today, coming from the factory with potentially 4-5 slots used.
That's a really good point. Back in the 90s/early 00s I was assembling my own PCs and also remember plugging in things like hard drive adapters (in the 80s, for any hard drive - later on if you wanted more than 2 hard drives, or ATA 66) and SCSI cards (some scanners used to come with SCSI cards, and you needed them for early writable CD drives) or FireWire cards (yes, on PCs, if you wanted to edit footage from your MiniDV camcorder). There were also more reasons for upgrading things like the GPU for prosumer purposes - e.g. I got a new GPU for my work PC because I needed it to run dual DVI displays, and there were hugely popular things like the Matrox Mystique/Rainbow Runner combo video capture/editing setup, which was about the first consumer-priced full-screen ("better than VHS quality") video editing system.

The point is that, back then, expansion slots in a PC were a must-have for a large proportion of users, including many consumers and most "prosumers"/hobbyists. Since then, a lot of that functionality has migrated first to the motherboard and increasingly - with Apple Silicon - onto the system-on-a-chip.

A lot of on-board interfaces are also "good enough" - e.g. old RS232/parallel, even USB 1, were never really adequate for external drives, and any sort of 3D gaming needed a better-than-stock GPU. Any full-screen video editing (even touching up your holiday vids) would likely need additional hardware and a better-than-base-level machine. By contrast, USB 3 came out in 2008 and still provides more bandwidth than is needed by any single mechanical hard drive, or even most bog-standard consumer SSDs. Only a minority of users need to pay the premium for even a Thunderbolt device - and while an SSD plugged into an x16 PCIe 4.0 slot sounds awesome, there's only so many 4K video streams a person can watch. The cheapest MacBook can run 3D games (maybe at reduced, but still playable, quality levels) and edit your iPhone-shot videos out-of-the-box.
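As a back-of-the-envelope check on the "good enough" claim, here's the rough arithmetic. The drive throughput figures are assumed ballpark values, not measurements.

```python
# Back-of-the-envelope arithmetic behind the "good enough" claim above.
# The drive figures are rough ballpark values I'm assuming, not measurements.
usb3_line_rate_gbps = 5.0                     # USB 3.0 signalling rate
usb3_payload_mb_s = usb3_line_rate_gbps * 1e9 * (8 / 10) / 8 / 1e6  # 8b/10b coding -> ~500 MB/s

hdd_sequential_mb_s = 200                     # typical 7200 rpm hard drive, sequential
print(f"USB 3 usable bandwidth: ~{usb3_payload_mb_s:.0f} MB/s")
print(f"Hard drive sequential : ~{hdd_sequential_mb_s} MB/s "
      f"(about {usb3_payload_mb_s / hdd_sequential_mb_s:.1f}x headroom on the link)")
# A bog-standard SATA consumer SSD (~500-550 MB/s) sits right at the link's
# limit, which is the sense in which USB 3 is "good enough" for it.
```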

I know people who use PCs professionally and always used to depend on a full-sized PCIe tower or two - but are now switching to mini PCs for everything but 3D gaming.

So it's not that people calling for better support for "pro" applications don't personally need the power they're asking for - it's that they're becoming a smaller and smaller fraction of the overall market - and with "consumer" electronics, economies of scale are everything. I'm not saying that Apple's $7,000 price for a $4,000 Mac Studio with PCIe slots and a grotesquely over-engineered case is justified, but they'll certainly be making them in tiny quantities compared to the Studio, which itself will sell in tiny quantities compared to the MacBooks.

In Macland, I would probably guess (not having been on the Mac side at the time) that the peak of this era was the B&W G3 and the G4s.

Probably, as in they were reasonably affordable Macs with PCI slots - but they were still a "pro" alternative to laptops and the iMacs - I don't think there was ever a time when the low/mid-range Mac relied heavily on expansion slots the way mainstream PCs did. Mostly the lower-end Macs were either all-in-ones or "pizza boxes" - "classic" Macs sitting on top of SCSI drive enclosures were a common sight - and (I don't know about the very first Mac) they usually had on-board SCSI and LocalTalk networking with a single slot for an Ethernet card. It was the higher-end Macs that had first NuBus and then PCI slots - and for the first half of Mac history "workstation" probably meant a Sun or SGI Unix box, and high-end Macs were closer to those than to 80s-to-early-90s PCs.

And the last gasp of that type of machine was the 2010 Mac Pro. A little too expensive and workstationy, but it was still a modular half-affordable desktop in a way that nothing that followed was.

The 2006 was pretty keenly priced for a Xeon/ECC workstation (RAM was expensive - but not so much from Apple gouging, more because it used FB-DIMM modules that cost a packet even from third parties).

It may come as a shock to many, but shipments of desktop dGPUs have been on a downward slope since 2005.
Not really a shock - again, relatively basic GPUs have been becoming "good enough" for all but specialist purposes (it's nice to game in 4k, but you can have a lot of fun at 720p) and higher-end GPUs have been becoming more and more about GPU-based computing and less about displaying big spreadsheets.

Mac desktops, on the other hand, are underwhelming. Apple doesn't have a satisfying answer to the basic question: If you already have a Mac laptop, why should you buy a Mac desktop?

Unless you have no need for mobility or need the power of the M2 Ultra, the straight answer is you probably shouldn't - and I think that's what they're working toward. I think part of the reason for the demise of the 27" iMac is that a MacBook Pro can now offer the same computing power. MBP owners don't need an iMac for extra performance, but they might want a Studio Display when they're using their MBP on a desk - it's clearly designed as the ultimate MacBook Pro docking station - a lot of money has gone on cramming in a power supply big enough to power an MBP, which is totally wasted in a desktop setup.

Question is, what else can Apple do in the "pro machines for real pros" sector? AMD Threadrippers are offering silly numbers of cores and insane PCIe bandwidth. Even back in 2019, if you looked beyond the Xeon-W to the "scalable" multi-CPU capable versions, there were more powerful systems with more cores, more PCIe lanes and higher memory capacity. Many people calling for better Mac Pros are also calling for NVIDIA GPUs, and Apple seem to have burned their bridges on that one.

Apple could keep using Intel in the Mac Pro, or switch to AMD, and keep using AMD GPUs - in which case they'll have a system which is exactly as powerful as cheaper, larger-volume generic PCs using the same chipsets - or they could funnel a huge amount of cash into making an ARM-based Threadripper-killer just for their smallest-selling machine. (No, an M2 Extreme or an M3 Ultra isn't going to support 1TB+ of RAM or 128 PCIe lanes, let alone beat that - they'd need a new die - plus, going for external GPUs and RAM throws away some of the power-consumption-vs-performance advantages of Apple Silicon.)
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
It may come as a shock to many, but shipments of desktop dGPUs have been on a downward slope since as early as 2005. To me this indicates that "perfect" 4090-like performance is not selling as well as GPU cores in an SoC.
The more I think about it, the more sense that makes.

In Windowsland, prior to the i810 in around 2000, you didn't really have integrated graphics. You either got a lousy el-cheapo chip soldered onto your motherboard, which is what most of the lower-end brand-name clones did, or you got a discrete GPU in a PCI/AGP/etc slot, which is what Dell, Gateway, or the computer shop down the street building you a system did.

And... I am trying to remember this, but I think, for example, that the first integrated graphics chipset for the Pentium 4 shipped... at least a year or two... after the Pentium 4 was introduced. So there was certainly a time when integrated graphics existed but you couldn't get integrated graphics on the cutting edge processor. Similarly, integrated graphics boards tended to have less RAM capacity - if you wanted more RAM, you needed the chipset/board without integrated graphics.

By 2005, I would have said that integrated graphics were well-established in basically all entry-level consumer machines and all non-workstationy business machines. So that leaves only gaming and graphics-intensive work applications requiring discrete GPUs, so I would expect a big drop in GPU volumes. That being said, you could still get an optional soldered-on GPU on business laptops.

And you see this in the marketplace - there are far fewer low-end PCI-E graphics cards available for sale down at the computer store. Giant monsters with two-slot coolers and add-in power connectors, sure, but a plain vanilla passively cooled single-slot card is... rare... and most of the models you might find are based on more dated GPU architectures. There would have been a time, maybe in the mid-2000s, where you needed a discrete graphics card to run dual monitors, so that would have created some lower-end need for discrete cards in a business environment, but Intel added dual monitor support to their integrated graphics a looooooong time ago. It wouldn't surprise me if half the low-end cards in the computer store were for enthusiasts building HTPCs who just want a discrete GPU because integrated graphics have a very bad reputation among 30/40-something-year-old enthusiasts.

The other thing, of course, is that GPU lifecycles/prices/etc have gone up. Plenty of gamers upgraded their GPUs on a more frequent cycle in the 2000s, then kept them longer in the 2010s/2020s, so that would also lower volumes.

I'm sure the graph for sound cards would be even worse. In 1995, almost all Windows machines with sound capability had a discrete sound card, by 2000, only higher-end Windows machines had a discrete sound card, and then the redesigned Vista audio architecture in 2007 basically decimated discrete sound cards. Fundamentally, cheap mediocrity tends to dominate all, given enough time...
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
These chips are not made in a vacuum.

Apple hired people to study today's use cases and the replacement cycles they expect going forward.

As such they prioritize what they're good at, what can be carried over, and what's new on the horizon.

If they don't, their competitors will get there first.
Sure, but what end of the puzzle did those people start on?

I think it's more likely that they started on the M2 end, landed on 8/16/24GB as the appropriate targets for the M2, and then the rest of the lineup doubles itself.

Or do you think they started by saying "oh the M2 Ultra will need 192GB" and then divided each chip by 2 until they landed with 8/16/24 on the base M2?

Just look at the sales volumes - it makes more sense to design the high-volume chips to reflect that market as closely as possible, and if that means that doubling and doubling and doubling doesn't lead to the ideal Ultra chip for a small number of people, so be it.

And ultimately, I'm sure they do not care that their competitors will win by default in the >192GB of RAM, discrete-NVIDIA/AMD GPU-requiring workstation market. Too expensive to adjust their overall product plans to play in that sphere.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
The more I think about it, the more sense that makes.

In Windowsland, prior to the i810 in around 2000, you didn't really have integrated graphics. You either got a lousy el-cheapo chip soldered onto your motherboard, which is what most of the lower-end brand-name clones did, or you got a discrete GPU in a PCI/AGP/etc slot, which is what Dell, Gateway, or the computer shop down the street building you a system did.

And... I am trying to remember this, but I think, for example, that the first integrated graphics chipset for the Pentium 4 shipped... at least a year or two... after the Pentium 4 was introduced. So there was certainly a time when integrated graphics existed but you couldn't get integrated graphics on the cutting edge processor. Similarly, integrated graphics boards tended to have less RAM capacity - if you wanted more RAM, you needed the chipset/board without integrated graphics.

By 2005, I would have said that integrated graphics were well-established in basically all entry-level consumer machines and all non-workstationy business machines. So that leaves only gaming and graphics-intensive work applications requiring discrete GPUs, so I would expect a big drop in GPU volumes. That being said, you could still get an optional soldered-on GPU on business laptops.

And you see this in the marketplace - there are far fewer low-end PCI-E graphics cards available for sale down at the computer store. Giant monsters with two-slot coolers and add-in power connectors, sure, but a plain vanilla passively cooled single-slot card is... rare... and most of the models you might find are based on more dated GPU architectures. There would have been a time, maybe in the mid-2000s, where you needed a discrete graphics card to run dual monitors, so that would have created some lower-end need for discrete cards in a business environment, but Intel added dual monitor support to their integrated graphics a looooooong time ago. It wouldn't surprise me if half the low-end cards in the computer store were for enthusiasts building HTPCs who just want a discrete GPU because integrated graphics have a very bad reputation among 30/40-something-year-old enthusiasts.

The other thing, of course, is that GPU lifecycles/prices/etc have gone up. Plenty of gamers upgraded their GPUs on a more frequent cycle in the 2000s, then kept them longer in the 2010s/2020s, so that would also lower volumes.

I'm sure the graph for sound cards would be even worse. In 1995, almost all Windows machines with sound capability had a discrete sound card, by 2000, only higher-end Windows machines had a discrete sound card, and then the redesigned Vista audio architecture in 2007 basically decimated discrete sound cards. Fundamentally, cheap mediocrity tends to dominate all, given enough time...
But it does not negate that "good enough" is preferred by most at this point in time.

Which brings us to the 2023 Mac Pro. For 80% of use cases it satisfies them without going beyond the limits of this generation of Apple silicon.

By Q1 2025, an M3 Ultra/Extreme will address some of the complaints about the 2023 model.
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
The point is that, back then, expansion slots in a PC were a must-have for a large proportion of users, including many consumers and most "prosumers"/hobbyists. Since then, a lot of that functionality has migrated first to the motherboard and increasingly - with Apple Silicon - onto the system-on-a-chip.

A lot of on-board interfaces are also "good enough" - e.g. old RS232/parallel, even USB 1, were never really adequate for external drives, and any sort of 3D gaming needed a better-than-stock GPU. Any full-screen video editing (even touching up your holiday vids) would likely need additional hardware and a better-than-base-level machine. By contrast, USB 3 came out in 2008 and still provides more bandwidth than is needed by any single mechanical hard drive, or even most bog-standard consumer SSDs. Only a minority of users need to pay the premium for even a Thunderbolt device - and while an SSD plugged into an x16 PCIe 4.0 slot sounds awesome, there's only so many 4K video streams a person can watch. The cheapest MacBook can run 3D games (maybe at reduced, but still playable, quality levels) and edit your iPhone-shot videos out-of-the-box.
I would add one more factor to this - the rise of Ethernet (note: I am including wifi as part of Ethernet, they're fundamentally the same thing over different physical media) and networked peripherals. 25 years ago, if you needed a home printer, say, it plugged into your computer directly - serial on Mac, parallel on DOS/Windows, barely starting to move to USB. Today, the overwhelming likelihood is that that printer will be connected via a network. Same with scanners.

Similarly, you have lots of affordable NAS boxes that didn't exist 25 years ago... and that would have made no sense 25 years ago when the network speeds were too low. 25 years ago "network storage" at home or small business meant a random computer very slowly sharing one of its drives, if you even had a home network.

The smartphone/tablet world has, if anything, hastened that trend because Ethernet peripherals play much nicer with those devices than directly-attached versions.

Also, the storage world has coalesced towards flash drives and hard drives, both of which connect via USB. Every form of removable media - DVD-R, BD-R, whatever - is now worse on a cost per gigabyte/terabyte basis than hard drives. Laughably so in the case of the last 100GB BD standard. As much as this dinosaur has trouble accepting it, it costs less money to store a gigabyte of data online on my NAS with 2-drive redundancy than it does to burn that data to a disc. Substantially so. So there goes another type of expansion - you're not going to be adding any drives/controllers for random exciting removable media anymore...
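For the curious, here's a rough cost-per-gigabyte comparison along those lines, with every price an assumption picked purely for illustration rather than a quoted figure.

```python
# Rough cost-per-gigabyte comparison behind the point above. All prices are
# assumptions I picked for illustration (street prices vary a lot), not quotes.
hdd_price_usd = 160.0      # assumed price of one 8 TB NAS hard drive
hdd_capacity_gb = 8000.0
mirror_factor = 2          # 2-drive redundancy, as described in the post

bdxl_price_usd = 10.0      # assumed price of one 100 GB BD-R XL disc
bdxl_capacity_gb = 100.0

nas_cost_per_gb = hdd_price_usd * mirror_factor / hdd_capacity_gb
bd_cost_per_gb = bdxl_price_usd / bdxl_capacity_gb

print(f"Mirrored NAS storage: ${nas_cost_per_gb:.3f}/GB")
print(f"BD-R XL disc        : ${bd_cost_per_gb:.3f}/GB")
print(f"Optical is ~{bd_cost_per_gb / nas_cost_per_gb:.1f}x the cost per GB under these assumptions")
```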
 

Longplays

Suspended
May 30, 2023
1,308
1,158
Sure, but what end of the puzzle did those people start on?

I think it's more likely that they started on the M2 end, landed on 8/16/24GB as the appropriate targets for the M2, and then the rest of the lineup doubles itself.

Or do you think they started by saying "oh the M2 Ultra will need 192GB" and then divided each chip by 2 until they landed with 8/16/24 on the base M2?

Just look at the sales volumes - it makes more sense to design the high-volume chips to reflect that market as closely as possible, and if that means that doubling and doubling and doubling doesn't lead to the ideal Ultra chip for a small number of people, so be it.

And ultimately, I'm sure they do not care that their competitors will win by default in the >192GB of RAM, discrete-NVIDIA/AMD GPU-requiring workstation market. Too expensive to adjust their overall product plans to play in that sphere.
macOS's System Report is passively/actively sent to Apple.

They see what third-party app devs are doing and what their roadmaps look like, say, a decade into the future, refreshed quarterly or annually.

They pay IDC and NPD for their 3rd party market research.

After half a century of past data points they can project where they're heading over the next decade.

Hence Intel, AMD and Nvidia focusing their R&D on AI chips instead of desktop dGPUs.

At this point in time, what is the market for >192GB of RAM and better-than-M2-Ultra dGPUs? Is it large enough for a $3 trillion company to care about?

The R&D resources needed to beat an i9 & 4090 are better spent on the Vision Pro, where the real growth lies.

Back in 2012 I ordered a BTO'd top-end iMac 27" 22nm. Upgraded the 8GB default to 32GB RAM myself.

If a 2023 iMac 27" 5nm appeared within 4 months I'd buy the base model. As even a 3nm iPhone chip would outperform any 2012 Intel or Nvidia chip.

If Apple allowed for macOS on iPhone when connected to an external Thunderbolt 5 80Gb/s display your late aunt would probably using that. Her concern about 1-2hr/week use would be covered by that iPhone.

The 14 Pro Max has a 2.8K Retina display. That's way better than any display sold when the 2006 C2Q 65nm was sold.
 
Last edited:

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
But it does not negate that "good enough" is preferred by most at this point in time.
"Good enough" is preferred by most at most points in time, really. And it's the reason there's been no innovation in printers for 20 years, little innovation in monitors in 10+ years except on the Apple side, etc.

It's all about the economies of scale. If 'good enough' costs $400 and 'better' costs $600, way more people will buy the 'good enough' for $400. So, next generation of 'good enough' costs $300 and next generation of 'better' still costs $600. Repeat this a few generations and the 'good enough' is $75, the 'better' is still $600, and no one is willing to pay $525 more for the better one so it exits the market. Or maybe it becomes more specialized and sticks around at $1200 for some really niche uses.

(See, e.g., SCSI hard drives in the mid-late 1990s. Lots of Mac fans and high-end Windows folks swore about how amazing SCSI drives were, etc. Two years later there were no SCSI drive options left in any Power Mac or non-workstation Intel Windows box - whatever benefits SCSI might still have had, UltraATA 33 or 66 was good enough that not enough people wanted to spend the $1000+ premium for a SCSI controller and drive)

And if a new person has an idea for a better product in that segment, but they would need to price their first generation at $800, well, they don't bother, because they know fully well that their $800 better product will never attract enough people away from the $75 product.
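Here's a toy sketch of that price-gap dynamic, with an invented 30% per-generation decline for the volume product just to show the shape of the curve; none of the numbers are market data.

```python
# A toy model of the price-gap dynamic described above: the volume product
# rides cost declines while the niche product's price stays flat. The 30%
# per-generation decline is an invented illustration, not market data.
good_enough, better = 400.0, 600.0
decline_per_gen = 0.30

for gen in range(1, 7):
    good_enough *= (1 - decline_per_gen)
    print(f"gen {gen}: good-enough ~${good_enough:>3.0f} vs better ${better:.0f} "
          f"(premium ${better - good_enough:.0f})")
# After a handful of generations the premium approaches the full price of the
# "better" product, and most buyers stop paying it.
```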
 

Longplays

Suspended
May 30, 2023
1,308
1,158
"Good enough" is preferred by most at most points in time, really. And it's the reason there's been no innovation in printers for 20 years, little innovation in monitors in 10+ years except on the Apple side, etc.

It's all about the economies of scale. If 'good enough' costs $400 and 'better' costs $600, way more people will buy the 'good enough' for $400. So, next generation of 'good enough' costs $300 and next generation of 'better' still costs $600. Repeat this a few generations and the 'good enough' is $75, the 'better' is still $600, and no one is willing to pay $525 more for the better one so it exits the market. Or maybe it becomes more specialized and sticks around at $1200 for some really niche uses.

(See, e.g., SCSI hard drives in the mid-late 1990s. Lots of Mac fans and high-end Windows folks swore about how amazing SCSI drives were, etc. Two years later there were no SCSI drive options left in any Power Mac or non-workstation Intel Windows box - whatever benefits SCSI might still have had, UltraATA 33 or 66 was good enough that not enough people wanted to spend the $1000+ premium for a SCSI controller and drive)

And if a new person has an idea for a better product in that segment, but they would need to price their first generation at $800, well, they don't bother, because they know fully well that their $800 better product will never attract enough people away from the $75 product.
That's why I am grateful that Windows' EOL comes 122 months after release and macOS's final security update comes 8-10 years in.

After a decade everything's "future tech" at the same price point, or the inflation-adjusted price point.

Take the 2023 Mac Pro so many complained about. Echo chambers make it appear to be the end of the world, but the data shows that those complaining represent at most 1 in 100, or even 1 in 1,000, of all actual users within the past two decades.

Just like how Reddit and MR proclaimed the iPhone mini the best iPhone ever and every owner loved it, but outside of those places it sold badly. Hence no 14 or 15 mini.
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
It's not good news, because it's possible to buy desktop computers that are (in some ways) better than Mac laptops with a similar price. You just can't run macOS on them.

I currently have three computers on my desks in addition to the laptop I have from work: A 2020 iMac with a second monitor and 128 GB RAM for work and casual use; an old gaming PC last upgraded a couple of years ago; and a small NAS with SSDs. I'm not sure what I'm going to replace them with, because Apple's current offerings are not that good. I want something with macOS and enough monitors for casual use. I want a cost-effective computer with as much RAM as reasonably possible for work. I want a Windows PC for gaming. And I'll probably replace the NAS with something running Ubuntu or macOS.
I think your desk is not that different from mine... (well, my NAS doesn't have SSDs)

Fundamentally, this is why I bought the 2020 refurbished iMac when I did. Wanted something with a huge amount of RAM to keep web browsers happy (okay, 128GB is a bit of a stretch for that) and a big screen running macOS. Bought a lower-end configuration than I would have bought had I been buying it in 2017, I might add, but it cost barely more than a new Studio Display.

I think we should just be happy with our Intel iMacs until macOS drops support for them, then we'll see what the lineup looks like. Apple has actually been making a number of pragmatic product decisions recently (the iPhone 14 Plus, the 15" MacBook Air, etc) so I think there's a reasonable chance that by the time new versions of macOS won't support our 2020 Intel iMacs, there will be a passable Apple silicon alternative. Probably not a perfect alternative and we'll be grumpy about the price, but I think there'll be something.
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
Take the 2023 Mac Pro so many complained about. Echo chambers make it appear to be the end of the world, but the data shows that those complaining represent at most 1 in 100, or even 1 in 1,000, of all actual users within the past two decades.

Just like how Reddit and MR proclaimed the iPhone mini the best iPhone ever and every owner loved it, but outside of those places it sold badly. Hence no 14 or 15 mini.
Yes, and that's actually a very good example. My guess is that the iPhone mini fans are old dinosaurs like us with their iMacs at home, etc, who actually have... fairly low demands... for a smartphone. "A phone should be a phone and a computer should have drive bays, expansion slots and a separate monitor!"

Meanwhile, the big screen smartphones sell to the people I was describing, the people who have an old laptop on a shelf somewhere they barely look at each month and who have no interest in spending $1800 on a new MacBook Air, the people who actually need those 30-100GB/month cellular data plans from your friendly neighbourhood carrier, etc. The people for whom the smartphone is the primary non-work computing device.

And the latter group clearly outnumbers the former...
 

Longplays

Suspended
May 30, 2023
1,308
1,158
Yes, and that's actually a very good example. My guess is that the iPhone mini fans are old dinosaurs like us with their iMacs at home, etc, who actually have... fairly low demands... for a smartphone. "A phone should be a phone and a computer should have drive bays, expansion slots and a separate monitor!"

Meanwhile, the big screen smartphones sell to the people I was describing, the people who have an old laptop on a shelf somewhere they barely look at each month and who have no interest in spending $1800 on a new MacBook Air, the people who actually need those 30-100GB/month cellular data plans from your friendly neighbourhood carrier, etc. The people for whom the smartphone is the primary non-work computing device.

And the latter group clearly outnumbers the former...
Steve Jobs famously said of phablets that "you can't get your hand around it" and that "no one's going to buy that."

Today the most popular phones have similar or the same screen sizes as the Pro Max, even though it starts at $1,099.

The least popular would be the iPhone mini.

Even in the Android space, which ships over 1 billion smartphones annually, small phones do not sell.

It reminds me of the digital camera market, where the iPhone & Android destroyed the consumer side.

[Chart: digital camera shipments by year]


Point & shoots are basically dead, so let us focus on the SLRs & mirrorless below, which are bought mostly by professionals & disposable-income hobbyists.

[Chart: SLR & mirrorless camera shipments by year]


Adjusted for world population, the number of digital cameras shipped per person in 2021-2022 is back to 1999 levels.

Consumer cameras were absorbed almost entirely by iPhone (20%) & Android (80%).
 
Last edited:

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
Steve Jobs famously said of phablets that "you can't get your hand around it" and that "no one's going to buy that."

Today the most popular phone is the Pro Max, even though it starts at $1,099.

The least popular would be the iPhone mini.

Even in the Android space, which ships over 1 billion smartphones annually, small phones do not sell.

It reminds me of the digital camera market, where the iPhone & Android destroyed the consumer side.

Point & shoots are basically dead, so let us focus on the SLRs & mirrorless below, which are mostly professionals.
Yup - this is something that Steve Jobs was dead wrong about. Or... maybe he was right at the time he said it, maybe small phones had to come out first for people to then want big phones. Maybe they would have been scared if they had seen big phones first.

I would actually expect big phones to be even more popular in Androidland than iPhoneland, simply because I'm sure there are a LOT of people in Androidland for whom the phone is the primary computing device.

And it's funny, my parents were early converts to the big phone trend, getting 6 Pluses. I was convinced smaller was better until... I ended up with my parents' 6 Pluses to sell for them, and then I realized bigger made sense and replaced my X with an 11 Pro Max. I think big phones are just one of those things that don't come naturally to old-fashioned computer guys who grew up in the 1990s...

Oh, digital cameras. Another interesting example - you bought 2-3 digital cameras in the early-mid 2000s because, well, the 2004 model was dramatically better than the 2000 model. Then... innovation slowed down, then smartphones got Good Enough and ate the consumer digital camera market. I still have the last digital camera I bought in a drawer - I don't think there's a big improvement in today's models. Doesn't help that I have never seen a good implementation of a wifi-capable digital camera - Eye-Fi cards were good, but those died, and the built-in wifi on the cameras I've seen was dreadfully clunky to transfer one or two photos to a computer. Just dreadful if, say, you wanted to take a photo of something to list it for sale or send to a friend or something.

I don't know anything about pro-market cameras but I would presume innovation has slowed down there too, so professionals are not replacing their cameras as frequently as they would have been in the 2000s. But... one thing I noticed that shocked me - it seems that vloggers and YouTube folks, etc are actually using these kinds of cameras for video rather than still images, so maybe Canon/Nikon/Sony have invented a new market there... at the expense of destroying the traditional camcorder (which has been destroyed, hasn't it?).
 

sunny5

macrumors 68000
Original poster
Jun 11, 2021
1,837
1,706
Like I said, the Pro market is a lot larger than just the people using a 2019 Mac Pro who have no clear upgrade route (which is likely a very small number). Likewise, I too can argue that the Pro Mac market isn't representative of Apple's total user base either.

I would also like to direct you to this pretty good explainer on the grand theory of Apple, which I find makes a pretty good argument for AR glasses, while also dropping desktop Macs. It's pretty prescient, considering it was released way before Apple Silicon.


Apple's whole selling point with their custom silicon was always about power efficiency without sacrificing performance, and I will say they have more than met that goal. For example, their 14" and 16" MBPs allow for sustained performance over a long period of time even when not plugged in to an external power source, something you don't see in Windows laptops (a lot of laptops with powerful graphics cards tend to throttle quickly). The Mac Studio easily takes up a fraction of the space of an equivalent Windows desktop. The M2 Pro Mac Mini is also a fairly powerful computer at a reasonably affordable price. I will say that everything from the M1 MBA to the M2 Mac Studio pretty much covers the computing needs of over 95% of Apple's Mac user base (the Mac Pro is likely a fraction of a percentage here).

I briefly skimmed through the thread and I am not sure why you would expect Apple to do any of what you are asking for. You want some sort of custom M2 Extreme chip which would only be shipped in the Mac Pro, inside a brand new Mac Pro enclosure, and I am sure you will all still want it to be modular in some way (ie: able to add RAM and storage without having to pay Apple's premium), yet not be too expensive, and supported for at least 10 years. All for a fairly low-volume product that may never earn Apple back whatever resources were pumped into it.

And honestly speaking, were I running Apple, I would never even have released the 2019 Mac Pro.
Such poor logic and excuses. I wouldn't waste my time on that.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
Yup - this is something that Steve Jobs was dead wrong about. Or... maybe he was right at the time he said it, maybe small phones had to come out first for people to then want big phones. Maybe they would have been scared if they had seen big phones first.

I would actually expect big phones to be even more popular in Androidland than iPhoneland, simply because I'm sure there are a LOT of people in Androidland for whom the phone is the primary computing device.

And it's funny, my parents were early converts to the big phone trend, getting 6 Pluses. I was convinced smaller was better until... I ended up with my parents' 6 Pluses to sell for them, and then I realized bigger made sense and replaced my X with an 11 Pro Max. I think big phones are just one of those things that don't come naturally to old-fashioned computer guys who grew up in the 1990s...
Big phones have bigger batteries. If you charge your smartphone less frequently than before, the useful life of the original battery doubles or even triples.

Say you have to charge a 2011 iPhone 4S daily, which means a battery replacement by year 2-3. A 2023 iPhone 15 Pro Max has you charging every half week, so you effectively have a phone that does not need a new battery until EOL in 7-8 years' time.

Small phones cannot do that. Not to mention the screens are too small for the Internet.
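A rough back-of-the-envelope version of that battery arithmetic, assuming a ~500 full-cycle budget and an average depth of discharge of about 70% per charge (both round-number assumptions, not manufacturer figures):

```python
# Rough arithmetic behind the charging-frequency point above. Both inputs are
# assumptions: a ~500 full-cycle budget before noticeable capacity loss, and
# an average depth of discharge of ~70% per charge.
cycle_budget = 500
depth_of_discharge = 0.7   # a routine top-up is usually less than a full cycle

def years_until_battery_service(charges_per_week: float) -> float:
    full_cycles_per_year = charges_per_week * 52 * depth_of_discharge
    return cycle_budget / full_cycles_per_year

print(f"Charging daily           : ~{years_until_battery_service(7):.1f} years")
print(f"Charging every half week : ~{years_until_battery_service(2):.1f} years")
```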
I don't know anything about pro-market cameras but I would presume innovation has slowed down there too, so professionals are not replacing their cameras as frequently as they would have been in the 2000s. But... one thing I noticed that shocked me - it seems that vloggers and YouTube folks, etc are actually using these kinds of cameras for video rather than still images, so maybe Canon/Nikon/Sony have invented a new market there... at the expense of destroying the traditional camcorder (which has been destroyed, hasn't it?).
The replacement cycle of pro cameras has stayed relatively the same: every half decade, or whenever a major sporting event like the Summer Games occurs.

This is when bulk orders from photo news agencies like AP, Reuters, EPA, etc come about.

These are the companies that used to buy Mac Pros, and more recently likely Mac Studios. They're the ones issuing MBPs and MBAs to their photojournalists.

Since the slowdown, the camera makers have redirected their efforts to cater to the growth market of vloggers, YouTubers, etc.
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
Still, they don't use an SoC as the mainstream approach, just like the others. Do they even put an external GPU on it? Not at all.
Almost all Intel mainstream processors have had on-processor graphics for about a decade...

There are weird SKUs with no graphics, and I don't think the Xeons have graphics, but everything else, including your gamer-special i7-xx700K, does.
 
  • Haha
Reactions: sunny5

Abazigal

Contributor
Jul 18, 2011
20,392
23,890
Singapore
Yes, and that's actually a very good example. My guess is that the iPhone mini fans are old dinosaurs like us with their iMacs at home, etc, who actually have... fairly low demands... for a smartphone. "A phone should be a phone and a computer should have drive bays, expansion slots and a separate monitor!"

Meanwhile, the big screen smartphones sell to the people I was describing, the people who have an old laptop on a shelf somewhere they barely look at each month and who have no interest in spending $1800 on a new MacBook Air, the people who actually need those 30-100GB/month cellular data plans from your friendly neighbourhood carrier, etc. The people for whom the smartphone is the primary non-work computing device.

And the latter group clearly outnumbers the former...
Christian Selig, the dev of the Apollo reddit app, shared that a fair number of people using his app are iPhone mini owners, suggesting that the iPhone mini is still somewhat popular amongst the more tech savvy users.

In contrast, his other app, Pixel Pals, sees way fewer iPhone mini users, but more iPhone Plus users.

This suggests that iPhone mini owners tend to be more tech savvy and well-off, because they usually can afford multiple computing devices (eg: a smaller smartphone display is less of an issue when you also have an iPad). Conversely, larger iPhones tend to be more popular amongst people for whom it is their main computing device.

That's probably why we see quite a number of people clamouring for a smaller iPhone here on this forum, yet sales did not seem to live up to the hype, because the people here tend to be more tech savvy and well-off financially, and are not necessarily representative of Apple's overall user base.
 

sunny5

macrumors 68000
Original poster
Jun 11, 2021
1,837
1,706
Almost all Intel mainstream processors have had on-processor graphics for about a decade...

There are weird SKUs with no graphics, and I don't think the Xeons have graphics, but everything else, including your gamer-special i7-xx700K does.
And do they even use the iGPU as the main GPU or to replace an external GPU? Seriously?
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
Big phones have bigger batteries. If you charge your smartphone less frequently than before, the useful life of the original battery doubles or even triples.

Say you have to charge a 2011 iPhone 4S daily, which means a battery replacement by year 2-3. A 2023 iPhone 15 Pro Max has you charging every half week, so you effectively have a phone that does not need a new battery until EOL in 7-8 years' time.

Small phones cannot do that. Not to mention the screens are too small for the Internet.
I agree, except I think 'too small for the Internet' is probably the driving factor more so than the battery life.

To the surprise of some industry participants (e.g. BlackBerry), people who use their smartphone intensively for the Internet have been quite happy to put up with rather disappointing battery life.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
To the surprise of some industry participants (e.g. BlackBerry), people who use their smartphone intensively for the Internet have been quite happy to put up with rather disappointing battery life.
Business users who need email.

iPhone did the full desktop web and not 2G WAP... not the song.
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
And do they even use the iGPU as the main GPU or to replace an external GPU? Seriously?
Hard to tell - obviously, most of those processors that land in gaming/enthusiast machines will have discrete GPUs and the iGPU is possibly never ever used.

But if you looked at, say, all i7 desktops - who knows how many of those don't get a discrete GPU? I presume it must be a significant number, otherwise Intel would have removed the on-processor GPU from the higher-spec processors...

I thought it might have been different in AMD land, but it's not - a Ryzen 7900X3D, for example, also has on-CPU graphics.

I will say, having had to do some GPU troubleshooting on a vintage C2Q machine a little while ago, having on-CPU graphics available is not a bad thing, if only for troubleshooting and emergencies.
 
  • Haha
Reactions: sunny5

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
This suggests that iPhone mini owners tend to be more tech savvy and well-off, because they usually can afford multiple computing devices (eg: a smaller smartphone display is less of an issue when you also have an iPad). Conversely, larger iPhones tend to be more popular amongst people for whom it is their main computing device.

That's probably why we see quite a number of people clamouring for a smaller iPhone here on this forum, yet sales did not seem to live up to the hype, because the people here tend to be more tech savvy and well-off financially, and are not necessarily representative of Apple's overall user base.
I am a bit hesitant to tie this too much to income, because I'm sure there are plenty of high-earning late-20-somethings who haven't seriously touched a personal laptop since graduating from school...

... but otherwise, I think that's right. Older people with entrenched workflows and attitudes are less likely to adopt the new device type for as many tasks. Hell, I still mostly use a land line for most of my phone calls... which is a totally early-2000s attitude. And just like I have difficulty imagining a cell-phone-only world, I have difficulty imagining a laptop-only world, when in reality, I don't actually 'need' my iMac...
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
I feel like we are going around in circles at this point, but in a way, I miss the days of the 3-4 year replacement cycle because those replacement cycles were driven by actual need and cool software.

You replaced your 4 year old hardware, got the new version of software X, and not only did it run faster than the old version on your old system, but it had useful innovative features you actually used. And you also got some additional badly needed storage space, because, well, you had filled up your hard drive in less than 4 years.

Today, the biggest performance-related reason to replace a system is that it doesn't have enough RAM to feed all the modern Electron monstrosities! Or enough CPU to run whatever crazy JavaScript is happening in a browser. Or maybe gaming, but I am not a heavy gamer...

Now, you replace your 7 year old hardware because the battery is swelling and you can't get a replacement or because your OS vendor decided that you can't officially run a reasonable current OS on it.

Sure, I'm getting at least 3 more years out of that hardware, but... where's the excitement unboxing something whose most perceptible improvement is a new set of security patches? The excitement seems to have transferred to unboxing smartphones, and even then, there is less excitement unboxing a 14 Pro Max than unboxing, oh, a 5 or a X...
I think SSDs were TOO good when they were introduced; they brought a lot of life back to even a 2010 computer today. But Intel also stalled out in the early 2010s with their processors. I did not experience as big of a jump as I thought I would going from a mid-range 2010 Mac Pro to the 2019 i9 iMac. It barely moved the needle in saving me time exporting my videos. And that was a 9-year upgrade. I DID however gain the ability to export in HEVC, which was a plus, but the performance was not very much better.

Contrast that with even the base M1 Mac mini at launch - that thing just beat the socks off my 2019 i9 iMac. It was ridiculous how the Mini beat my $5,000 purchase.

I do think people hold on to their computers a bit too long. Why upgrade from a 2012 computer? Why upgrade from 2011 Office? Security is a big part of it. The other part of this is of course support. What will these people do if their 2012 Mac suddenly dies or needs support, or their 2011 Office hits a bug, or Office 365 introduces a security feature that requires a more modern Office version? (I recall it was even a mess to get Office 365 accounts added to the 2010 version of Office for Windows - you had to configure a bit of advanced settings for it to work.) Apple and Microsoft more than likely won't help now. I make it a rule to NEVER run outdated software or hardware as my primary device.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
I am a bit hesitant to tie this too much to income, because I'm sure there are plenty of high-earning late-20-somethings who haven't seriously touched a personal laptop since graduating from school...

... but otherwise, I think that's right. Older people with entrenched workflows and attitudes are less likely to adopt the new device type for as many tasks. Hell, I still mostly use a land line for most of my phone calls... which is a totally early-2000s attitude. And just like I have difficulty imagining a cell-phone-only world, I have difficulty imagining a laptop-only world, when in reality, I don't actually 'need' my iMac...
It is totally related to income, work industry and non-work computing needs.

If you're low-income in a poor nation, you tend to buy a Chinese-brand sub-$150 Android phone every 2-3 years & a ~$799 laptop every 5-6 years.

The TV, connected to a free-to-air antenna with no cable or streaming subscription, is replaced every decade or longer. It would not be surprising if they were still using a CRT from 3+ decades ago.

Desktops are old-fashioned devices that may be present at your workplace, largely unchanged for a decade or so depending on the industry. Longer replacement cycles tend to be followed in non-tech firms.

If you have a WFH arrangement or an unaltered workflow, then you may have an AIO or a desktop like a 2023 Mac Pro.
 