
Disappointed with Mac Pro 2023?


  • Total voters
    534

Abazigal

Contributor
Jul 18, 2011
20,392
23,890
Singapore
What you are saying is totally meaningless. It's a fact that they abandoned a lot of pro markets, and their ability to make chips has proven to be limited. M2 Ultra on Mac Pro? Really?

You are wasting my time on obvious issues.
The Mac Pro serves a niche of a niche, and apart from that, there are tons of creatives whom I imagine would be well-served from Apple's existing lineup of Macs, which do offer a range of benefits you don't see with Windows computers. In the larger scheme of things, I am fine with Apple giving up on the Mac Pro if it means they have more time and resources to devote to their other product lines.
 

sunny5

macrumors 68000
Original poster
Jun 11, 2021
1,837
1,706
The Mac Pro serves a niche of a niche, and apart from that, there are tons of creatives whom I imagine would be well-served from Apple's existing lineup of Macs, which do offer a range of benefits you don't see with Windows computers. In the larger scheme of things, I am fine with Apple giving up on the Mac Pro if it means they have more time and resources to devote to their other product lines.
And you are justifying ditching and reducing the pro market for what? Besides, you don't represent all of us.
 

TechnoMonk

macrumors 68030
Oct 15, 2022
2,605
4,113
The Mac Pro serves a niche of a niche, and apart from that, there are tons of creatives whom I imagine would be well-served from Apple's existing lineup of Macs, which do offer a range of benefits you don't see with Windows computers. In the larger scheme of things, I am fine with Apple giving up on the Mac Pro if it means they have more time and resources to devote to their other product lines.
It’s an emotional topic for a handful of people who felt dumped by Apple. It’s like a jilted ex who will never see Apple in a positive light.
 
  • Haha
Reactions: Abazigal

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
My friend complains that his cheap laptop's display hurts his eyes. I looked it over and it has a TN panel, so I understand why he hates using it. The replacement he got still has a TN panel.

He bought both from a brick & mortar store in a mall because he wanted the salesman to fix it up for him.

I told him to get an M1 MBA for the better user experience, but he fears that his 3 dozen cats may pee on the keyboard and cause costly repairs.

I'd forgive his mum or your late aunt for being computer illiterate, but a mid-40s male not knowing how to use a computer? In what barn was he raised?
One thing I like about Apple is that they have a good support network in the stores for less tech-savvy folks... although my dad didn't have the greatest experience with a moody Bluetooth mouse. But still, miles ahead of Best Buy's Geek Squad or similar options in Windowsland.

Another thing that I like about Apple is that Apple largely doesn't sell junk. You won't find a stupid elcheapo Atom-based processor that's slower than a 10 year old C2D, you won't find a lousy wifi card, you won't find (except maybe the non-retina MBA?) a lousy LCD panel, you won't find a bad quality trackpad, you won't find a slow hard drive, you won't find eMMC storage, you won't find 100 megabit Ethernet (yes, Dell had consumer laptops with 100 megabit... in 2017 or 2018), etc in a Mac. That's why the entry price point for a Mac is much higher, but unlike a Windows laptop where you need to pore over the spec sheet because $20 more could get you a much better screen or wifi card or whatever, you can be confident that a Mac is not junky in those little ways. Apple can sometimes be a bit too low-end, e.g. not giving you enough RAM or a small SSD in the base config, but they're not junky.

As I said earlier, I work in a non-technical field and I can tell you that youth does not mean computer skills or comfort with technology. Puzzles me too - you'd think making it through post-secondary education in the 2010s would require at least some computer skills, but some folks seem less tech savvy than my 70-something dad.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
Puzzles me too - you'd think making it through post-secondary education in the 2010s would require at least some computer skills, but some folks seem less tech savvy than my 70-something dad.
His folks never worked in a company, much less a multinational, that required their personnel to know how to use computers in the 70s & 80s. So a PC prior to 2000 was an unnecessary expense.

He even flunked auto design school in Coventry because he was computer illiterate.

Don't sweat the small stuff that other people do. Focus on yourself and how to improve your situation.

That includes the obsession with Macs/PCs being abandoned after a decade.

We came from a time that had a scheduled 3-year replacement cycle. A decade's use is a revelation!
 

azentropy

macrumors 601
Jul 19, 2002
4,136
5,664
Surprise
Well, it has been forever since the Mac Pro was a machine for me, so this just follows that recent trend.
I miss the days of a user expandable and user upgradable affordable system. People seem to forget that the Mac Pro used to start at under $2K and you used to be able to gradually upgrade it with better video, memory, storage and even processors. And before people use the "well adjusting for inflation" excuse, technology and computers typically buck the trend of inflation.
 
  • Like
Reactions: VivienM

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
Anecdotal evidence is an unknown; I don't know what the reporter did beforehand, and it could have been entirely their fault. Something widely reported I trust; a few reports, not really. What I run into myself, that I always trust.

And no, I just don't find a 4-year-old machine not getting the newest OS insulting. I'd just keep running Windows 10 on it.
The SSE2 thing is at least documented - see https://www.ghacks.net/2018/06/21/windows-7-support-dropped-for-cpus-without-sse2/ . I think there is a knowledge base article saying 'upgrade to a SSE2 processor'. The AGP thing is less clear because, well, how many people other than dual-booting Win98 fans would be running an AGP system in 2016/2017? Most mid-high-end Windows machines were dropping AGP in 2004.

In 28 years of having Windows machines, I cannot think of any other situation where a <4 year old decent machine was flat out told 'no' by Microsoft. Maybe Windows 98 didn't run on my non-Intel elcheapo 1995 "486" machine (yes they wanted a 66MHz while that thing was only a 50MHz), but that machine was so tired running 95 in 1998 that it doesn't really matter. I can think of a few machines where it might have been unreasonable for me to install the new OS on it, but none where Microsoft said flat out no. Their hardware requirements always used to be half of what you actually needed for passable performance, anyways.

But here, you have a situation where a low-end 2018 machine is fine and a high-end 2017 machine with 3-4X the CPU performance, 4-16X the RAM, who knows how much more storage performance, etc is not.

Example: the J4005 Celeron is supported. Single-core performance, 370; multi-core, 650 in Geekbench. That's... worse than my late aunt's C2Q 8400 (350 single-core, 950 multi-core, roughly), which is a decade older. Meanwhile my i7 7700 gets 1490/4800. So my unsupported system has 4X the CPU performance, all the TPMs/secure boot/UEFIs/etc they might want, 16X the RAM, I don't know how much faster storage, etc, and it's not good enough for them.
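For the curious, those ratios check out. A quick back-of-the-envelope sketch using the approximate Geekbench scores quoted in this post (the scores themselves are rough figures from the post, not fresh benchmark runs):

```python
# Approximate Geekbench scores quoted above: (single-core, multi-core)
scores = {
    "Celeron J4005 (2018, supported)":   (370, 650),
    "Core 2 Quad Q8400 (2008)":          (350, 950),
    "Core i7-7700 (2017, unsupported)":  (1490, 4800),
}

single_celeron, multi_celeron = scores["Celeron J4005 (2018, supported)"]
single_i7, multi_i7 = scores["Core i7-7700 (2017, unsupported)"]

# How much faster the unsupported 2017 i7 is vs. the supported 2018 Celeron
print(f"single-core ratio: {single_i7 / single_celeron:.1f}x")  # → 4.0x
print(f"multi-core ratio:  {multi_i7 / multi_celeron:.1f}x")    # → 7.4x
```

So by these numbers the "not good enough" machine is roughly 4x faster single-core and over 7x faster multi-core than a machine Microsoft does support.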

First time in 28 years of Windows that I've seen something like that... and it spooks me for Windows 12 too. Let's say I go and spend CAD$2000-2500 on a new Windows desktop today - how do I know my shiny new Ryzen 79xx or i7-13xxx won't be on the wrong side of some new arbitrary cutoff for Windows 12? Also, as much as we are saying 4 years, I think the cut-off was less on the AMD side - Zen 2 or newer, they say.

And I did continue running Windows 10. And bought two more new (well, one was an Apple refurb) Macs. :)
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
That includes the obsession with Macs/PCs being abandoned after a decade.

We came from a time that had a scheduled 3-year replacement cycle. A decade's use is a revelation!
I feel like we are going around in circles at this point, but in a way, I miss the days of the 3-4 year replacement cycle because those replacement cycles were driven by actual need and cool software.

You replaced your 4 year old hardware, got the new version of software X, and not only did it run faster than the old version on your old system, but it had useful innovative features you actually used. And you also got some additional badly needed storage space, because, well, you had filled up your hard drive in less than 4 years.

Today, the biggest performance-related reason to replace a system is that it doesn't have enough RAM to feed all the modern Electron monstrosities! Or enough CPU to run whatever crazy JavaScript is happening in a browser. Or maybe gaming, but I am not a heavy gamer...

Now, you replace your 7 year old hardware because the battery is swelling and you can't get a replacement or because your OS vendor decided that you can't officially run a reasonable current OS on it.

Sure, I'm getting at least 3 more years out of that hardware, but... where's the excitement unboxing something whose most perceptible improvement is a new set of security patches? The excitement seems to have transferred to unboxing smartphones, and even then, there is less excitement unboxing a 14 Pro Max than unboxing, oh, a 5 or a X...
 
  • Like
Reactions: Ethosik

Abazigal

Contributor
Jul 18, 2011
20,392
23,890
Singapore
And you are justifying ditching and reducing the pro market for what? Besides, you don't represent all of us.
Like I said, the Pro market is a lot larger than just the people using a 2019 Mac Pro who have no clear upgrade route (which is likely a very small number). Likewise, I too can argue that the Pro Mac market isn't representative of Apple's total user base either.

I would also like to direct you to this pretty good explainer on the grand theory of Apple, which I find makes a pretty good argument for AR glasses, while also dropping desktop Macs. It's pretty prescient, considering it was released way before Apple Silicon.


Apple's whole selling point with their custom silicon was always about power efficiency without sacrificing performance, and I will say they have more than met that goal. For example, their 14" and 16" MBPs allow for sustained performance over a long period of time even when not plugged into an external power source, something you don't see in Windows laptops (a lot of laptops with powerful graphics cards tend to throttle quickly). The Mac Studio easily takes up a fraction of the space of an equivalent Windows desktop. The M2 Pro Mac Mini is also a fairly powerful computer at a reasonably affordable price. I will say that everything from the M1 MBA to the M2 Mac Studio pretty much covers the computing needs of over 95% of Apple's Mac user base (the Mac Pro is likely a fraction of a percentage here).

I briefly skimmed through the thread and I am not sure why you would expect Apple to do any of what you are asking for. You want some sort of custom M2 Extreme chip which would only be shipped in the Mac Pro, inside a brand new Mac Pro enclosure, and I am sure you will all still want it to be modular in some way (i.e., able to add RAM and storage without having to pay Apple's premium), yet not be too expensive, and supported for at least 10 years. All for a fairly low-volume product that may never earn Apple back whatever resources were pumped into it.

And honestly speaking, were I running Apple, I would never even have released the 2019 Mac Pro.
 
  • Haha
Reactions: sunny5

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
It’s an emotional topic for a handful of people who felt dumped by Apple. It’s like a jilted ex who will never see Apple in a positive light.
I think it's a bit broader than feeling dumped by Apple...

There's an entire generation of old techies who grew up with modular desktop machines. Either Mac or especially Windows. You got your minitower machine, added some RAM, added an expansion card for some new connectivity technologies, maybe added a processor upgrade (those were more common in the Mac world than in the Windows world), added a hard drive or two, upgraded the GPU, etc. Kept it going for 3-5 years, spent lots of time and very hard-earned money (because, well, younger folks made a lot less money) upgrading it, then finally finally managed to replace it with another.

In Windowsland, other than home-built machines, the best example of this were the built-to-order machines Dell and Gateway sold in the late 1990s. Intel motherboard with no onboard anything beyond IDE/parallel/serial; you got a discrete graphics card, a discrete sound card, a discrete modem, a discrete network card pre-installed in your PCI slots. Those machines look so weird today, coming from the factory with potentially 4-5 slots used. You could pick one of several selections for all of those components (all fairly high quality), your storage, etc, they'd assemble it and ship it to you, and you could just replace any of those components later if you wanted.

In Macland, I would probably guess (not having been on the Mac side at the time) that the peak of this era was the B&W G3 and the G4s. Affordable entry price, no built-in video, lots of PCI expansion options, a world that was moving fast with new connectivity technologies and new drive types, etc. The G5 started to up the price and move more workstationy.

And the last gasp of that type of machine was the 2010 Mac Pro. A little too expensive and workstationy, but it was still a modular half-affordable desktop in a way that nothing that followed was. The trash can, by contrast, previewed the philosophy that has now given us Apple silicon.

As much as it pains me to say it, that era is dead. Even Windows gaming land has abandoned modular machines - most modern gaming cases have no drive bays, few open PCI-E slots except the GPU's, etc. Sure, this is a little more modular than the modern Mac, but... not that much. Not compared to what it was two decades ago.
 
  • Like
Reactions: theluggage

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
The SSE2 thing is at least documented - see https://www.ghacks.net/2018/06/21/windows-7-support-dropped-for-cpus-without-sse2/ . I think there is a knowledge base article saying 'upgrade to a SSE2 processor'. The AGP thing is less clear because, well, how many people other than dual-booting Win98 fans would be running an AGP system in 2016/2017? Most mid-high-end Windows machines were dropping AGP in 2004.
Oh, I trust that that one happened, but I never ran into it — maybe luck, maybe I just didn't get any AGP systems and keep them for long, who knows. I barely even remember AGP, but graphics hasn't been a main concern where I work. If I had run into it, I'd have reinstalled the older version of the OS, turned off auto updates, and just let it continue being used. If it was totally bricked, I'd move another PC there. It isn't totally uncommon for a PC to die and have to be replaced. Some I fix, some I put in the bone pile.
First time in 28 years of Windows that I've seen something like that... and it spooks me for Windows 12 too. Let's say I go and spend CAD$2000-2500 on a new Windows desktop today - how do I know my shiny new Ryzen 79xx or i7-13xxx won't be on the wrong side of some new arbitrary cutoff for Windows 12? Also, as much as we are saying 4 years, I think the cut-off was less on the AMD side - Zen 2 or newer, they say.
You don't, and there's one of the big differences between us -- I wouldn't expect it to. If it did, I might update it, but that isn't even assured; if it doesn't, it still runs the same stuff it always has. :) I can't be insulted by something that I don't expect.

Now, if they sold me a machine that was supposed to come with the latest OS and didn't, and wouldn't run the latest, I'd blame the vendor, not Microsoft (or myself if it was a home-built machine).
 

TechnoMonk

macrumors 68030
Oct 15, 2022
2,605
4,113
I think it's a bit broader than feeling dumped by Apple...

There's an entire generation of old techies who grew up with modular desktop machines. Either Mac or especially Windows. You got your minitower machine, added some RAM, added an expansion card for some new connectivity technologies, maybe added a processor upgrade (those were more common in the Mac world than in the Windows world), added a hard drive or two, upgraded the GPU, etc. Kept it going for 3-5 years, spent lots of time and very hard-earned money (because, well, younger folks made a lot less money) upgrading it, then finally finally managed to replace it with another.

In Windowsland, other than home-built machines, the best example of this were the built-to-order machines Dell and Gateway sold in the late 1990s. Intel motherboard with no onboard anything beyond IDE/parallel/serial; you got a discrete graphics card, a discrete sound card, a discrete modem, a discrete network card pre-installed in your PCI slots. Those machines look so weird today, coming from the factory with potentially 4-5 slots used. You could pick one of several selections for all of those components (all fairly high quality), your storage, etc, they'd assemble it and ship it to you, and you could just replace any of those components later if you wanted.

In Macland, I would probably guess (not having been on the Mac side at the time) that the peak of this era was the B&W G3 and the G4s. Affordable entry price, no built-in video, lots of PCI expansion options, a world that was moving fast with new connectivity technologies and new drive types, etc. The G5 started to up the price and move more workstationy.

And the last gasp of that type of machine was the 2010 Mac Pro. A little too expensive and workstationy, but it was still a modular half-affordable desktop in a way that nothing that followed was. The trash can, by contrast, previewed the philosophy that has now given us Apple silicon.

As much as it pains me to say it, that era is dead. Even Windows gaming land has abandoned modular machines - most modern gaming cases have no drive bays, few open PCI-E slots except the GPU's, etc. Sure, this is a little more modular than the modern Mac, but... not that much. Not compared to what it was two decades ago.
My last Mac Pro was the 2012 model; I like modularity in my workstations. I replaced it in 2020 with an AMD Threadripper with maxed-out RAM and storage and a 3090 GPU. I upgraded the GPU to a 4090 last year. The 4090 may be fast, but 24 GB of VRAM is pathetic: I run out of GPU memory for most of my large models. My 64 GB M1 Max with unified memory runs those models slower, but successfully. I love the direction Apple is going with unified memory at the cost of upgradability. I can’t wait for future generations of Apple silicon with 256-512 GB of unified memory, more dedicated tensor cores, and a more powerful GPU.
Bottom line: computers are tools, no need to get emotional. If Apple doesn’t meet your needs, buy something else that does.
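To illustrate why 24 GB of VRAM runs out while 64 GB of unified memory doesn't, here's a back-of-the-envelope estimate of model weight memory. The parameter counts and the fp16 assumption are illustrative, not a description of anyone's actual workload, and real runs also need room for activations and caches, so these are lower bounds:

```python
def weights_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """Approximate memory for model weights alone (fp16 = 2 bytes per parameter)."""
    return params_billions * bytes_per_param  # 1e9 params * bytes / 1e9 bytes-per-GB

VRAM_4090_GB = 24        # RTX 4090 dedicated VRAM
UNIFIED_M1_MAX_GB = 64   # M1 Max unified memory (shared with the OS and CPU)

# Illustrative model sizes in billions of parameters
for params in (7, 13, 30):
    gb = weights_gb(params)
    print(f"{params}B fp16 weights ≈ {gb:.0f} GB | "
          f"within 24 GB VRAM: {gb < VRAM_4090_GB} | "
          f"within 64 GB unified: {gb < UNIFIED_M1_MAX_GB}")
```

By this rough math, weights alone for anything past ~12B parameters in fp16 already overflow a 24 GB card, while the same models still fit (slower) in 64 GB of unified memory.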
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
You don't, and there's one of the big differences between us -- I wouldn't expect it to. If it did, I may update it, but that isn't even assured, if it isn't it still runs the same stuff it always has. :). I can't be insulted by something that I don't expect.
And that's the thing... my i7-7700 replaced a C2Q that should have been replaced a little while earlier but wasn't, because the whole Windows 8/8.1 thing really had me questioning my faith in Microsoft. I needed a new laptop, bought the mid-2014 MBP instead of a Windows laptop, and was eyeing an iMac on the desktop side, but as it turns out, Windows 10 came along and convinced me that Microsoft was back on a half-reasonable path after the ridiculous touch-craziness of 8/8.1, so I didn't end up ordering a souped-up 27" iMac.

My C2Q was overdue for replacement in early 2017 when I built the i7-7700. It had been assembled in multiple steps, I think I had originally bought the motherboard in late 2008, changed the processor later, etc, but I think that machine had spent about 7 years as my main desktop. I think primarily running Windows 7 - by the time 10 came out, I felt the machine was a bit lacking in RAM, was mostly using the MBP anyways and never seriously looked at upgrading it.

So, having seen the C2Q last me 7-8ish years and with the progress of technology slowing, when I built the i7-7700, I picked all my parts trying to aim for a ten-year lifecycle. Even spent extra money on a motherboard with Thunderbolt 3 because I thought, oh, there's a good chance that this Thunderbolt thing might become a thing in Windowsland in the next ten years.

The one thing I somewhat cheaped out on was the GPU because I didn't love my options in 2017 and I figured that machine would go through multiple GPUs in the ten-year-lifecycle - ended up replacing my 1060 with a 3070 in late 2020.

Four years into my expected 10-year lifecycle and six months after adding the expensive new GPU, Microsoft tells me that that machine doesn't meet their "performance and reliability expectations" for Windows 11, while that low-end Celeron one year newer does. Security updates only on Windows 10 for four more years, then e-waste (or a non-Windows OS).

It absolutely never, ever, ever occurred to me in 2017 as I was picking my parts that Microsoft could ever impose an arbitrary age-based restriction on OS support. I might have thought the lowest-end CPU on the slowest laptop in the computer store was a lousy bet for a long life. I can tell you that at the very least, I would probably not have bought the Thunderbolt 3 motherboard and might have even gone down to an i5-7500 if I could have ever imagined that my machine could be turned into a second-class citizen OS-wise four years later. But I simply assumed that the end of that machine's life would come either i) if it didn't support some peripheral/connectivity that I needed and couldn't add via PCI-E/USB/TB3, or ii) if it became too slow for what I wanted to run on it. (Or, I suppose, hardware failure, but I've never had a Windows desktop fry its motherboard and everything other than motherboard/CPU is replaceable) So I spent good money on what I thought would delay those two things...

And yes, now, even if I did want to build another Windows machine tomorrow, I sure don't know what I would do or pick. Probably something more focused on immediate needs.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
Apple's whole selling point with their custom silicon was always about power efficiency without sacrificing performance, and I will say they have more than met that goal. For example, their 14" and 16" MBPs allow for sustained performance over a long period of time even when not plugged into an external power source, something you don't see in Windows laptops (a lot of laptops with powerful graphics cards tend to throttle quickly). The Mac Studio easily takes up a fraction of the space of an equivalent Windows desktop. The M2 Pro Mac Mini is also a fairly powerful computer at a reasonably affordable price.
I have a different perspective. Mac laptops are now really good. In many market segments, buying a Mac should be the default choice, and you should buy something else only if you have specific reasons for that.

Mac desktops, on the other hand, are underwhelming. Apple doesn't have a satisfying answer to the basic question: If you already have a Mac laptop, why should you buy a Mac desktop? What more do you get for the same price if you give up mobility? Higher performance with the same hardware? Higher RAM/storage capacity? More ports? Internal expansion capacity? The only desktops with a clear advantage over the laptops are the expensive high-end models with M1/M2 Ultra.
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
Mac desktops, on the other hand, are underwhelming. Apple doesn't have a satisfying answer to the basic question: If you already have a Mac laptop, why should you buy a Mac desktop? What more do you get for the same price if you give up mobility? Higher performance with the same hardware? Higher RAM/storage capacity? More ports? Internal expansion capacity? The only desktops with a clear advantage over the laptops are the expensive high-end models with M1/M2 Ultra.
Maybe the answer is that... for the overwhelming majority of people, you shouldn't buy a Mac desktop if you already have a Mac laptop?

To the extent you want a bigger screen, get a Studio Display or any of the previous Thunderbolt displays.

If anything, it seems like Apple's entire strategy for... twenty years... has been to shrink any advantages the desktops have over the laptops. Apple Silicon is basically the last step in that strategy since, at least the way they've implemented things, there isn't really any advantage to M2 Max in a desktop vs M2 Max in a laptop.

I haven't been in an Apple Store in a while, but my recollection is that they also devote a lot more floor space to the laptops than to the desktops. There's still a market for the desktops, but the laptops should be seen as being the default Mac, as they've been since... the Intel MacBooks, probably? And I don't think they want to sell most people both a laptop and a desktop, not in 2023 - in 2004, sure, maybe they wanted to sell you a Power Mac G5 and a 12" aluminum G4, but in 2023?
 

thebart

macrumors 6502a
Feb 19, 2023
515
518
Basically the only thing the base and pro mini have over laptops is no battery to swell up and die on you.
 

Abazigal

Contributor
Jul 18, 2011
20,392
23,890
Singapore
I have a different perspective. Mac laptops are now really good. In many market segments, buying a Mac should be the default choice, and you should buy something else only if you have specific reasons for that.

Mac desktops, on the other hand, are underwhelming. Apple doesn't have a satisfying answer to the basic question: If you already have a Mac laptop, why should you buy a Mac desktop? What more do you get for the same price if you give up mobility? Higher performance with the same hardware? Higher RAM/storage capacity? More ports? Internal expansion capacity? The only desktops with a clear advantage over the laptops are the expensive high-end models with M1/M2 Ultra.


The key advantage of a Mac desktop over a laptop so far would be price, but otherwise, isn’t that good news? Mac laptops are good enough that I can get my work done on a MBP and don’t need a separate desktop for the heavy lifting. In contrast, I imagine I would get a Mac Studio if I worked primarily from home and needed the M2 Ultra, or I could get a MBA for light productivity on the go.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
The key advantage of a Mac desktop over a laptop so far would be price, but otherwise, isn’t that good news? Mac laptops are good enough that I can get my work done on a MBP and don’t need a separate desktop for the heavy lifting. In contrast, I imagine I would get a Mac Studio if I worked primarily from home and needed the M2 Ultra, or I could get a MBA for light productivity on the go.
It's not good news, because it's possible to buy desktop computers that are (in some ways) better than Mac laptops with a similar price. You just can't run macOS on them.

I currently have three computers on my desks in addition to the laptop I have from work: A 2020 iMac with a second monitor and 128 GB RAM for work and casual use; an old gaming PC last upgraded a couple of years ago; and a small NAS with SSDs. I'm not sure what I'm going to replace them with, because Apple's current offerings are not that good. I want something with macOS and enough monitors for casual use. I want a cost-effective computer with as much RAM as reasonably possible for work. I want a Windows PC for gaming. And I'll probably replace the NAS with something running Ubuntu or macOS.

It's surprisingly difficult to buy computers when you have more than one or two use cases to cover.
 
  • Like
Reactions: VivienM

Longplays

Suspended
May 30, 2023
1,308
1,158
I feel like we are going around in circles at this point, but in a way, I miss the days of the 3-4 year replacement cycle because those replacement cycles were driven by actual need and cool software.

You replaced your 4 year old hardware, got the new version of software X, and not only did it run faster than the old version on your old system, but it had useful innovative features you actually used. And you also got some additional badly needed storage space, because, well, you had filled up your hard drive in less than 4 years.

Today, the biggest performance-related reason to replace a system is that it doesn't have enough RAM to feed all the modern Electron monstrosities! Or enough CPU to run whatever crazy JavaScript is happening in a browser. Or maybe gaming, but I am not a heavy gamer...

Now, you replace your 7 year old hardware because the battery is swelling and you can't get a replacement or because your OS vendor decided that you can't officially run a reasonable current OS on it.

Sure, I'm getting at least 3 more years out of that hardware, but... where's the excitement unboxing something whose most perceptible improvement is a new set of security patches? The excitement seems to have transferred to unboxing smartphones, and even then, there is less excitement unboxing a 14 Pro Max than unboxing, oh, a 5 or a X...
Replacing my Macs after their final security update in a decade looks like this:

- 2011 MBP 13" 32nm > 2021 MBP 16" 5nm
- 2012 iMac 27" 22nm > 2023(?) iMac 27" 5nm

That's the exciting part... a decade of use, ending when support does, results in a new computer that is good for the next decade. It brings great improvements in process node, raw performance, performance per watt, battery life, power consumption, thermals, operational noise, industrial design, and OS feature/security updates, plus fresh new hardware free of 10 years of wear, tear, dust accumulation, scratches, scuffs, drops, and cracks.

Another reason Apple limits it to a decade is that adding features and security patches beyond that would degrade the experience of using a Mac by slowing it down. Is 2023 macOS Sonoma as quick as 2018 macOS Mojave? I remember using Windows 95 at its minimum system requirements, and it was a bad experience. Even running software at the recommended system requirements is often mediocre; you really need a bit more headroom to get a great user experience.

I cannot see that occurring with a 2006 65nm C2Q on 2021's Win11, much less a 2022 Win12.

Here's something to add value to this conversation that others may jump in and talk about.

It may come as a shock to many, but shipments of desktop dGPUs have been on a downward slope since as early as 2005. This suggests to me that "perfect" 4090-like performance is not selling as well as GPU cores in a SoC.

Use cases of decades past are now approaching niche.

[Chart: desktop graphics card shipments at a 20-year low]


Source: https://www.tomshardware.com/news/sales-of-desktop-graphics-cards-hit-20-year-low
 
Last edited:
  • Like
Reactions: chucker23n1

chucker23n1

macrumors G3
Dec 7, 2014
9,091
12,113
I would also like to direct you to this pretty good explainer on the grand theory of Apple, which I find makes a pretty good argument for AR glasses, while also dropping desktop Macs. It's pretty prescient, considering it was released way before Apple Silicon.


He makes some good points, but he's way overstating the case. Yes, Apple has a culture of "technology should disappear; it's just a tool that needs to get out of the way", but it doesn't follow that "technology is the enemy". Yes, desktops take up a lot of physical and mental space, but no, they're not dropping desktop Macs as a result. Yes, Apple has removed the headphone jack, but so have competitors, and actually, almost nobody cares. Lots of tech pundits fall for this trap: they extrapolate that their personal preferences must apply to most people's preferences, and they just don't.
 

chucker23n1

macrumors G3
Dec 7, 2014
9,091
12,113
Why do they need higher-end chips than the Ultra? It's... not clear... that there is a business case for it.

[..] There are some people whose needs require more than 192GB of RAM (although it's worth noting, Apple did not offer anything with 192GB of RAM or more until 2019). Etc. And Apple is not going to redesign most of their Apple Silicon stack in order to accommodate those niches.

OK, but the million dollar question is: to what extent is it that they made the 2019 Mac Pro, saw that almost nobody actually used more than 192 GiB RAM, and decided "meh, might as well drop that option", vs. to what extent is it that making a SoC package that accommodates more and/or adding hardware and software support for a hybrid approach where you can expand with DIMMs and/or making an Mx chip that has no on-package RAM at all would've been very niche and costly?

To put that another way: if it were relatively simple to make a computer that offers more RAM, would they have bothered to offer it, or was this the ceiling of what almost any customer cared about regardless?
 

chucker23n1

macrumors G3
Dec 7, 2014
9,091
12,113
Look at it this way: With the Mac Pro now on Apple Silicon it's gonna get refreshes a lot more often now.

Maybe.

So far, we've seen a refresh to the Studio after 15 months. That's not too bad. But it wouldn't surprise me if they only update the Mac Pro every other cycle.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
Maybe.

So far, we've seen a refresh to the Studio after 15 months. That's not too bad. But it wouldn't surprise me if they only update the Mac Pro every other cycle.
I disagree; I see the Mac Studio and Mac Pro being refreshed in sync.

As long as the succeeding Ultras/Extremes do not require many changes beyond the SoC itself and slight variations of other components, I can see refreshes occurring on a 15-month cycle.

With the 2023 Mac Pro they have managed to lower the cost of making it.

Q1 2025 will hopefully address almost all the sore points of the 2023 model.
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
OK, but the million dollar question is: to what extent is it that they made the 2019 Mac Pro, saw that almost nobody actually used more than 192 GiB RAM, and decided "meh, might as well drop that option", vs. to what extent is it that making a SoC package that accommodates more and/or adding hardware and software support for a hybrid approach where you can expand with DIMMs and/or making an Mx chip that has no on-package RAM at all would've been very niche and costly?

To put that another way: if it were relatively simple to make a computer that offers more RAM, would they have bothered to offer it, or was this the ceiling of what almost any customer cared about regardless?
I think it's fairly obvious that the architecture is designed around 8/16/24GB capacities in the M2, then you double that and have 16/32/48GB in the M2 Pro, then double that again and have 32/64/96GB in the M2 Max, and then double that again to 64/128/192GB in the M2 Ultra.

My guess is that, if they wanted to support more than 192GB of unified memory, they would have needed to do something other than just doubling (or design the base M2 for more than 24GB), and that would have been very costly.
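The doubling pattern described above can be sketched in a few lines. This is purely illustrative; the capacities are the M2-family numbers from the post, and the `memory_tiers` helper is a made-up name, not anything from Apple.

```python
def memory_tiers(base_options, tier_names):
    """Map each tier name to its memory options, doubling per tier."""
    tiers = {}
    options = list(base_options)
    for name in tier_names:
        tiers[name] = options
        options = [capacity * 2 for capacity in options]
    return tiers

# Base M2 ships with 8/16/24 GB; each step up doubles the options.
tiers = memory_tiers([8, 16, 24], ["M2", "M2 Pro", "M2 Max", "M2 Ultra"])
for name, options in tiers.items():
    print(name, options)
# The Ultra tops out at 24 * 2**3 = 192 GB, the ceiling discussed above.
```

Going past 192 GB would mean either another doubling step (a hypothetical "Extreme" at 128/256/384 GB) or redesigning the base die for larger capacities, which is the costly part the post alludes to.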

Something that allows DIMMs would basically break the whole architecture. Plus, consider another thing: a memory controller that allows DIMMs needs to be able to handle various specs of RAM, various timings, various multichannel modes, etc. Who knows how many mildly different grades of DDR4 might technically be used on, say, a typical Intel memory controller.

These memory controllers have also gotten more tolerant over time: 30 years ago, the wrong spec or combination of RAM meant you didn't boot, whereas today memory controllers will run a lot of less-than-optimal configurations in lower performance modes. Obviously, Intel has decades of experience designing memory controllers and just accepts that as the requirement. Intel will even give you a single integrated memory controller that, depending on your motherboard vendor's preference, can do either DDR4 or DDR5, so who knows how many "wasted" transistors there are on all those CPUs for the unused memory support.

If you solder the RAM like Apple does, you can land on one spec of RAM chips, design the memory controller for that spec, buy only that spec from the RAM manufacturer, and you've removed all that complexity from the memory controller.

I am not an engineer, but one thing I thought they could have done is some kind of proprietary memory module, somewhat like some PowerBooks had in the early 1990s, or like what they do with the "SSDs" on the Mac Studio, where the SSD chips are basically an extension of an on-SoC storage controller. But they didn't do that...
 