Not for me; I expect I'll have to give it up eventually unless a hardware solution becomes available for running x86 VMs.
Unless you (a) want to virtualize older x86 versions of macOS and/or (b) need to run your VMs on a Mac specifically, you'll probably always have better luck doing virtualization on a Windows or Linux PC. I have a MacBook Pro (16-inch, 2019) that satisfies both of those needs, but even I know that my days of doing this are numbered.
There's just no advantage to me running it, even at home, if I can't do my work on it, and for that, I need Windows. WoA really isn't good enough, so currently I have to remote into a Windows PC to do my work, and it's a short step from there to just using the Windows PC to begin with. For the kind of money I spent on my Studio Max, I could have bought a REAL nice Windows machine. I currently use a Mac because I like them, but that only goes so far. I like Windows too (not so high on Linux, though!). I'm an outlier here, and I know it, so don't take what I say as doom and gloom for Apple. I'd much prefer that Macs and Windows machines both stick around so I can buy the best machine for whatever job I need at the time.
Honestly, this is going to be my future as a Mac user too. I'll get whatever machine sits where the current Apple Silicon 13-inch MacBook Pro sits in the Mac lineup, and my uses will be light to moderate, with the heavier tasks (which, for me, are virtualization and gaming) handled by Windows PCs (which are, presumably, still going to be x86-64). I'd be more bummed about it if I hadn't seen the writing on the wall for a few years now. It's not like my professional needs mandate high-powered Macs (just maybe my recreational for-*****-and-giggles creativity). For IT and consulting (my main bread and butter), a base model M1 Air will more than suffice.
Agreed, but I'm not seeing any rumors of expansion capabilities at all, other than maybe SSDs. I hope I'm wrong because, like you say, it's for people who need expansion. The RAM limits I've heard about don't even cover some of the stuff I do (VM-based development/testing). I'm not the target of the Mac Pro and never have been, but I have bought high-end machines for work, so I have some sense of what those buyers buy and why.
PCIe slots. Internal SATA. Things the current Mac Pro has and that any one to come after would need, lest it be a Mac Studio. Not something you or I need in a Mac, but others surely do.
The Mac Pro served a lot of different customers in the past, from those who wanted a $2,500 Mac to those who wanted a $50,000 one, and everyone in between.
Today the Mac Studio is already serving a good 70% of the previous Mac Pro market. The Apple Silicon Mac Pro (which is essentially a 2x Mac Studio) will serve about 25% more.
I agree that it's a diminishing market. I also agree that the 27-inch iMac and the Mac Studio that replaced it took away a lot of that market. But I think there's enough of a market that still needs a Mac Pro tower and that, despite how much smaller that market is, Apple would be foolish to abandon it. The only things that seem safe bets to change are that RAM and graphics won't be upgradeable without replacing the SoC. But, if Apple sockets/slots that SoC and allows for such aftermarket upgrades, that'll be a trade-off considered to be acceptable by ENOUGH of that leftover market.
The negligible group of users who remain don't need to be serviced by Apple. Not if serving them is counter to the direction of the Mac itself, which it is.
I don't fully disagree here. But I don't really agree either. High end audio (where latencies make or break) still seems like something where the Mac platform completely dominates over Windows and Linux. Do those require more than an Ultra version Mac Studio with 128GB of RAM? I'm not an audio guy, so I can't say. But I'd imagine that there are use cases wherein even that kind of Mac Studio would fall short. Not many, but enough.
I do think that between 2010 and 2019, Apple did lose enough high end customers such that Windows in the high-end isn't as outlandish of an idea as it might've been in 2010.
But I also think that Apple is so scared of losing people in any area of their walled-off ecosystem that they want to make sure that, for example, someone doesn't end up having to use a Xeon-based Windows workstation and realize that maybe Windows isn't so bad (causing that same customer to not buy a MacBook Pro or MacBook Air for when they're away from that Xeon workstation).
It doesn't seem infeasible to me that Apple could simply write AMD drivers for ASi-based Macs.
Writing the drivers isn't the issue. The issue is that the platform isn't designed for those devices and for those drivers.
That's all that's really missing from those Macs at any level.
No. You're thinking of Apple Silicon macOS and Intel macOS as the same in this regard. They are not at all. x86/x86-64 versions of macOS accommodated those devices and those drivers. Apple Silicon versions of macOS don't work that way at all.
Doing so would allow for eGPUs, and allow for AMD GPUs in future Mac pros.
It wouldn't. The hardware architecture is not built in the same way that Intel Macs as a hardware architecture were. Drivers are not the only missing element here.
It allows them to retain their SOC architecture and so on, while allowing users to upgrade their GPUs or to have multiple GPUs should their workflow support that need.
First off, the graphics subsystems on Apple Silicon do not function in such a way that a traditional GPU would be able to act as an assist. Secondly, even if they did, the graphics card's interconnection to the rest of the SoC would be so much slower that you wouldn't be helping all that much with it. Apple's GPU cores are always going to be faster given that they are literally on the same die as the rest of the system. Thirdly, Apple GPUs and traditional GPUs don't operate the same way at all. Getting one to work in the same way as the other (a) would be impossible and (b) would negate any performance benefits, even if it were possible.
At best, you might have an expanded Afterburner product line wherein there are added compute modules for given workflows. But you're not going to see traditional GPU workloads.
On the RAM front, I'm not sure. Maybe it is doomed to have SoC-only RAM. Maybe not.
Unless Apple drastically does a 180 from how they designed Apple Silicon as a Mac hardware platform, it's a safe bet that RAM will be tied to the SoC.
We can read the rumor mill to death, but ultimately we won't know what to expect until Apple releases their first Mac Pro and we see it ourselves.
No, it's not always this nebulous "we won't know anything until Apple announces something." In this case, Apple HAS revealed how developers should be making their apps. For them to change this drastically from what they've done thus far would require developers to make massive changes in how they write their apps. Considering how slow app developers have been to port their apps to being Apple Silicon native to begin with, this would be such an obviously huge mistake for Apple. You can take it to the bank that third-party GPUs are still off the table and that RAM will still be unified and tied to the SoC.
It's entirely possible Apple will depart from the SoC style and break it down into modular components specifically for the modular Mac.
It's about as close to being impossible as you can get without it actually being impossible.
That would support their cancellation of the M2 Extreme chip, because they realized that when you get to the high-end market, needs for more RAM, CPU cores, GPU power, etc. don't scale linearly with each other. Sometimes you're a pro who needs 5x GPU power and standard CPU power, and sometimes it's the other way around. Why pay for capability you don't need until you actually need it, when you're talking about the high-end modular sector?
Again, a socketed/slotted SoC would solve all of those things already. As it stands, each Apple Silicon SoC already comes in multiple GPU (and, in the case of the M1 Pro and M2 Pro, CPU) core and RAM combinations. You can buy an M1 Ultra with 128GB of RAM but only its base GPU, or the opposite, or an entirely base model, or an entirely maxed-out one: that's four possible M1 Ultra chip configurations. The M2 Max now has five different possible configurations. Socketed/slotted SoCs would be how that problem gets solved. It's not as much flexibility as you had in the Intel era, but it's been that way since the very first M1.
I'm pretty sure Apple knows all of that, and the struggle to build something that can check the modular requirement box is probably why the Mac Pro has been the last in the lineup to be updated. It's not that they can't just interconnect more chips, it's that interconnecting more chips doesn't always make sense.
Right. Doubling the M1 Ultra or M2 Ultra wouldn't make sense. You would get the right CPU core count, but you wouldn't get the right RAM support. The Mac Studio was very careful to provide the exact same maximum RAM capacity that the 2020 27-inch iMac had. Apple needs to at least get close to offering the same 1.5TB maximum capacity offered by the current Mac Pro (or, at the very least, the 768GB maximum RAM that you can get from a 16-core model). I'd imagine that the graphics options also need to be beefier than the M1 Ultra's highest-end 64-GPU-core configuration, or even the 76-GPU-core configuration of an M2 Ultra that doubled the M2 Max's highest-end GPU core configuration. Then again, it's also totally possible that a Mac Pro customer is fine with the performance of an M2 Max and just needs PCIe slots. I'd imagine Apple will give us options when configuring such a machine.
From what I understand, any GPU drivers would need to deal with UEFI hooks at boot-up, but Apple Silicon Macs don't have UEFI at all. I'm not sure how that would get resolved. Maybe it doesn't matter, since the default boot display could be the internal GPU.
That's not even half of the obstacles. macOS on Apple Silicon is architected to not support third party GPUs at a very fundamental level. It's why you're not hooking up an eGPU to your iPad Pro with Thunderbolt.
They don't. According to Asahi Linux developers, in theory it's perfectly possible to write eGPU drivers for Apple Silicon, even by third parties. The only issue is that the memory mapping is extremely hard even for seasoned developers.
I would imagine, given how outright Apple has been on this topic, that there's a whole lot more stopping it from happening than that. It's my understanding that even if you connected the GPU to an Apple Silicon Mac, the Mac wouldn't know what to do with it, well beyond it simply not having a driver.