Mac Pro users are a tiny number, and Apple can do just fine without these customers.

It's almost a shame they decided to make the 7,1. They clearly put a lot of thought into it - it's a beautiful and highly functional machine - but it gave false hope to high-end desktop users on the Mac platform. If Apple had just stuck to their guns and anointed the iMac Pro as the Mac Pro replacement, its users would have had a smooth transition into the Apple Silicon era. A Studio Ultra + Studio Display clearly makes more sense than a Xeon iMac, so there would have been few complaints.

Former Mac Pro users could then have made their exit to Windows years ago. Would have been kinder to them, but perhaps Apple wanted to hedge their bets, in case the transition didn't go smoothly.
 
They've cancelled the Extreme version of 3 separate generations of M-series chips so far...and while they may still release the M4 Extreme (reports claim it has been cancelled, though logically it would be an M5 Extreme if it arrived this late in the game)...I do NOT think they were ever working on an M4 Extreme in the first place.
Well, they've only "cancelled" something that had never been announced, promised or (as far as I know) leaked on Geekbench... there are enough leaks to show that they were probably investigating the idea of a 4xMax "extreme" chip, but I don't think any details ever emerged of how far they got (I may be wrong). One would expect that, for every concept that reached production, Apple R&D had "investigated" and abandoned several other concepts. There are plenty of mundane reasons why the "extreme" idea might have been dropped: too difficult to fuse more than 2, too expensive/too low yield to produce, disappointing performance, inefficient use of space (duplicating stuff that they didn't need just to create more GPU cores or PCIe lanes)...

That said...there is only ONE reason to keep the 2019 Mac Pro chassis, and that is expandability...

Which is why the current M2 Ultra Mac Pro exists - for a niche of people who need high-bandwidth PCIe for specialist I/O or audio/video cards and/or large internal SSD arrays, but not discrete GPUs or DDR5 RAM, since integrated GPUs and unified memory are integral to the whole Apple Silicon concept.

Also, while they've kept some of the 2019 Chassis, the main (practical) Unique Selling Point was the MPX slot system that allowed large PCIe cards like GPUs to get extra power and feed video back to the main Thunderbolt controller without all of the fugly extra cabling that this requires in an ATX PC - and that has gone.

I think they are likely creating a slot-card system that would allow for user GPU expansion. I could see them keeping the M-series SoC as your primary chip - they arguably already have the fastest CPU cores on earth - and adding GPU power via cards...

People were speculating about this before the somewhat disappointing M2 Ultra Mac Pro appeared, and there was some mumbling about something called a "compute module":


(Which I think turned out to be something to do with Vision Pro...?)

...but yeah, a system of plug-in cards with Mx Max/Ultra/Extreme/Ludicrous for a scalable system sounded like a good idea, and a potential good use for the MPX card format - possibly even available as an expansion for the 2019 Pro. But, MPX is gone, and looking at the size of Max and Ultra mainboards in the Studio, MPX cards would be unnecessarily large.

...so then WHY are they still bothering all this time with a Mac Pro...
Not sure that they will be bothering to replace the 2023 Mac Pro - or do more than just bump the processor if/when the next Ultra chip is released. One good reason for keeping the 2019 case design would be that the 2023 MP was going to be "end of the line". Its main reason for existing is PCIe expansion, which will likely be a dwindling requirement, especially as Thunderbolt 5/USB4v2 will significantly increase the bandwidth available for external devices (including TB-to-PCIe enclosures, which didn't see any improvement from TB4).
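
To put rough numbers on that (an illustrative sketch: the link rates below are the headline spec figures for PCIe tunnelling, and the 100 GB payload is just a made-up example):

```python
# Rough Thunderbolt PCIe-tunnelling comparison (headline spec figures).
# TB3 and TB4 both tunnel PCIe at roughly PCIe 3.0 x4 rates, which is why
# TB-to-PCIe enclosures saw no speedup from TB4; TB5 roughly doubles it.

GBIT = 1e9  # bits per second in one gigabit

links = {
    "Thunderbolt 3/4 PCIe tunnel (~PCIe 3.0 x4)": 32 * GBIT,
    "Thunderbolt 5 PCIe tunnel (~PCIe 4.0 x4)": 64 * GBIT,
}

payload_bytes = 100e9  # a made-up 100 GB capture/transfer

for name, bits_per_second in links.items():
    seconds = payload_bytes * 8 / bits_per_second
    print(f"{name}: {seconds:.1f} s to move 100 GB")
```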

It's meant to be a show piece. It's supposed to be the "can't innovate my @$$" response to that continued annual question.
Except it isn't any more. Even the 2019 was really "just a Xeon W tower with neater internals" - it got a slight head start over generic PC systems with the extra PCIe and RAM capacity of the then-newish Xeon W, but even then could be thrashed on straight performance/RAM/PCIe capacity by systems using multiple Xeon Scalable chips, or on price/performance by AMD systems that didn't carry the "Intel Xeon Tax". At best, it was one of those "perfect if this is exactly what you need" products - in a price bracket where users' needs tended to be increasingly specialised and generic PC hardware offered an unmatched level of choice.

Since 2020, Apple's innovative showpiece has been how much power they can cram into ultrabooks, tablets and small-form-factor devices with long battery lives and quiet cooling. The Apple Silicon GPU really can't compete with NVIDIA's or AMD's finest discrete cards, but it thrashes anything else you can fit in an ultrabook and realistically run off battery - esp. with natively optimised software. The core principles of Apple Silicon make it ideal for that - and also let it scale up to pretty powerful machines like the Studio - but they're really not what you want for a high-end workstation to run traditional workflows.

If you really need a "big box o' slots" and you're not software-locked into MacOS then your AMD system is probably the better tool for the job (and probably would already have been in 2019).

There's also the question of "who is going to need a super-powerful, monolithic personal workstation in the near future?" Cloud/thin-client computing (whether it's the "public" cloud or a rack of servers in the basement) seems to be the future - with the ability to "rent" extreme computing and storage capacity as and when you need it. Any current Mac has the legs to be a front end, and something like a Mac Studio (assuming M4 Max/Ultra versions appear) already gives you a ton of local power.

What there has been a rumour of is Apple working on an AI server chip:


...but that's likely to be something rather different (I'd guess closer to the NVIDIA Grace/Hopper - which in turn has a whiff of Apple Silicon re-imagined for AI/High Density Computing) and probably mainly designed to allow Apple to eat their own dogfood for their online services. It may well crop up in something called a "Mac Pro" but that would be a very different beast aimed at a different market. For one thing, the tools used for modern AI and server development/production tend to be Linux/Unix- rather than Windows-centric, which is handy, because MacOS is Unix.

Why have they EVER made Mac Pros? They've NEVER been a profitable venture.

There have been "pro" Macs since long before the Mac Pro name existed - probably starting with the Macintosh II - and throughout the late 80s and early 90s they ruled high-end DTP, graphics and the early days of video/multimedia production. The 16-bit DOS PCs of the era simply weren't up to that. Apple's problems really started when PCs not only started playing catchup on the tech, but became ridiculously cheap because of economies of scale. Still, Mac carved out a niche in graphics/DTP/video/audio production which got decimated, but not eliminated, by the rise of the PC.

So, on one front, it would be embarrassing for Apple to walk away. Also, it would have jeopardised future support from the likes of Adobe and other "industry standard" software houses - which now had perfectly viable PC versions of their software... and more generally developers liked their Mac Pros, so even if the Mac Pro wasn't a direct moneyspinner it was strategically important.

I think that's started to wane since ~2010 (of course - that's also when Apple started dropping the ball on the Mac Pro so there's a chicken/egg question there). For one thing - there's the "good enough" problem: in the Good Old Days if you were a developer, if you were any sort of video editor, if you were producing audio or doing a lot of graphics work you needed the power and expandability of the Mac Pro. But laptops, Minis and iMacs have been getting dramatically better in relation to desktop systems - today, there's not a lot you can't do at a basic level on a MacBook Air, the only reason a developer needs a Mac Pro is if the product they're developing needs a Mac Pro to run - and while people actually working in the movie/TV industry still need high-end kit, a Mac Studio will do everything even the fairly serious "prosumer" needs. The fact that the entry-level price for the Mac Pro has risen from ~$2500 for the classic Cheesegrater to $7000 for the 2023 M2 Ultra Mac Pro (which only has the same performance as the $4k Studio - and will only outperform an M2 Max MacBook Pro on sufficiently parallel workloads) reflects that it's now a lot more of a "serious callers only" product (again - maybe chicken and egg, but we have to assume that Apple aren't totally stupid and market-research these things).
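
(The "sufficiently parallel workloads" caveat is just Amdahl's law in action - a toy calculation with illustrative parallel fractions, not benchmark data:)

```python
# Amdahl's law: the speedup from doubling cores depends on how much of the
# workload is parallel. Fractions below are illustrative, not measurements.

def speedup(parallel_fraction: float, core_multiple: float) -> float:
    """Ideal speedup when the parallel part scales across core_multiple cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / core_multiple)

for p in (0.50, 0.80, 0.95, 0.99):
    print(f"{p:.0%} parallel: {speedup(p, 2.0):.2f}x from a 2x-core chip")

# 50% parallel -> 1.33x, 95% -> 1.90x: a doubled chip (Max -> Ultra) only
# approaches 2x when nearly everything scales across cores.
```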

If Apple had just stuck to their guns and anointed the iMac Pro as the Mac Pro replacement, its users would have had a smooth transition into the Apple Silicon era. A Studio Ultra + Studio Display clearly makes more sense than a Xeon iMac, so there would have been few complaints.

Former Mac Pro users could then have made their exit to Windows years ago.
Interesting that you think that the iMac Pro was supposed to be the new Mac Pro (I totally agree - but haven't had floods of support in that view). My impression was that - when Apple made their famous early-2017 U-turn meeting - they would probably have just started showing the iMac Pro prototypes to key software/hardware developers and were getting severe pushback (the timing seems about right - whatever you think of the iMac Pro it clearly hadn't been kludged together in a hurry, and it would have been in an advanced state of development by then). As I said above, part of the reason for Apple continuing to make high-end Macs is to keep key developers on board, so they probably had to react.

I still think that the iMac Pro (on its own, without a complementary headless alternative) was a mistake and that a lot of "higher end" users have specialised display requirements and simply wouldn't want an all-in-one, however nice the built-in display was. Really, Apple had one job to do after that 2017 press conference: get on the phone to Foxconn, order up a few shipping containers full of Xeon mini-towers in nice aluminium tool-free cases and release an "official Hackintosh" (they wouldn't have called it that, of course, the point is that getting MacOS to run well on generic Intel hardware was not rocket surgery, especially if you were Apple and didn't have to fight the DRM). People who liked the iMac Pro concept would still have bought it, but Apple would have had something to offer the people who probably didn't buy the iMac Pro and either skipped to Windows, built their own Hackintosh or squeezed another year or two out of their old Cheesegraters (which were effectively Hackintoshes by that stage).
 
I still think that the iMac Pro (on its own, without a complementary headless alternative) was a mistake and that a lot of "higher end" users have specialised display requirements and simply wouldn't want an all-in-one, however nice the built-in display was.

True. I think the issue was that Apple were struggling to justify the Mac Pro line due to the low sales volumes. They tried radically reducing its size, but that wasn't popular, and had no thermal headroom. They tried repurposing the iMac, giving it a Xeon and making it Space Grey (OK, and a new cooling system), but as the release date approached, realised it was little improvement on the 6,1. Sure, it had newer / faster components, but didn't resolve the fundamental issues. When they finally threw in the towel and returned to a tower, they had to double the price to balance the books.

Basically, Apple were struggling to justify one Mac Pro, let alone two.

Really, Apple had one job to do after that 2017 press conference: get on the phone to Foxconn, order up a few shipping containers full of Xeon mini-towers in nice aluminium tool-free cases and release an "official Hackintosh" (they wouldn't have called it that, of course, the point is that getting MacOS to run well on generic Intel hardware was not rocket surgery, especially if you were Apple and didn't have to fight the DRM).

Yeah, hard to see why Apple made such a meal of the 7,1. Especially as they ditched it after one generation, along with its processor architecture. But I guess everything has to be a design statement from Apple. Plus, they can never admit they get anything wrong, and releasing something that was essentially a refreshed cMP (in spirit) would be too close to the bone. It would look like they were going backwards. Worse, it would look like they were being led by their customers, rather than the other way round. They had to show them who's boss.

People who liked the iMac Pro concept would still have bought it, but Apple would have had something to offer the people who probably didn't buy the iMac Pro and either skipped to Windows, built their own Hackintosh or squeezed another year or two out of their old Cheesegraters (which were effectively Hackintoshes by that stage).

You'd think there would have been a place for a 'Pro' iMac, at the top of the range, especially as HDDs got phased out of the standard iMac. They could have shared a similar internal architecture / cooling system, with the Pro getting a Xeon, posher GPU options and exclusive Space Grey casing / accessories. The issue would likely have been cannibalisation of the lower-end Mac Pro models, especially when relaunched at $5K and climbing steeply from there.
 
...
Also, while they've kept some of the 2019 Chassis, the main (practical) Unique Selling Point was the MPX slot system that allowed large PCIe cards like GPUs to get extra power and feed video back to the main Thunderbolt controller without all of the fugly extra cabling that this requires in an ATX PC - and that has gone.
...
And it also helped to keep the Mac Pro 2019 quiet, with the excellent cooling system, despite having two dual-GPU graphics cards inside. Too bad the PC side scrapped the SLI concept - it's extremely limited what software can use 4 GPUs.

Even the 2019 was really "just a Xeon W tower with neater internals" - it got a slight head start over generic PC systems with the extra PCIe and RAM capacity of the then-newish Xeon W but even then could be thrashed on straight performance/RAM/PCIe capacity by systems using multiple Scaleable Xeon chips, or price/performance by AMD systems that didn't carry the "Intel Xeon Tax". At best, it was one of those "perfect if this is exactly what you need" products - in a price bracket where users needs tended to be increasingly specialised and generic PC hardware offered an unmatched level of choice.
The price was certainly an issue. Almost 5 years later, people don't care because it's an obsolete Mac platform, even though the prices are starting to be good.
It is still a great computer for those who wish to run Windows beside macOS.

[Attached images: W6800X Duo.jpeg, Vega II & R4i.jpeg, RME.jpeg]


I still think that the iMac Pro (on its own, without a complementary headless alternative) was a mistake and that a lot of "higher end" users have specialised display requirements and simply wouldn't want an all-in-one, however nice the built-in display was.
And in contrast to the iMac Pro, you can easily upgrade parts. And it doesn't have a built-in display.

The most important point is: the Mac Pro 2019 exists in the physical world; the Apple M* Extreme machines exist only in the imaginary world (where all things are first created) ;)
 
There's also the question of "who is going to need a super-powerful, monolithic personal workstation in the near future?" Cloud/thin-client computing (whether it's the "public" cloud or a rack of servers in the basement) seems to be the future -

Maybe the solution is that all Macs end up being low powered and anyone who needs power beyond what's needed to send email or browse the web can have a subscription "power by the hour" type arrangement. If they need extra power, they pay for it.

Apple could reduce the price of the computers slightly, but add mega-profits from the subscription model for extra processing power.

Too bad the PC side scrapped the SLI concept - it's extremely limited what software can use 4 GPUs.

Have to agree. Instead we see single GPUs pushing massive power.
 
Maybe the solution is that all Macs end up being low powered and anyone who needs power beyond what's needed to send email or browse the web can have a subscription "power by the hour" type arrangement.
Well, that's one plausible scenario - all personal computing moving to a thin client model - and I wouldn't rule it out since some things are already heading in that direction.

However, in the context of this thread, I was thinking more of "anyone who needs power beyond the considerable clout provided by, say, a M4 Max MacBook Pro" - and didn't mean making current Macs lower-powered. A modern MBP or Mini can do things that, a few years ago, would have required a tricked-out Mac Pro - and which still need doing today. Many of the "new" things that need doing - whether it is AI training or rendering feature-length movies at theatre-quality resolution - are probably best done in high-density computing racks especially if they rely on huge datasets - with computing power rented only when it is actually needed.
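
To make the "rent it when you need it" logic concrete, a back-of-envelope break-even sketch (all prices are placeholders I've made up, not real quotes):

```python
# Break-even between buying a big workstation outright and renting
# equivalent compute by the hour. All numbers are made-up placeholders.

workstation_cost = 10_000.0    # up-front cost of a maxed-out desktop ($)
rental_rate = 4.0              # rented equivalent capacity ($/hour)
useful_life_hours = 3 * 2000   # ~3 years of full-time (2000 h/yr) use

breakeven_hours = workstation_cost / rental_rate
utilisation_needed = breakeven_hours / useful_life_hours

print(f"Break-even after {breakeven_hours:,.0f} rented hours")
print(f"Buying wins only if the box is busy > {utilisation_needed:.0%} of its life")
```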
 
Apple's trajectory isn't a big powerful workstation that does everything, it's putting low-powered computing into every object, and making them all clients of Apple's integration cloud (which is going to go spectacularly wrong, because the EU is gearing up to dictate open access and replaceability of Apple's proprietary bits on a feature-by-feature basis in all Apple's OSes).
I keep getting bans for making political comments outside of the politics forum, so I'll just say the EU are wrong on many levels for their intervention.

Apologies in advance to the mods if I bust the rules yet again...it is IMPOSSIBLE to talk about BS EU intervention in private business without being political, because that move is itself political.

I think Apple are in their own way working towards their own GPU tech. nVidia, as good as they are, aren't the only game in town, and as far as I can see, Apple are only just getting started. The M-series has been the "can we do this?" test-run. I think there is much more coming in the decade ahead.

Intel are floundering; AMD is the current king of the PC desktop, and nVidia are getting stuck into AI compute.

nVidia said recently that the PC gaming market is not their focus, either, and the 50xx series cards are going to be very expensive. Intel are trying, but will not really make a success of their second attempt at breaking into the GPU market for the same reasons their CPUs are no longer the best.
 
I keep getting bans for making political comments outside of the politics forum, so I'll just say the EU are wrong on many levels for their intervention.

The EU puts their citizens ahead of an American business. Deal with it, because that's the EU's purpose. Apple's products are going to be regulated feature by feature, the same way a building is regulated in every aspect of every dimension.

Go look up the regulations for a staircase, or a doorway, or fire suppression. Better buildings are the result, and Apple will either become better for regulation, or cease to exist, and there will be one less economic oxygen thief in the tech world.

Apologies in advance to the mods if I bust the rules yet again...it is IMPOSSIBLE to talk about BS EU intervention in private business without being political, because that move is itself political.

It's a public corporation, and public corporations bend a knee to governments, who are the will of their people. That is the deal they get for their shareholders having limited liability, and not being forced to sell their homes to cover the company's debts when it goes bankrupt.

I think Apple are in their own way working towards their own GPU tech. nVidia, as good as they are, aren't the only game in town, and as far as I can see, Apple are only just getting started. The M-series has been the "can we do this?" test-run. I think there is much more coming in the decade ahead.

Dude, they haven't been able to make an Ultra since the M2, they're not "just getting started", they're running the tank empty after having picked all the low-hanging fruit (to horribly mix metaphors).


nVidia said recently that the PC gaming market is not their focus, either, and the 50xx series cards are going to be very expensive. Intel are trying, but will not really make a success of their second attempt at breaking into the GPU market for the same reasons their CPUs are no longer the best.

You'll still be able to build an X86 (or even ARM) PC system with a 50XX GPU that will wipe the floor with any Mx Mac coming for the next 5 years for less than the price of a Mac Studio.
 
You'll still be able to build an X86 (or even ARM) PC system with a 50XX GPU that will wipe the floor with any Mx Mac coming for the next 5 years for less than the price of a Mac Studio.
By the time Apple gets its act together there will be NVidia RTX9090 GPUs or even 5 digit models.

The future for Apple is most likely not going to be some new wonder GPU, but cloud computer power that people pay subscription for depending on how much power they use.

That’s the best way Apple can supercharge its future profits. It’s not good for users but it won’t be the first time Apple disregards users.

Other companies do similar, you pay a yearly contract fee then have unit charges on top of that depending on how many uses of whatever happen.


modern MBP or Mini can do things that, a few years ago, would have required a tricked-out Mac Pro - and which still need doing today.

How many screens can a Mac Mini or MBP drive?

I'm guessing fewer 6K displays than the previous Mac Pro. Unless Apple moves everything to a cloud computing subscription model, I think they could just say that "pro" or "studio" users are a niche that isn't necessary and force those people to use PCs.
 
The Mac Pro never 'demonstrated the pinnacle of their prowess'. The cMP had fine industrial design, but in all other respects was a typical Xeon workstation, using Intel parts. It was a Dell / HP / Lenovo in a nice aluminium case, running macOS. It was hardly the last word in workstation performance; I don't think they even offered Nvidia GPUs after about 2009. They clearly wanted to exit the workstation market around 2010, and only brought back the tower in 2019 after trying to fob pro users off with a small appliance and an iMac in the meantime.

The 2019 was a tour de force, sure. Though even that used Intel as people increasingly moved to AMD's Threadripper, still didn't support Nvidia GPUs, and was shipped with the knowledge that a move to ARM was happening 6 months later. Then, as with the 2013 Mac Pro and the 2017 iMac Pro, received no updates and was essentially discontinued after one generation. MPX went nowhere, and Apple lost all interest in supporting newer GPU generations once Apple Silicon was announced.

The M2 Ultra Mac Pro is a total joke, with zero effort put in. Like the first generation MBA, MBP and mini it reuses the chassis of the prior Intel model, for continuity. But architecturally, it's just a Studio with whatever spare PCIe lanes they could scrounge up PLX'd out to a load of slots.

Objectively, the Mac Pro looks like a product they've been hoping to discontinue for a very long time. If the Extreme were truly something they were interested in, it would have come out with the M1 / M2. What would be the point of waiting 5 years? By that point, customers such as yourself will have long moved to Windows.

The Extreme was always fan fiction. With PCIe GPUs off the menu, a 2x Ultra was the only logical option people could speculate on. But given 2x RTX 5090s would still obliterate it for 3D work, at much lower cost, what would be the point?
Hmmm...

I personally believe they haven't been WAITING so much as they haven't cracked the nut on what it is they've been trying to do. I don't think the Mac Pro should have an M series chip at all...it should be its own thing, and I think that's what they've been working on. Also, you have to remember, I'm someone that owns a Puget system with 2 RTX 4090's, and I will likely swap those for RTX 5090's since 2 of those will be equivalent to 4 of what I have now...All of that said...I STILL am chomping at the bit to get a Mac Pro that is even just equivalent to my current Puget system. I don't need it to equate to 5090's...if I am able to run Octane and Redshift and Unreal 5 and bend them to my will as I currently do on the Puget, then I will literally, and I mean literally, end up with the Puget as a paperweight. That's how much I would love to have 100% of my workflow back on Apple. And there's no way Apple isn't aware of that LOL. They must know how many of us would actually happily shovel over $10k - $20k for that particular piece of tech.

And maybe you're right. Maybe it's fan fiction...but maybe it's not.
 
It's really very simple though: whatever the Apple Silicon CPU, we want to be able to add Nvidia GPUs.
100% this. That would literally solve EVERYTHING lol. But I know they won't go backwards, so even a 4090 equivalent 2x would be enough to suffice for me lolol.
 
I assume you meant M4 Max? How much do you think Apple would charge for a Mac Pro with 8x M4 Max? And why would anyone buy one over a PC with e.g. 3x RTX 5090's, for much less? So they can use macOS? I mean, it's nice, but not much different to Windows when in 3D apps all day.



Sick burn. I guess to an accountant's eye, macOS looks like a loser compared to iOS.
Funny you said this, I just replied to your other comment basically answering this question before you asked it LMFAO! But think about it. You remember when I shelled out for that 2019 Mac Pro. If I was willing to pay for it when it was gimped and only for a 1 year window, you're damn straight I'll pay for it on steroids LOL And my guess is it would fall directly in the middle of the now and the then in terms of price...maxed out likely $17k or so.
 
Or any GPU from AMD or Nvidia. But it won't happen. Suggestions/requests to that effect just hit a brick wall at Apple. And on enthusiast forums, people say it's not Apple restricting this, but the GPU manufacturers, then the GPU manufacturers like AMD point fingers at Apple, so it just goes in circles.




But those 2x RTX 5090s aren't efficient. Even though the Apple solution might be obliterated, it's "efficient". I'm sure you've read those kinds of comments before.

Doesn't really bother me, I'm just interested in performance/speed/reliability. I suppose others doing work also probably care about how quickly the machine does the task it is given.

I've been very happy with the 2019 Mac Pros I have because in both Windows and MacOS you can just keep throwing more and more at them and they just chew through it without bother. I've never even had the cooling fans crank up on them. That's expected given they are both high-spec machines. One was upgraded to a 2.5GHz 28-core by me, the other was that way to start with.

And that CPU upgrade was simplicity itself. Far from the stories of doom and gloom, it was easy to do, and the machine is beautifully designed to make working on it super easy.
Yep. The 2019 Mac Pro is still a beast, especially if you look at what it's capable of instead of comparing it to other systems. Do you remember years ago I shared that link to that animation studio that made the Disney/Hulu/ESPN bundle ads? Those were ENTIRELY MADE on one Mac Pro...fantastic work "with an intel mac mini render farm of 20 mac minis"...here's a link to refresh everyone's memory on what the 2019 Mac Pro can truly do. It's still a beast don't get it twisted :)

 
It's ultimately an argument over who would pay for driver development. AMD / Nvidia aren't going to go to the hassle of writing macOS drivers - and keeping up with the annual OS changes - for the tiny number of GPUs they'd sell to 2019 Mac Pro owners. Especially as Intel macOS is on the verge of retirement - in 6 months, WWDC may well confirm this year's release as the last for Intel. Apple Silicon doesn't support PCIe GPUs at all, so when Intel macOS dies, all that work would be lost.

On the other side, Apple sure as hell won't pay for development, either. They want to move 100% to Apple Silicon ASAP, so hardly want to extend the life of 2019 Mac Pros another few years. Apple will be hoping the M4 Ultra will finally be sufficient to get most 2019 users to jump to AS.
This said, out of curiosity, what kind of performance leap in an M based Mac Pro would you need to see to even move the needle on you upgrading from the 2019 Beasts?
 
I have a large solar power system so it’s not an issue. The replacement machines will probably have NVidia GPUs and will run Windows.

macOS is nice to use but my work can be done on Windows. I don't fit in with the Apple business model. I'm sure others will make the same move and help Apple with its transition. Mac Pro users are a tiny number, and Apple can do just fine without these customers.
That unfortunately is the sad truth. Which is why I still hold on to the hope that they just want to show off with a seriously hardcore Mac Pro. And my ultimate wish is that for the past 5 years they've literally just been working on an entirely new way to see GPU and GPU functionality, just as they set the industry ablaze with the transition from intel to M series seemingly out of nowhere, I'm hoping a similar trick has been gestating up their sleeve for the past 5 years...
 
It's almost a shame they decided to make the 7,1. They clearly put a lot of thought into it - it's a beautiful and highly functional machine - but it gave false hope to high-end desktop users on the Mac platform. If Apple had just stuck to their guns and anointed the iMac Pro as the Mac Pro replacement, its users would have had a smooth transition into the Apple Silicon era. A Studio Ultra + Studio Display clearly makes more sense than a Xeon iMac, so there would have been few complaints.

Former Mac Pro users could then have made their exit to Windows years ago. Would have been kinder to them, but perhaps Apple wanted to hedge their bets, in case the transition didn't go smoothly.
Hmmmm, this is a very good point. But when has anything they ever done been unintentional? Surely they expected the blowback, and surely they expected to regain trust with the announcement of their ACTUAL 2019 MP replacement at say, the 2025 WWDC?
 
What there has been a rumour of is Apple working on an AI server chip:

...but that's likely to be something rather different (I'd guess closer to the NVIDIA Grace/Hopper - which in turn has a whiff of Apple Silicon re-imagined for AI/High Density Computing) and probably mainly designed to allow Apple to eat their own dogfood for their online services.
It's funny you mentioned AI, as I was thinking that likely had a huge part in how they were gonna handle some of their GPU roadblocks...and speak of the devil, did you see Nvidia's keynote today? Because that's exactly what the RTX 5000 series is doing...and AMD's keynote as well, they're also doing it...and while you may be right, it could have to do with something else entirely, I won't give up hope that they're creating something fascinating behind closed doors that we simply don't know about as of yet.
 
And my ultimate wish is that for the past 5 years they've literally just been working on an entirely new way to see GPU and GPU functionality.

Here's the problem for me - I don't want them to do that. I want the GPU to drive displays and graphics, and I want that to be an aspect of the system I can change, and enlarge independently of the rest of the system, as my display and graphics needs change.

I don't want my "graphics" card doing (non-image) compute, or AI, or anything else.
 
Here's the problem for me - I don't want them to do that. I want the GPU to drive displays and graphics, and I want that to be an aspect of the system I can change, and enlarge independently of the rest of the system, as my display and graphics needs change.

I don't want my "graphics" card doing (non-image) compute, or AI, or anything else.
That's fair. I get that. I'm just daydreaming about the different configs that they could possibly be experimenting with over at the lab. I know they've probably run several dozen different prototypes and likely have something jaw-droppingly incredible over there...To be a fly on that wall...
 
I personally believe they haven't been WAITING so much as they haven't cracked the nut on what it is they've been trying to do.

Perhaps they should just give their customers what they want. If they need any hints, they can just visit the Puget website. The complication lies in finding an overlap (if any) between what their customers want and what Apple would like to make.

I don't think the Mac Pro should have an M series chip at all...it should be its own thing, and I think that's what they've been working on.

Presumably it would still be ARM based though? You're talking about a massive ARM CPU, with loads of PCIe lanes for GPUs?

Also, you have to remember, I'm someone that owns a Puget system with 2 RTX 4090's, and I will likely swap those for RTX 5090's since 2 of those will be equivalent to 4 of what I have now...All of that said...I STILL am chomping at the bit to get a Mac Pro that is even just equivalent to my current Puget system. I don't need it to equate to 5090's...if I am able to run Octane and Redshift and Unreal 5 and bend them to my will as I currently do on the Puget, then I will literally, and I mean literally, end up with the Puget as a paperweight. That's how much I would love to have 100% of my workflow back on Apple. And there's no way Apple isn't aware of that LOL.

They must know how many of us would actually happily shovel over $10k - $20k for that particular piece of tech.

Therein lies the problem.

Funny you said this, I just replied to your other comment basically answering this question before you asked it LMFAO!

Well technically, I had already made both posts before you replied to the first ;)

But think about it. You remember when I shelled out for that 2019 Mac Pro. If I was willing to pay for it when it was gimped and only for a 1 year window, you're damn straight I'll pay for it on steroids LOL And my guess is it would fall directly in the middle of the now and the then in terms of price...maxed out likely $17k or so.

But you're a self-admitted Apple super-fan. You're certainly not the only one, but are there enough to make the Mac Pro a product Apple is particularly interested in making? The evidence would suggest not. The iPad was on M4 whilst the Mac Pro was (and still is) on M2.

This said, out of curiosity, what kind of performance leap in an M based Mac Pro would you need to see to even move the needle on you upgrading from the 2019 Beasts?

I don't have a 2019, but I think the MP should have upgradable GPUs and RAM, and sensibly priced SSD modules. But those all seem to be dealbreakers for Apple. It should have considerable headroom over a Studio Ultra, otherwise what's the point?

That unfortunately is the sad truth. Which is why I still hold on to the hope that they just want to show off with a seriously hardcore Mac Pro. And my ultimate wish is that for the past 5 years they've literally just been working on an entirely new way to see GPU and GPU functionality, just as they set the industry ablaze with the transition from intel to M series seemingly out of nowhere, I'm hoping a similar trick has been gestating up their sleeve for the past 5 years...

The PC workstation market gets Threadripper and Xeon platform development for 'free', as these CPUs are sold in their hundreds of thousands for server use. Apple doesn't make server hardware, so it would be like financing Threadripper development out of their own pocket, for however many Mac Pro units they sell. They'd have to be convinced that technological supremacy at the high end casts a worthwhile halo over the whole Mac range. Otherwise, they'd be better off doing something more profitable, like releasing Space Grey AirPods.

Hmmmm, this is a very good point. But when has anything they ever done been unintentional?

Quite. Where's the Extreme?

Surely they expected the blowback, and surely they expected to regain trust with the announcement of their ACTUAL 2019 MP replacement at say, the 2025 WWDC?

The 2019 was supposed to be the chosen one, to bring trust to the pro Mac market. Now they need to regain the trust they only just regained? It's too much drama. Easier to just use Windows, and only consider a switch back once Apple have demonstrated several generations of sustained interest in tower computing.
 
That's fair. I get that. I'm just daydreaming about the different configs that they could possibly be experimenting with over at the lab. I know they've probably run several dozen different prototypes and likely have something jaw-droppingly incredible over there...To be a fly on that wall...
Yeah but Apple's definition of "jaw drop incredible" is often mediocre and completely not what the people who want that sort of thing want.

Exhibit A: Apple Vision Pro.
 
Maybe the solution is that all Macs end up being low powered and anyone who needs power beyond what's needed to send email or browse the web can have a subscription "power by the hour" type arrangement. If they need extra power, they pay for it.

Apple could reduce the price of the computers slightly, but add mega-profits from the subscription model for extra processing power.
It's a bit ironic considering Apple's founding. The whole point of having an Apple II on your desk is that you could run and write your own software and use the full power of your computer without needing to pay for time-sharing on a remote mainframe. Cloud computing has its place and will possibly become more important in the future, but pushing such a thing for normal workstation (let alone consumer) use cases strikes me as a giant step backwards. I don't want to pay a subscription for things I can and should be able to run on my own computers. Also for latency sensitive use cases, such a thing would not even make sense. If Apple went this route, I would look to migrate my workflow to some variant of Linux.

Yeah but Apple's definition of "jaw drop incredible" is often mediocre and completely not what the people who want that sort of thing want.

Exhibit A: Apple Vision Pro.

The Vision Pro seems akin to OpenDoc in that there is clearly some neat and interesting technology there, but it's not clear what problem the tech is/was trying to solve.

Hmmm...

I personally believe they haven't been WAITING so much as they haven't cracked the nut on what it is they've been trying to do. I don't think the Mac Pro should have an M series chip at all...it should be its own thing, and I think that's what they've been working on
If I remember, there was a rumor that the next Ultra chip was going to be a monolithic design rather than 2x Mx Max chips glued together. I wonder if they will make use of chiplet/tiling/die-stacking tech to accomplish this and then use an UltraFusion interconnect to connect to another die mainly consisting of additional GPU cores. Like take the design of the M4 Max, remove the GPU cores, and replace them with additional CPU performance cores and display controllers, while adding ungodly PCIe bandwidth. Then the GPU would be a pre-configured tiling of 64, 128, 192, or 256 GPU cores. SoC RAM would go to 256 GB, but I would include a pool of regular RAM slots that could be expanded as high as 4TB and could function either as a hidden volume for swap files or as main memory if the user desires more capacity over the wider bandwidth and reduced latency of SoC RAM.
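
Just to make that two-tier RAM idea concrete, here's a toy allocation model of what I'm describing (entirely speculative - the sizes and the policy are made up, and this mirrors the rumour above, not any real Apple design):

```python
# Toy allocator for a speculated two-tier memory system: fast on-package
# RAM for bandwidth-sensitive buffers, slotted DDR for sheer capacity.
# Entirely hypothetical; sizes and policy are invented for illustration.

from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    capacity_gb: float
    used_gb: float = 0.0

    def try_alloc(self, size_gb: float) -> bool:
        """Reserve size_gb in this tier if it fits; report success."""
        if self.used_gb + size_gb <= self.capacity_gb:
            self.used_gb += size_gb
            return True
        return False

soc_ram = Tier("on-package (high bandwidth)", 256)
slot_ram = Tier("DDR slots (high capacity)", 4096)

def allocate(size_gb: float, bandwidth_sensitive: bool) -> str:
    """Prefer on-package RAM for hot data, spill to slotted RAM otherwise."""
    order = (soc_ram, slot_ram) if bandwidth_sensitive else (slot_ram, soc_ram)
    for tier in order:
        if tier.try_alloc(size_gb):
            return tier.name
    raise MemoryError("both tiers exhausted")

print(allocate(192, bandwidth_sensitive=True))    # lands on-package
print(allocate(1024, bandwidth_sensitive=False))  # lands in the DDR slots
```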
 