
diamond.g

macrumors G4
Mar 20, 2007
11,438
2,664
OBX
I had a GeForce 3 that originally went for $500 but I think I ended up paying $150 for it. Yeah, I don't really comprehend graphics card prices anymore.
I understand what happened... The Titan RTX was 2500 bucks, then crypto came (again) and nvidia dropped the Titan brand and cut the price by a grand (USD). "Gamers" mostly ignored the Titan-brand cards before nvidia did this switcheroo. The 40 series is even worse pricing-wise, because instead of moving performance down the pricing stack they just changed the names of the parts and kept the prices the same. Ignoring the 4090 (because it really does have the performance to back up the dumb pricing), the market is more or less reacting to the shift as it should (the other cards are readily available and scalpers are getting hosed).
 

Confused-User

macrumors 6502a
Oct 14, 2014
852
987
Let's be clear here: I didn't say that you could build a system with the *currently shipping Mx chips* that would support PCIe GPUs. I'm saying that it's not a significant challenge for Apple to build new M2-derived (or M3) chips that would support that.

Since I made a few guesses today already, here's one more. Again, low confidence - I would not be surprised to be wrong - but I'm guessing the AS Mac Pro will be announced before or at WWDC, and it will have an M3 on "3nm" (ie, TSMC N3B).

Why? Pretty simple really. *Something* is being fabbed at TSMC on N3B right now. I don't think it's the A17 chip for the iPhone 15 yet, though I could be wrong. It's possible it's an Ultra-class chip for the Studio, but I doubt that too, as N3 is NOT a die shrink, and so any such chip would be a substantially new chip, not just a shrunken M2. And it's almost certainly *something* belonging to Apple - nobody else appears to be buying those wafers.

So my guess is that we will see an M3 (though no idea what they'll actually call it) with the CPU, GPU, and NPU cores that would have been in the A16, had the A16 been able to be manufactured on N3. (And the A17 will have slightly improved versions of that core, or perhaps even the same exact core.) More interestingly, we'll finally see what Apple has decided to do in the uncore to scale up to a larger number of cores. Maybe most interesting of all, we'll see if they've figured out how to fix their scaling issues with the GPU running across the InFO bridge.

I suppose we might even see an implementation of TB5, though I'd bet against that.
 

Yebubbleman

macrumors 603
Original poster
May 20, 2010
6,024
2,616
Los Angeles, CA
I don't understand this statement. An eGPU is just a PCI device. Why would GPU devices be unsupported at a fundamental level? If you can write a PCI driver, you can write a GPU driver, no?
Apple doesn't support third party GPUs in Apple Silicon versions of macOS. Period. You could suppose that they don't support eGPUs on Apple Silicon Macs for this reason. And/or you could suppose that there is an additional reason for this. Either way, whether you write a driver or not, you are not getting a third party GPU to function in macOS running on an Apple Silicon Mac.
 

jmho

macrumors 6502a
Jun 11, 2021
502
996
Since I made a few guesses today already, here's one more. Again, low confidence - I would not be surprised to be wrong - but I'm guessing the AS Mac Pro will be announced before or at WWDC, and it will have an M3 on "3nm" (ie, TSMC N3B).
I agree. I think Apple almost certainly wants to flip their release schedule in the same way that most other manufacturers do, where the first chips released on a new architecture are the largest chips. nVidia would have a much harder time selling their 4080 and 4090 cards if they were released 6+ months after the 4060 / 4070s.

I'd be willing to put money on us never getting 3rd party GPUs, but I agree that it would be possible if the will was there.
 

Confused-User

macrumors 6502a
Oct 14, 2014
852
987
I agree. I think Apple almost certainly wants to flip their release schedule in the same way that most other manufacturers do, where the first chips released on a new architecture are the largest chips. nVidia would have a much harder time selling their 4080 and 4090 cards if they were released 6+ months after the 4060 / 4070s.
That's not actually true in general - for example, Intel Xeons have cores that lag their consumer chips by anywhere from six months to more than a year, historically (once even 2 years, IIRC). They also have released laptop chips before desktops of the same core generation a number of times recently. Overall it's a mixed bag.

Edit to add: I think Apple's *marketing* team would love to flip their release schedule. I think that might, long-term, be a difficult challenge on the technical side. It means they can't stagger development of new cores with new larger uncore and other extra requirements of larger SoCs. If it does happen the way you and I think it will this year, that's most likely because they're simply riding on development work that was previously done for an expected N3 chip shipping last year, which never happened.

It's not impossible that it will play out as you say in the future, but I think it's unlikely.
 
Last edited:

Confused-User

macrumors 6502a
Oct 14, 2014
852
987
Apple doesn't support third party GPUs in Apple Silicon versions of macOS. Period. You could suppose that they don't support eGPUs on Apple Silicon Macs for this reason. And/or you could suppose that there is an additional reason for this. Either way, whether you write a driver or not, you are not getting a third party GPU to function in macOS running on an Apple Silicon Mac.
None of this matters the least little bit. Everyone's always talking about how Apple owns the whole stack, software and hardware. Have you forgotten?

If they want to support off-SoC GPUs (which is not, BTW, the same as "eGPU"- you're using that term incorrectly) then they will. That's among the smallest challenges they'd face building a Mac Pro.
 
  • Like
Reactions: jdb8167

jmho

macrumors 6502a
Jun 11, 2021
502
996
That's not actually true in general - for example, Intel Xeons have cores that lag their consumer chips by anywhere from six months to more than a year, historically (once even 2 years, IIRC). They also have released laptop chips before desktops of the same core generation a number of times recently. Overall it's a mixed bag.
Yes, obviously it's a trade-off where if the chip isn't ready then it isn't ready, but from a purely marketing perspective it's much better if your most expensive hardware isn't a generation behind your cheaper SKUs, especially when selling to consumers.

Apple's current situation works against itself: if you were in the market for an M1 Max Studio, you might end up being "downsold" to an M2 Pro Mac mini.
 

Yebubbleman

macrumors 603
Original poster
May 20, 2010
6,024
2,616
Los Angeles, CA
None of this matters the least little bit. Everyone's always talking about how Apple owns the whole stack, software and hardware. Have you forgotten?

Man, am I the only one watching Apple's WWDC videos? x86 macOS supports AMD GPUs (whether external via Thunderbolt or internal via PCIe). Apple Silicon macOS does not. As in, you can write an ARM64 driver for your AMD GPU, attempt to install said driver, and the video card will not function. Call that "Apple owning the whole stack" or whatever. I don't really care. I'm saying that they designed THAT OS and THAT hardware architecture so as to not support third-party GPUs. It would stand to reason that if that weren't the case, we'd already have people getting AMD GPUs to work with Apple Silicon Macs. However, unless I missed something, that hasn't happened.
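For what it's worth, you can see what macOS is willing to expose from plain user space using nothing beyond the documented Metal API. This is just an illustrative sketch, not anything Apple-specific: it lists every GPU the OS offers to applications, and on today's Apple Silicon Macs it only ever reports the on-SoC GPU, never an external or PCIe card.

```swift
import Metal

// Enumerate every GPU macOS exposes to applications.
// On current Apple Silicon Macs this prints only the built-in SoC GPU;
// on Intel Macs with eGPU support, external/removable devices show up too.
for device in MTLCopyAllDevices() {
    let location: String
    switch device.location {
    case .builtIn:  location = "built-in (on the SoC / logic board)"
    case .slot:     location = "internal PCIe slot"
    case .external: location = "external (eGPU)"
    default:        location = "unspecified"
    }
    print("GPU: \(device.name)")
    print("  location:       \(location)")
    print("  removable:      \(device.isRemovable)")
    print("  unified memory: \(device.hasUnifiedMemory)")
}
```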

If they want to support off-SoC GPUs (which is not, BTW, the same as "eGPU"- you're using that term incorrectly) then they will. That's among the smallest challenges they'd face building a Mac Pro.
And they've stated repeatedly that they do not want to support off-SoC GPUs. Furthermore, the person I was replying to was making a distinction between eGPUs and PCIe GPUs. You needn't insult my intelligence by assuming I don't know the difference between the two. Incidentally, the distinction doesn't really matter in this context, because third-party GPUs are not happening on an Apple Silicon Mac anytime soon, if ever.
 

Curry119

macrumors member
Jan 11, 2023
50
45
I'm not the target demographic for a Mac Pro computer but it seems like those that are will need to decide if the added premium of non-upgradable components is a dealbreaker for them. I'm guessing it's not for many or Apple wouldn't bother releasing the product. I will say that I personally would not bother with any of the other Apple desktops at this point because none of the components in them are replaceable. I think Apple could have easily embraced AMD chips for their Macs and gotten similar performance as the M but their desire to control and monetize everything proved too great as usual. I find their direction with Apple silicon to be a net negative for consumers like myself that know how to upgrade a computer and usually get additional years of longevity as a result.
 
  • Like
Reactions: Yebubbleman

leman

macrumors Core
Oct 14, 2008
19,521
19,675
Apple doesn't support third party GPUs in Apple Silicon versions of macOS. Period. You could suppose that they don't support eGPUs on Apple Silicon Macs for this reason. And/or you could suppose that there is an additional reason for this. Either way, whether you write a driver or not, you are not getting a third party GPU to function in macOS running on an Apple Silicon Mac.

I think it makes sense to distinguish between the policy and the technical underpinning. The way I understand it you talk about policy: Apple does not want to allow third-party GPUs in their new Macs (and I agree with this interpretation). But it's not like Apple's hardware is fundamentally incompatible with third-party GPUs. Sure, right now third-party GPUs won't work due to the limitations on how device memory mapping works (which is most likely a security feature, not a way to sabotage GPUs), not to mention that macOS does not contain any provisions for registering a custom GPU driver. But if Apple were to change their policy about third-party GPUs (which they probably won't), fixing these things shouldn't be too difficult.

In other words, this is about policy, not about technology.
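To make that concrete: a PCIe device attached over Thunderbolt still enumerates in the I/O Registry on an Apple Silicon Mac; what never happens is a GPU driver family binding to it. Here's a rough, purely illustrative sketch using the public IOKit C API from Swift. It matches on the generic IOPCIDevice class and assumes nothing GPU-specific.

```swift
import Foundation
import IOKit

// List every PCI device macOS has enumerated in the I/O Registry.
// On an Apple Silicon Mac, a card in a Thunderbolt PCIe enclosure still
// shows up here; it simply never gets a GPU driver bound to it.
var iterator: io_iterator_t = 0
let matching = IOServiceMatching("IOPCIDevice")
guard IOServiceGetMatchingServices(kIOMainPortDefault, matching, &iterator) == KERN_SUCCESS else {
    fatalError("Could not query the I/O Registry")
}

var device = IOIteratorNext(iterator)
while device != 0 {
    var name = [CChar](repeating: 0, count: 128)   // io_name_t is 128 bytes
    if IORegistryEntryGetName(device, &name) == KERN_SUCCESS {
        print("PCI device: \(String(cString: name))")
    }
    IOObjectRelease(device)
    device = IOIteratorNext(iterator)
}
IOObjectRelease(iterator)
```

Whether anything above that enumeration layer ever gets built is exactly the policy question.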


I think Apple could have easily embraced AMD chips for their Macs and gotten similar performance as the M but their desire to control and monetize everything proved too great as usual. I find their direction with Apple silicon to be a net negative for consumers like myself that know how to upgrade a computer and usually get additional years of longevity as a result.

On laptops? No way. AMD has improved a lot in recent years and has overtaken Intel on the efficiency front, but they are still nowhere near Apple, especially when you consider graphics. On desktops, sure, although not in the same form factor. At any rate, it is not in Apple's interest to support two different hardware platforms simultaneously. If they are unable to build a high-end desktop using their own chips, it would make more sense for them to drop the high-end desktop market altogether and focus exclusively on the mobile line.
 

PauloSera

Suspended
Oct 12, 2022
908
1,393
The only things that seem safe bets to change are that RAM and graphics won't be upgradeable without replacing the SoC. But, if Apple sockets/slots that SoC and allows for such aftermarket upgrades, that'll be a trade-off considered to be acceptable by ENOUGH of that leftover market.
Yeah that's not happening. The SoC is the whole machine. You might as well swap/trade the whole machine.

Do those require more than an Ultra version Mac Studio with 128GB of RAM? I'm not an audio guy, so I can't say. But I'd imagine that there are use cases wherein even that kind of Mac Studio would fall short. Not many, but enough.

Yeah, and those few will be serviced by the Apple Silicon Mac Pro that will be basically a 2x Mac Studio.

I do think that between 2010 and 2019, Apple did lose enough high end customers such that Windows in the high-end isn't as outlandish of an idea as it might've been in 2010.

The highest end customers, aka the 2% that Mac Studio and Mac Pro AS won't address, have been floating between Mac and Windows for 15 years depending on what hardware is out that week. They don't care. They have no allegiance. Their only allegiance is to processing time.

Again, you have to remember that the Mac Pro once covered a wide range of users, so it made sense for Apple to put the resources into designing this machine (including the high-end aspects of it). There were plenty of desktop Mac users who were happy to buy a $2500 Mac desktop and maybe one day upgrade some RAM, but that's it. They never touched storage, never used PCI slots. Apple's new systems cover most of those users better than ever before, and it's only this tiny group of people at the top that is not serviced. If Apple sees any money to be made by doing something different for these people, they might, but I sincerely doubt it. Their focus and direction with Apple Silicon is clear, and I don't see them breaking from that to make a different kind of Mac for the 2% who already float between Mac and Windows.
 
  • Like
Reactions: splifingate

vorkosigan1

macrumors member
Jan 23, 2017
71
68
I don't know who needs to hear this. Based on the hubbub on multiple Mac news sites, I'm guessing many. Here are some truths based on knowns and unknowns regarding whatever machine Apple is going to replace the Mac Pro (2019) aka MacPro7,1 with:


1. The RAM will not be separately user-upgradeable; it will be tied to the SoC.

2. There will be no PCIe GPU nor eGPU upgradability or expansion; the only GPU will be the one on the SoC.

Before anyone challenges me with that, make sure you have watched this video from WWDC 2020 (where the Intel to Apple Silicon transition was first announced): https://developer.apple.com/videos/play/wwdc2020/10686/ (at about the 1-minute mark)

3. This doesn't mean that there's no point to a Mac tower with PCIe expansion; there are plenty of professionals that need broadcast cards or special video tuners or audio interface boards in their Mac Pro; these things are not the kinds of things you can solve with Thunderbolt 4 or a Thunderbolt 3/4 breakout box. It's just not practical.

4. There's nothing in the referenced video above that negates the notion that Apple could socket the SoC and/or make it user-upgradeable/replaceable. Those of you that have used or operated a 2009-2012 Mac Pro (aka MacPro4,1 or MacPro5,1) have seen a similar concept in the form of the processor tray and backplane. There's nothing stopping Apple from doing something similar here. That's not to say that such SoC upgrades would be cheap; they would probably cost an arm and a leg (assuming Apple goes this route). But it would still be possible to upgrade RAM and graphics this way.

5. The internal SSD on a 2019 Mac Pro is already proprietary and requires a DFU restore of the T2 chip in order to replace the storage; modules become useless when removed from the Mac Pro they came from. This won't be different on an Apple Silicon Mac Pro replacement either. Furthermore, if the SoC is to be user-replaceable due to being socketed or on a processor tray, the internal storage will need to be wiped when performing an SoC replacement/upgrade. This is how Apple Silicon and T2 Mac storage works. This has no bearing on SATA or PCIe SSDs; just storage controlled by the SoC.

6. The base model SoC offered for this new Mac Pro will most likely run rings around the least expensive Mac Pro (2019) MPX AMD video card option. This is a safe bet. Less safe of a bet, but still perfectly plausible, is that it also runs rings around the MOST expensive Mac Pro (2019) MPX AMD video card option. This won't fully soften the blow of having the GPU tied to the SoC and not upgradeable separately from it, but it will soften it for a decent number of Mac Pro customers.

7. Apple likely won't introduce a dual-socket Apple Silicon Mac, let alone Mac Pro. This isn't a guarantee, but everything they said about using two discrete SoCs when first unveiling the M1 Ultra suggests they'd rather take two SoCs and bridge them internally into one mega-SoC than go the dual-socket route. They could introduce a totally different technology that makes this feasible for the Mac Pro, but this seems unlikely.

8. "M2 Extreme" may have been cancelled, but it is extremely unlikely that an M2 Ultra, born out of two M2 Max SoCs with UltraFusion, will be the only SoC going into the next Mac Pro. You can customize a Mac Pro (2019) with 1.5TB of RAM. I'm sure that very few Mac Pro customers do this, but I'm also sure that there are some that do. Apple may not replace the current Mac Pro with a Mac Pro that goes all the way to 1.5TB of RAM, but it's safe to assume that they'd at least try to get halfway there. At best, an M2 Ultra born out of two of the highest-end M2 Max SoCs would only yield 192GB of RAM. That is still a ton of RAM, but a far cry from even half of the current Mac Pro's maximum. Let's assume that an M3 Max is able to offer 128GB of RAM (by virtue of M3 being able to go to 32GB of RAM from M2's maximum of 24GB, up from M1's 16GB). That still only gives an M3 Ultra a maximum of 256GB (the arithmetic is spelled out in the sketch after this list). So Apple is going to continue the Intel Mac Pro's tradition of offering an entirely different class of SoC unique to the Mac Pro. That's not to say that a "Max" or "Ultra" SoC won't still be on offer; that's totally possible too. There are probably many folks who would be fine with a "Max" chip's performance but need PCIe slots for specialized cards, and you'd probably also have folks who would need to go to an Ultra before eventually building a Mac Pro with that next-level tier.

9. No, Apple hasn't forgotten about the Pros. In 2019, they released two products that all but outright admitted that they messed up. One was the current Mac Pro. The other was the first and last Intel 16-inch MacBook Pro (the first Mac since the butterfly keyboard to not have a butterfly keyboard and to be thicker than its predecessor for the sake of better performance). They did these moves for Pros. We're not getting another trash can. The "Ultra" configuration of Mac Studio is not going to be the best high-end desktop Mac that Apple is going to offer. You won't see regular upgrades to the Mac Pro. And, per that video linked above (which is to say "per how Apple Silicon is fundamentally designed as a Macintosh hardware platform"), you will not have the level of easy aftermarket upgradeability you had with the 2019 (let alone 2009-2012) Mac Pro. But it ought to still be a decent upgrade and not a trash can upgrade.
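(The RAM-ceiling arithmetic referenced in point 8, spelled out as a quick sketch. The M3 figures are the same hypothetical assumption made in the post, not announced specs; the only real rule assumed is that an "Ultra" is two "Max" dies, so its ceiling is simply double.)

```swift
// Back-of-the-envelope RAM ceilings from point 8 above.
// Assumption: an "Ultra" is exactly two "Max" dies, so its memory
// ceiling is 2x the Max ceiling. The M3 figures are hypothetical.
let m2MaxCeilingGB = 96                      // shipping M2 Max maximum
let m2UltraCeilingGB = 2 * m2MaxCeilingGB    // 192 GB
let m3MaxCeilingGB = 128                     // hypothetical, per the post
let m3UltraCeilingGB = 2 * m3MaxCeilingGB    // 256 GB
let macPro2019CeilingGB = 1536               // the Intel Mac Pro's 1.5 TB

print("M2 Ultra: \(m2UltraCeilingGB) GB vs \(macPro2019CeilingGB) GB")  // ~12.5% of the 2019 ceiling
print("M3 Ultra: \(m3UltraCeilingGB) GB vs \(macPro2019CeilingGB) GB")  // ~16.7% of the 2019 ceiling
```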
You can't base "truths" on "unknowns". Please.
 

Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
whether you write a driver or not [...]

It IS possible to get third-party GPUs running on Apple Silicon if you write drivers for it. The only problem is that it's too hard. The guys who wrote the drivers for Asahi Linux said that themselves.
 

darngooddesign

macrumors P6
Jul 4, 2007
18,362
10,114
Atlanta, GA
I don't know who needs to hear this. Based on the hubbub on multiple Mac news sites, I'm guessing many. Here are some truths based on knowns and unknowns regarding whatever machine Apple is going to replace the Mac Pro (2019) aka MacPro7,1 with:


1. The RAM will not be separately user-upgradeable; it will be tied to the SoC.

2. There will be no PCIe GPU nor eGPU upgradability or expansion; the only GPU will be the one on the SoC.

Before anyone challenges me with that, make sure you have watched this video from WWDC 2020 (where the Intel to Apple Silicon transition was first announced): https://developer.apple.com/videos/play/wwdc2020/10686/ (at about the 1-minute mark)
Remember when the SSD was always on the SoC, until it wasn't in the Mac Studio? The video only talked about single SoCs, but here we are with the dual-M1 Max Ultra in the Studio.
 
Last edited:

Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
Wasn't part of the problem that Apple doesn't support some types of device memory mapping that a GPU driver would need?

No, not as far as I remember. You can code the memory mapping by hand. But that's hard even for a seasoned developer.

If APPLE does it, however... well, they have access to the full specs.
 
  • Like
Reactions: AdamBuker

leman

macrumors Core
Oct 14, 2008
19,521
19,675
Fair enough, but it went from being hardwired to the logic board to being a separate module on a connector, with no reduction in performance. I think Apple is going to design a high-bandwidth connector to allow for things like GPUs.

Possible, but I wouldn’t bet my money on it. The SSD is not a high-bandwidth component by a wide margin, so it’s a fairly simple exercise to make it slotted (not to mention that Apple fixed that issue a long while ago). What I’m trying to say is that it’s premature to draw these parallels, as the problem is very different.
 

darngooddesign

macrumors P6
Jul 4, 2007
18,362
10,114
Atlanta, GA
Possible, but I wouldn’t bet my money on it. The SSD is not a high-bandwidth component by a wide margin, so it’s a fairly simple exercise to make it slotted (not to mention that Apple fixed that issue a long while ago). What I’m trying to say is that it’s premature to draw these parallels, as the problem is very different.
Very true. I just think that, contrary to the OP’s three-year-old video, which didn’t even mention fusing two M1 Maxes together, Apple is capable of supporting internal expandability in the way that Mac Pro users want.
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
If you think about it, Apple could possibly offer upgradeable RAM if they offer the SoC as a user-upgradeable part. 😆

Wonder how much that would cost? 😜
 

jdb8167

macrumors 601
Nov 17, 2008
4,859
4,599
Possible, but I wouldn’t bet my money on it. The SSD is not a high-bandwidth component by a wide margin, so it’s a fairly simple exercise to make it slotted (not to mention that Apple fixed that issue a long while ago). What I’m trying to say is that it’s premature to draw these parallels, as the problem is very different.
Much more likely that they designed the integrated SSD controller and flash memory for low power. Performance is not a consideration.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Much more likely that they designed the integrated SSD controller and flash memory for low power. Performance is not a consideration.

There are likely a few factors driving Apple toward the SSD-module design being used on higher-end desktop Macs.


1. Costs. The more expensive the board is to replace, the more a modular solution saves Apple in repair parts inventory costs and on 'damage' to AppleCare revenue (hits on insurance payouts). It is Apple saving themselves money as much as anyone else.

iMac Pro -- a relatively expensive GPU and VRAM soldered to the board. One NAND chip goes bad and you have to replace the whole board, versus just two new NAND SSD modules. (In 'SSD module' the 'SSD' is an adjective, not a noun. Those are not SSDs; they are modules that belong to a single SSD. The NAND-chip daughtercard is being replaced; the SSD is not.)

Mac Pro 2019 -- similar. A NAND chip package dies and the expensive server-grade, dual-input PLX PCIe switch and very expensive double-sided, super-sized motherboard are gone, versus just two new NAND SSD modules.

iMac 2020 -- not quite as expensive a board. Soldered from 128GB to 2TB, with one NAND module used to go from 4TB to 8TB. Statistically, using substantially more NAND chips on the board shifts the risk profile so that some failures hit a module (and not the whole motherboard).

Mac Studio Ultra -- see the iMac Pro case above. Effectively the same situation.


Back in the 2017 era there was also a cost factor, where Apple could get to higher SSD capacities just by using more NAND chips (and it was simply easier to spread them across two modules).

2. Decommissioning / data retention policy. Some places require that a disk be pulled before any system that held sensitive data is retired from service. Maybe the system will get used by someone else, but it certainly won't be using those same drive(s).

If every single system in Apple's Mac lineup had soldered-on drives, some folks would stop buying.


3. Wear and likely user workloads. The 'one and only one internal drive' mindset that Apple fosters tends to drive some folks to hoard every last drop of data onto one drive. It is a bit goofy to mix highly mutating intermediate files with data that has an essentially zero change rate on the same SSD. You don't really need a 1 DWPD drive to store a picture archive that is likely never going to change. They are just two different workloads.

Over the short and intermediate term they can get along, but over the very long term it can bring higher failure rates.





Low power really isn't an issue either way. In either case the SSD controller and the Secure Element (encryption) are sharing the same RAM over a shorter distance; that's less power right there. The NAND chips use the same amount of power whether they're on a daughtercard or soldered to the board, and the NAND chips are always off the package (and die) either way. The daughtercard solution is incrementally further away, but that is down in the 'diminishing returns' of power savings. (If you could find NAND chips that consumed less power, that would have a bigger impact than removing the daughtercard socket transition 'losses'.)


The bigger issue with most Apple designs is the 'thinness politburo' mandate for minimum z-height. Soldered to the main board is shorter, even if less volume-efficient. The iMac, Studio, and Mac Pro don't have quite as rigid a 'thin' mandate.
 
  • Like
Reactions: Confused-User

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Remember when the SSD was always on the SoC, until it wasn't in the Mac Studio? The video only talked about single SoCs, but here we are with the dual-M1 Max Ultra in the Studio.

There is nothing particularly 'new' in the Mac Studio's SSD implementation at all. The SSD controller is still in the SoC across all the M-series implementations, just like it was always in the T2 previously. The replaceable parts inside the Mac Studio are not SSDs; they are SSD modules. Those are effectively just "NAND chip daughtercards" carrying a subcomponent of an SSD. The 'brains' of the SSD are in the main chip, just like in the other implementations.
(In 'SSD module' the 'SSD' is an adjective, not a noun. The thing being plugged in is a module, not an SSD.)

The basic principles of this dual-module system were initially rolled out in the iMac Pro in 2017, over four years before the Studio showed up. The approach was "old news" by 2021. The Mac Pro 2019 had it. The 27" iMac 2020 sometimes had 'half' of it (for capacities over 2TB). The Mac Studio was the fourth system to trot this out.

The second SSD controller, second Secure Element, and a few other subcomponents are just 'dead weight' in a dual-die M1 configuration like the Ultra. There is no there, there.

As motherboards carrying a relatively large set of expensive stuff get increasingly more expensive, it makes sense to modularize the NAND chips, since they do wear out with use. The whole reason you need a 'smart controller' with the brains to juggle where data gets placed on the NAND chips is an inherent liability of NAND-based SSDs.
 
  • Like
Reactions: sam_dean

Confused-User

macrumors 6502a
Oct 14, 2014
852
987
Yeah that's not happening. The SoC is the whole machine. You might as well swap/trade the whole machine.
I agree that that's an unlikely choice for Apple to make, but it's not crazy. Numerous other machines have been designed that way in the past, including at least one by Apple (the 2006-2010 Mac Pro, with the CPU/RAM daughterboards).
[...] Their focus and direction with Apple Silicon is clear, and I don't see them breaking from that to make a different kind of Mac for the 2% who already float between Mac and Windows.
Maybe. It's a good argument, but there's a good argument in the other direction: Halo products affect mindset and steer sales.

The only way we're going to know how Apple decides is to see what they introduce this year.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Fair enough, but it went from being hardwired to the logic board to being a separate module on a connector, with no reduction in performance. I think Apple is going to design a high-bandwidth connector to allow for things like GPUs.

You mean something like two x16 PCIe v4 lane clusters? Probably. But that isn't some 'moon shot' project, especially for something that is supposed to be a Mac Pro.

GPUs can be entirely put aside. If you just want multiple high-performance internal drives, it is pretty much a requirement. The whole "one, only one" internal drive limitation is extremely dubious, if not ludicrous, in a moderate-to-heavy workstation context, even more so than not trying to get into a 'pissing' match with Nvidia's top-of-the-line GPU.
You need to load data into RAM to do something with it, and there is only so much data you can put onto a single SSD.

Leaning 'too hard' on Thunderbolt is an openly admitted mistake by Apple. The Studio paints itself into a corner with that limitation as users move to higher-end workload mixes.
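For a rough sense of scale (idealized numbers that ignore protocol overhead beyond line encoding, and not a claim about any specific Apple design): PCIe 4.0 runs 16 GT/s per lane with 128b/130b encoding, while Thunderbolt 4 tunnels roughly 32 Gb/s of PCIe traffic over its 40 Gb/s link.

```swift
import Foundation

// Idealized bandwidth comparison: an internal PCIe 4.0 x16 connector
// versus the PCIe tunnel inside a Thunderbolt 4 port.
let perLaneGBs = 16.0 * (128.0 / 130.0) / 8.0   // PCIe 4.0: ~1.97 GB/s per lane, per direction
let x16GBs     = perLaneGBs * 16.0              // ~31.5 GB/s per direction
let tb4GBs     = 32.0 / 8.0                     // TB4 PCIe tunnel: ~32 Gb/s -> ~4 GB/s

print(String(format: "PCIe 4.0 x16: ~%.1f GB/s per direction", x16GBs))
print(String(format: "Thunderbolt 4 PCIe tunnel: ~%.1f GB/s", tb4GBs))
print(String(format: "Roughly %.0fx difference", x16GBs / tb4GBs))
```

Which is why "just use a Thunderbolt breakout box" only goes so far for the storage- and capture-card-heavy workloads mentioned earlier in the thread.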

I think Apple is going to be somewhat reluctant to put that on the main die, though. A major aspect of the M-series is 'going to war' against discrete GPUs (dGPUs); Apple is out to remove them from as many Mac systems as possible. The Mac Studio is well inside the scope they are targeting, so that largely just leaves one 'fringe' Mac system (plus whatever they trickle down at slower throughput, over the substantially smaller TBv4 bandwidth of an external PCIe expansion box/system).

Apple's goal of shrinking dGPU usage as much as possible doesn't help make PCIe add-in GPUs a stable target for the Mac ecosystem. With every M-series iteration Apple will be focused on making it a smaller and smaller market, which is quite likely going to make it less and less interesting for others to jump into, especially at 'race to the bottom' pricing. The pricing won't be anything like the mainstream Windows card ecosystem at all.

And that will be the problem, because for many of the folks "up in arms" and grumbling about dGPUs on M-series there is an implicit price anchoring to mainstream Windows card market pricing. A significant subset of folks want commodity more than modular; modular is just a highly desired mechanism to drive down prices, but it is the prices that are the real priority.

Doubtful that Apple is going to put a high priority on the most affordable dGPUs if they bring them at all.
 