
Colstan

macrumors 6502
Jul 30, 2020
330
711
I've seen this question asked multiple times, so I'll just respond with the same answer I've given previously.

I was curious about the question of expandability, specifically in regard to the Apple Silicon Mac Pro, so I decided to ask someone who actually understands CPU architecture and design: former Opteron architect Cliff Maier, who knows the engineers at Apple from his time at AMD and Exponential and talks with his old colleagues regularly. When I asked him about the Apple Silicon Mac Pro, he replied with the following:

It’s possible that Apple allows slotted RAM and puts its own GPU on a separate die, sure. But if it does that it will still be a shared memory architecture. I would say there’s a 1 percent chance of slotted RAM. An independent GPU is more likely; the technical issues with that are not very big, but the economics don’t make much sense given Apple’s strategy of leveraging its silicon across all products. Still, I’d give that a 33 percent chance. And it wouldn’t be a plug-in card or anything - just a separate GPU die in the package using something like fusion interconnect. Maybe for iMac Pro, Mac Studio and Mac Pro.

So, he believes a 1% chance of DIMMs, and 33% chance of discrete GPU but still on-package, just not integrated in the SoC.

Replying to my followup question about the GPU being third-party or Apple designed, his response was:

Yeah, definitely their own design. I’m quite convinced they like their architecture, and that they have been working on ray tracing. Given how parallelizable GPU stuff is, it’s quite possible that they simply put together a die that is just made up of a ton of the same GPU cores they have on their SoCs. You could imagine that, for modular high end machines, instead of partitioning die like: [CPU cores+GPU cores][CPU cores+GPU cores]… it may make more economic sense to do [CPU cores][CPU cores]…[GPU cores][GPU cores]…. (Or, even, [CPU cores+GPU cores][CPU cores+GPU cores]…[GPU cores]…)

As far as the economics are concerned:

It may also make more engineering sense, in terms of latencies, power supply, and cooling, too. Of course, Apple wouldn’t do that if it was only for Mac Pro (probably) because the economies of scale wouldn’t work (plus, now, supply chains are fragile). They might do it if it made sense to use this type of partitioning for iMacs, iMac Pros, Studios, Mac Pros, and maybe high end MacBook Pros, while using the current partitioning for iPads, iPhone Pros (maybe), Mac Minis, MacBook Pros, MacBooks, and maybe low end iMacs.

Not saying they will, but at least I give it a chance. More of a chance than RAM slots or third-party GPUs.

So, according to this accomplished CPU architect, if Apple does include a GPU alongside the SoC, it's going to be their own design, not AMD or Nvidia, won't be available with add-on boards, and Apple will only implement it if they can leverage it in multiple products.

If you want further clarification, feel free to ask him yourself; he's quite chatty and answers all questions. If the words of a veteran CPU architect, the engineer who wrote the draft x86-64 specification, aren't sufficient, then I don't know what else to say.

From my limited viewpoint, I tend to agree with Maier and @leman. We won't be seeing third-party GPUs for Apple Silicon. If there is a discrete GPU, then it will come directly from Apple, and be different from the model we are familiar with in x86 designs. We've become so accustomed to the way things are done in x86 land that it is difficult to fathom a different approach, but Apple is a company that is willing to forge its own path, if they see benefit in doing so.
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
One of the main advantages of Apple GPUs over Nvidia GPUs is more RAM available to the GPU. A dedicated GPU would kill this advantage.
Who is going to buy a dedicated GPU over the Mx Ultra if it has less RAM?
 

l0stl0rd

macrumors 6502
Jul 25, 2009
483
416
One of the main advantages of Apple GPUs over Nvidia GPUs is more RAM available to the GPU. A dedicated GPU would kill this advantage.
Who is going to buy a dedicated GPU over the Mx Ultra if it has less RAM?
That is only an issue if you need more than 24 GB, and the new RTX 6000 will be available with 48 GB 🤷‍♂️
The new RTX 6000 Ada Lovelace will probably be around $5k, however.
 
  • Like
Reactions: Xiao_Xi

Pressure

macrumors 603
May 30, 2006
5,179
1,544
Denmark
Instead of joining two Mx SoCs, Apple could join a CPU and GPU as AMD and Nvidia will do for their data center APUs.
Apple clearly has the interconnect, so I can totally see them having the ability to extend the UltraFusion "glue" to add GPU-centric dies, perhaps even with added memory channels.

The current 2.5 TB/s die-to-die bandwidth of UltraFusion isn't going to be a limiting factor at this point.
 
  • Like
Reactions: wyrdness

Xenobius

macrumors regular
Dec 10, 2019
190
474
So, he believes a 1% chance of DIMMs.
He missed one important thing: current Apple Silicon in fact already has a kind of 'DIMM' in the form of a dedicated part of the SSD (swap). macOS treats the SSD as part of system memory, e.g. you can render 3D scenes that do not fit in main memory, and it does not need any action from the developer.
In an Apple Silicon Mac Pro, Apple could use unified memory as the main tier, additional user-expandable memory in DIMM slots as a fast 'mass storage/swap' tier, and the SSD as the third memory level.
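
Just to make the tiering concrete, here is a toy sketch of that three-level idea. All capacities and relative latencies below are made-up illustrative numbers, not Apple specifications.

Code:
# Toy model of the three-tier memory idea: unified memory, a hypothetical
# DIMM-backed swap tier, and SSD swap. All numbers are illustrative assumptions.
TIERS = [
    # (name, capacity in GB, rough relative access latency)
    ("unified memory (on-package)", 192, 1),
    ("DIMM slots (hypothetical swap tier)", 512, 5),
    ("SSD swap", 2048, 1000),
]

def slowest_tier_touched(working_set_gb):
    """Return the slowest tier a working set of this size would spill into."""
    remaining = working_set_gb
    slowest = TIERS[0][0]
    for name, capacity_gb, _latency in TIERS:
        if remaining <= 0:
            break
        slowest = name
        remaining -= capacity_gb
    return slowest

for ws in (64, 300, 1500):
    print(f"{ws} GB working set reaches: {slowest_tier_touched(ws)}")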
 

neinjohn

macrumors regular
Nov 9, 2020
107
70
I am unable to find the source, but one of the main/chief designers of Apple's GPU effort described dedicated graphics as a big mistake that Nvidia continues to make. The same guy spent almost two decades at Nvidia.

You could assume he has something personal against Nvidia, but given his position at Apple and his stance on the topic, I think it is very unlikely Apple will go for a traditional dGPU in anything they build.
 
Last edited:

jav6454

macrumors Core
Nov 14, 2007
22,303
6,264
1 Geostationary Tower Plaza
Nvidia didn't do this. If anything, you should blame greedy AIBs like Evga instead.

In fact, Nvidia doesn't even want to sell to miners if they could choose who buys it. This is because miners inevitably flood the market with used GPUs when crypto goes down like it's now.

Nvidia selling directly to miners is just a myth created by gamers who hate how expensive GPUs have gotten.
I fail to see any evidence that Nvidia sold only to gamers. If anything, Newegg was the only one that did anything for gamers during this GPU crunch.
 

mi7chy

macrumors G4
Oct 24, 2014
10,621
11,294
What market is there for such a thing? Proprietary single platform, probably costs close to 2x a Mac Studio M1 Ultra, requires a $6K+ Mac Pro chassis to plug into, slower than a $900 3080 Ti, lacking in software support, etc. Now is the worst time to get into the GPU market. Look at what happened to Intel Arc.
 
Last edited:

Colstan

macrumors 6502
Jul 30, 2020
330
711
I am unable to find the source, but one of the main/chief designers of Apple's GPU effort described dedicated graphics as a big mistake that Nvidia continues to make. The same guy spent almost two decades at Nvidia.
You're thinking of this.

[attached image: gpu.jpg]


More of his thoughts in the thread I linked. Keep in mind that this guy is Apple's Director of GPU Architecture. He's currently roasting Nvidia's new space heaters on Twitter. Yet, despite what the guy who is literally in charge of Apple Silicon's GPU is saying, some still insist that Apple is going to use third-party GPUs.

It's like the people who claim that Boot Camp is coming back, even though the guy who is in charge of that says it's not coming back. The amount of wishcasting is remarkable, and then some people get angry when the genie doesn't appear. Apple literally tells us their plans, yet some folks don't like the answers, so they fill in the blanks because the answers don't fit the narrative that they personally desire. Hope springs eternal, I suppose.

For better or worse, Apple is going all-in on their own solutions. We can either adapt to that new reality or plug our ears to what is, in my opinion, plainly obvious from Apple's own statements by the executives in charge of these projects. I don't necessarily agree with all of Apple's decisions, but I'm realistic enough to realize where they are headed, and plan accordingly.
 

Boil

macrumors 68040
Oct 23, 2018
3,477
3,173
Stargate Command

Yup, not everyone can drop three-phase power in to run an Onyx supercomputer...!

I knew a guy who did just that (along with a three-ton AC system for a room that was just over 120 sq. ft.) to run his fridge-sized Silicon Graphics Onyx; he had a small indie VFX shop in the early 1990s...

More of his thoughts in the thread I linked. Keep in mind that this guy is Apple's Director of GPU Architecture. He's currently roasting Nvidia's new space heaters on Twitter. Yet, despite what the guy who is literally in charge of Apple Silicon's GPU is saying, some still insist that Apple is going to use third-party GPUs.

I am in the "Apple GPU/GPGPU" camp; not third-party, just more of the same ASi GPU cores on an add-in card...

The amount of wishcasting is remarkable, and then some people get angry when the genie doesn't appear. Apple literally tells us their plans, yet some folks don't like the answers, so they fill in the blanks because the answers don't fit the narrative that they personally desire. Hope springs eternal, I suppose.

I would think a good part of it is the lack of any type of road map for Apple hardware & the absolute dearth of solid rumors...

For better or worse, Apple is going all-in on their own solutions. We can either adapt to that new reality or plug our ears to what is, in my opinion, plainly obvious from Apple's own statements by the executives in charge of these projects. I don't necessarily agree with all of Apple's decisions, but I'm realistic enough to realize where they are headed, and plan accordingly.

I am truly excited to see where Apple might go in the high-end DCC field, especially once software (looking at you 3D apps) learns how to "play nice" with the ins and outs of the Apple ASi/Metal pipeline...
 

singhs.apps

macrumors 6502a
Oct 27, 2016
660
400
Yup, not everyone can drop three-phase power in to run an Onyx supercomputer...!

I knew a guy who did just that (along with a three-ton AC system for a room that was just over 120 sq. ft.) to run his fridge-sized Silicon Graphics Onyx; he had a small indie VFX shop in the early 1990s...
Man. SGIs.
 

senttoschool

macrumors 68030
Nov 2, 2017
2,626
5,482
I fail to see any evidence that Nvidia sold only to gamers.
They didn't just sell to gamers. Miners bought Nvidia GPUs too. It doesn't mean you can blame Nvidia for that. Miners just bought through retail like any gamer. In fact, there have been reports of AIBs selling directly to miners.
 
Last edited:

senttoschool

macrumors 68030
Nov 2, 2017
2,626
5,482
I've seen this question asked multiple times, so I'll just respond with the same answer I've given previously.

I was curious about the question of expandability, specifically in regard to the Apple Silicon Mac Pro, so I decided to ask someone who actually understands CPU architecture and design: former Opteron architect Cliff Maier, who knows the engineers at Apple from his time at AMD and Exponential and talks with his old colleagues regularly. When I asked him about the Apple Silicon Mac Pro, he replied with the following:



So, he believes a 1% chance of DIMMs, and 33% chance of discrete GPU but still on-package, just not integrated in the SoC.

Replying to my followup question about the GPU being third-party or Apple designed, his response was:



As far as the economics are concerned:



So, according to this accomplished CPU architect, if Apple does include a GPU alongside the SoC, it's going to be their own design, not AMD or Nvidia, won't be available with add-on boards, and Apple will only implement it if they can leverage it in multiple products.

If you want further clarification, feel free to ask him yourself; he's quite chatty and answers all questions. If the words of a veteran CPU architect, the engineer who wrote the draft x86-64 specification, aren't sufficient, then I don't know what else to say.

From my limited viewpoint, I tend to agree with Maier and @leman. We won't be seeing third-party GPUs for Apple Silicon. If there is a discrete GPU, then it will come directly from Apple, and be different from the model we are familiar with in x86 designs. We've become so accustomed to the way things are done in x86 land that it is difficult to fathom a different approach, but Apple is a company that is willing to forge its own path, if they see benefit in doing so.
So basically exactly what some of us have been saying.

1. Economies of scale leave a low-to-zero chance of Apple creating a custom SoC specifically for the Mac Pro. Instead, they will glue four Max dies together.

2. Little to no chance of a discrete AMD GPU or even an Apple GPU. Unified memory architecture is too good to give up, and AMD cards are not worth giving up UMA for.
 

senttoschool

macrumors 68030
Nov 2, 2017
2,626
5,482
That is only an issue if you need more than 24 GB, and the new RTX 6000 will be available with 48 GB 🤷‍♂️
The new RTX 6000 Ada Lovelace will probably be around $5k, however.
No one is going to buy a Mac Pro with only 24GB of RAM that comes with an RTX 6000 with 48GB of VRAM.

In addition, if the regular M2 is any indication of RAM support, then the Mac Pro SoC could support up to 384GB of unified memory, which is 8x the 48GB.
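
A quick back-of-the-envelope on that figure, with the per-die capacities treated as extrapolations from the base M2 rather than announced specs:

Code:
# Back-of-the-envelope check of the unified-memory scaling claim above.
# Per-die capacities are extrapolations, not announced specs.
m2_base_gb = 24          # maximum unified memory of the plain M2
max_multiplier = 4       # assume an M2 Max carries 4x the base capacity
extreme_dies = 4         # assume a Mac Pro chip joins four Max-class dies

m2_max_gb = m2_base_gb * max_multiplier     # 96 GB per Max die (assumption)
extreme_gb = m2_max_gb * extreme_dies       # 384 GB for the quad-die package

rtx_6000_vram_gb = 48
print(f"Hypothetical quad-die unified memory: {extreme_gb} GB")
print(f"That is {extreme_gb // rtx_6000_vram_gb}x the 48 GB of VRAM on an RTX 6000")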
 
Last edited:

Colstan

macrumors 6502
Jul 30, 2020
330
711
So basically exactly what some of us have been saying.

1. Economies of scale leave a low-to-zero chance of Apple creating a custom SoC specifically for the Mac Pro. Instead, they will glue four Max dies together.

2. Little to no chance of a discrete AMD GPU or even an Apple GPU. Unified memory architecture is too good to give up, and AMD cards are not worth giving up UMA for.
Yes, exactly. What a lot of people keep missing, I think, is that the 2019 Mac Pro is a result of Intel's design philosophy, not Apple's. Apple designed the case and the MPX modules, but most everything else is a result of the Xeon platform. Xeons aren't huge volume, but they are mass-produced, and Apple took advantage of Intel's economies of scale.

It's obvious math that there are more Xeons being sold than Mac Pros, substantially more. Once the Apple Silicon Mac Pro is released, it's going to be a niche of a niche. It makes zero economic sense to make an SoC specifically for that tiny sliver of the Mac market, keeping in mind that the Mac is about 9% of Apple's total revenue, and most of that is from laptop sales. The Mac Pro is barely a rounding error on a spreadsheet.

[attached image: percent.jpg]


Apple also isn't giving up on UMA just for the Mac Pro (and maybe a theoretical iMac Pro). Apple's GPU chief is openly mocking discrete GPUs on Twitter, so I think it's fairly obvious where they stand on the matter.

My expectation is, as you say, what a lot of folks have been thinking. It's going to be an M2 "Extreme", essentially a doubling of the M2 Ultra in the next Mac Studio, with a handful of PCIe slots for non-GPU tasks. The modularity of the Mac Pro is going to be significantly reduced, because that is a result of Apple's design philosophy. The Xeon days are over, and they ain't coming back.

...and I say this as somebody who bought a 2019 Mac Pro two weeks ago and has a 6900XT on the way to put inside of it.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,674
Also RT, faster memory for graphics...

What’s stopping Apple from adding their own hardware RT? As to faster memory for graphics… for pro apps, UMA and overall memory capacity are much more important and for games bandwidth is fine as it is thanks to TBDR.

When you look at it in detail, a dGPU's only redeeming point is lower price, which is achieved by sacrificing power efficiency and performance. And then there is the issue of the interconnect, which again makes things more complicated…


Why is it so? AMD is not a bad partner.

For the reasons previously outlined. There is no sense whatsoever for Apple to fragment the ecosystem compatibility like this. They put a lot of effort in designing a specific GPU software model and AMD is not compatible with that model.
 

Boil

macrumors 68040
Oct 23, 2018
3,477
3,173
Stargate Command
Once the Apple Silicon Mac Pro is released, it's going to be a niche of a niche. It makes zero economic sense to make an SoC specifically for that tiny sliver of the Mac market, keeping in mind that the Mac is about 9% of Apple's total revenue, and most of that is from laptop sales. The Mac Pro is barely a rounding error on a spreadsheet.

Going the mixed SoC route (one standard CPU/GPU SoC, one GPU-specific SoC) gives the end user the option to skew the CPU:GPU ratio; Mn Ultra with either two Mn Max SoCs, or one Mn Max SoC and one GPU-specific SoC...

This mixed SoC option could also be utilized for the Mn Ultra Mac Studio, so there would be more than just the Mac Pro to use these GPU-specific SoCs in...

Apple also isn't giving up on UMA just for the Mac Pro (and maybe a theoretical iMac Pro). Apple's GPU chief is openly mocking discrete GPUs on Twitter, so I think it's fairly obvious where they stand on the matter.

If Apple can figure out a way to have an add-in GPU/GPGPU card contribute to the overall "iGPU" while still retaining the whole UMA thing, that would be great...

But if not, then add-in Apple GPGPU (two or four GPU-specific SoC options) for running renders, simulations, etc. while the end user still has the full power of the iGPU to work with in their DCC apps...
 
  • Like
Reactions: singhs.apps

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
Yes, exactly. What a lot of people keep missing, I think, is that the 2019 Mac Pro is a result of Intel's design philosophy, not Apple's. Apple designed the case and the MPX modules, but most everything else is a result of the Xeon platform. Xeons aren't huge volume, but they are mass-produced, and Apple took advantage of Intel's economies of scale.

It's obvious math that there are more Xeons being sold than Mac Pros, substantially more. Once the Apple Silicon Mac Pro is released, it's going to be a niche of a niche. It makes zero economic sense to make an SoC specifically for that tiny sliver of the Mac market, keeping in mind that the Mac is about 9% of Apple's total revenue, and most of that is from laptop sales. The Mac Pro is barely a rounding error on a spreadsheet.

View attachment 2080513

Apple also isn't giving up on UMA just for the Mac Pro (and maybe a theoretical iMac Pro). Apple's GPU chief is openly mocking discrete GPUs on Twitter, so I think it's fairly obvious where they stand on the matter.

My expectation is, as you say, what a lot of folks have been thinking. It's going to be an M2 "Extreme", essentially a doubling of the M2 Ultra in the next Mac Studio, with a handful of PCIe slots for non-GPU tasks. The modularity of the Mac Pro is going to be significantly reduced, because that is a result of Apple's design philosophy. The Xeon days are over, and they ain't coming back.

...and I say this as somebody who bought a 2019 Mac Pro two weeks ago and has a 6900XT on the way to put inside of it.
I don't know whether or not they will etch a different set of dies for the Mac Pro—e.g., to quote Cliff: [CPU cores][CPU cores]…[GPU cores][GPU cores]—as opposed to merely making it a 2x Ultra. [Though note that, by definition, even the latter still necessarily entails a different, larger SoC with more memory chips.]

But (and consistent with Cliff's notably large "33%" estimate), I think their chance of doing this is higher than a pure market share analysis would lead one to suspect, because the Mac Pro is Apple's halo product, and also its biggest connection to the pro creative community (which they've been trying to keep much happier than they have in the past). Thus Apple may see its development costs as part of their marketing budget.

Plus while the percentage of Mac Pro buyers is tiny, it's out of a huge pie. That little 9% Mac division sliver would, by itself, be a Fortune 100 company.
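
As a rough sanity check on that last point, using approximate FY2022 figures (treated here as assumptions rather than exact numbers):

Code:
# Rough sanity check of the "Fortune 100 by itself" point.
# Revenue figures are approximate FY2022 numbers used here as assumptions.
apple_fy2022_revenue_b = 394.0   # total Apple net sales, in billions of dollars (approx.)
mac_share = 0.09                 # the ~9% share cited above

mac_revenue_b = apple_fy2022_revenue_b * mac_share
print(f"Mac division revenue: ~${mac_revenue_b:.0f}B per year")
# Roughly $35B in annual revenue sits in the neighborhood of the lower end of
# the Fortune 100 list, which is the comparison being made above.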
 
Last edited:
  • Like
Reactions: altaic

leman

macrumors Core
Oct 14, 2008
19,521
19,674
That sooner won't happen.

It will happen either this year or the next. And there won't be any Apple-branded dGPUs. Apple's game is integration, not separation. Apple Silicon follows a forward-thinking hardware design model; dGPUs are products of '90s hardware design.
 
  • Like
Reactions: AlexMac89

Xenobius

macrumors regular
Dec 10, 2019
190
474
It will happen either this year or the next. And there won't be any Apple-branded dGPUs. Apple's game is integration, not separation. Apple Silicon follows a forward-thinking hardware design model; dGPUs are products of '90s hardware design.

The problem is that this supposedly ancient way of designing still leaves modern Apple in a puff of dust. For the end user, what matters is how quickly the job gets done.
 
  • Haha
Reactions: AlexMac89

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
One way Apple might branch the Mac SoCs away from the iDevices (i.e. Axx) would be to have the GPU and the rest (CPU, NPU, media encoders, etc.) on separate dies and connect the dies together with UltraFusion. Depending on product segment, the GPU and "rest" dies can be as big as they want, each with their own memory controllers and caches. The larger the number of cores, the more memory controllers and thus the larger the memory. There could even be DIMM slots for the Mac Pros.

IMHO, doing it via slot-in cards (i.e. a traditional dGPU) will not be possible due to the high bandwidth requirement between the card and the SoC. I'm not aware of any slot that allows > 2 TB/s like UltraFusion.
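
For a sense of scale, here is a rough comparison of peak per-direction slot bandwidth against the UltraFusion figure quoted earlier in the thread (rounded theoretical numbers):

Code:
# Rough scale comparison: even the fastest common expansion slot is well over
# an order of magnitude short of the quoted UltraFusion figure.
# Peak theoretical numbers, per direction, rounded.
slots_gb_s = {
    "PCIe 3.0 x16": 16,
    "PCIe 4.0 x16": 32,
    "PCIe 5.0 x16": 63,
}
ultrafusion_gb_s = 2500  # die-to-die bandwidth quoted earlier in the thread

for name, bw in slots_gb_s.items():
    print(f"{name}: {bw} GB/s, roughly {ultrafusion_gb_s / bw:.0f}x short of UltraFusion")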
 

leman

macrumors Core
Oct 14, 2008
19,521
19,674
The problem is that this supposedly ancient way of designing still leaves modern Apple in a puff of dust. For the end user, what matters is how quickly the job gets done.

Only because Apple GPUs are smaller and lower clocked. And then of course you have things like hardware RT and software optimization. But there is nothing preventing Apple from making a higher-clocked, larger GPU. It's the same thing really: if Nvidia can build a huge GPU and give it a 2+ GHz clock, why can't Apple?

M1 is low-power technology without much vertical scalability. We'll have to wait and see where Apple goes from here. It is entirely possible that upcoming iterations of Apple Silicon will allow more vertical scaling of the same design, which should address the performance disparity concern.
 
Last edited:

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,663
OBX
They didn't just sell to gamers. Miners bought Nvidia GPUs too. It doesn't mean you can blame Nvidia for that. Miners just bought through retail like any gamer. In fact, there have been reports of AIBs selling directly to miners.
Nvidia did try to steer mining to CMP cards, by making all the "newer" 30-series units LHR cards (via hardware/software locks).
 

senttoschool

macrumors 68030
Nov 2, 2017
2,626
5,482
Nvidia did try to steer mining to CMP cards, by making all the "newer" 30-series units LHR cards (via hardware/software locks).
This wasn't the accusation made here. The charge is that Nvidia focused on selling their gaming GPUs to miners over gamers. It's not true.

Nvidia tried everything it could to limit miners from buying their gaming GPUs. Did Nvidia profit from miners? Absolutely. Miners pushed all GPU prices to the stratosphere for 2 years. AMD and Nvidia GPUs. Did Nvidia intentionally route their GPUs to miners? No. There's no evidence of this.

Why would Nvidia want to sell GPUs to miners over gamers? Miners will dump GPUs onto the used market as soon as crypto crashes, which is happening now. Gamers do not dump GPUs like that. Given a choice, Nvidia will always want to sell to gamers over miners.

Read my post here on why Nvidia tried so hard to prevent miners from buying their GPUs: https://www.reddit.com/r/stocks/comments/wjeq5b
 