
Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
From what I understand, any GPU drivers would need to deal with UEFI hooks for boot up when Apple silicon Macs don't have UEFI at all. I'm not sure how that would get resolved. Maybe it doesn't matter since the default boot display could be the internal GPU.

They don't. According to Asahi Linux developers, in theory it's perfectly possible to make eGPU drivers for Apple Silicon – by third parties, even. The only issue is that memory mapping is extremely hard even for seasoned developers.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Hehe, in my post I said it was based on C++ 17 so when I saw this I wondered which one of us was correct.

Turns out it's based on C++ 14 :D

Originally, it was C++11:

"...
Alright. So the Metal shading language is a unified shading language. This means using the same language, syntax, constructs, tool chains for both graphics and compute processing, and it's based on C++11, essentially a static subset of this language, and it's built out of LLVM and clang, which means you get incredibly high performance and incredibly high quality code generation for the GPU.
..."

Metal came out in June 2014, so it would have been rather super, duper bleeding edge to base it on a standard that had not even been ratified yet.

"... Its approval was announced on August 18, 2014.[1] C++14 was published as ISO/IEC 14882:2014 in December 2014.[2] ..."

Metal has been around for almost 10 years at this point, so it has evolved while tracking the evolution of C++.
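To make the "static subset of C++" point concrete, here is roughly what a Metal Shading Language compute kernel looks like; the kernel and buffer names are made up for illustration, but the C++-style syntax (namespaces, references, attribute-annotated parameters) is exactly what that WWDC quote describes:

```cpp
#include <metal_stdlib>
using namespace metal;

// A trivial compute kernel: scale every element of a buffer by a constant.
// 'device' and 'constant' are MSL address-space qualifiers; the [[...]] attributes
// bind arguments to buffer slots and to the thread's position in the dispatch grid.
kernel void scale(device float*   data   [[buffer(0)]],
                  constant float& factor [[buffer(1)]],
                  uint            index  [[thread_position_in_grid]])
{
    data[index] *= factor;
}
```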


At this point, Metal is pragmatically at least lightly coupled to the NS::Object frameworks.

"...
  • Open Xcode on your Mac. Xcode 9.3 or later includes C++17, which is the minimum required by Metal-cpp because of the use of constexpr in NS::Object.
..."

If you have a 'headless' code base that is completely free of the NS::Object hierarchy frameworks, then it isn't, but normal apps would be. That is where the '17' comes in. It certainly could not have been C++17 way back in 2014, though.
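For reference, here is roughly what the smallest possible metal-cpp program looks like; a sketch only, assuming the metal-cpp headers are on the include path, and it has to be built as C++17 or later, which is exactly the floor that Xcode note is describing:

```cpp
// Build sketch: clang++ -std=c++17 main.cpp -framework Metal -framework Foundation
#define NS_PRIVATE_IMPLEMENTATION
#define MTL_PRIVATE_IMPLEMENTATION
#include <Foundation/Foundation.hpp>
#include <Metal/Metal.hpp>
#include <cstdio>

int main()
{
    // MTL::Device, NS::String, etc. all hang off the NS::Object wrapper hierarchy,
    // whose constexpr-heavy headers are why C++17 is the minimum for metal-cpp.
    MTL::Device* device = MTL::CreateSystemDefaultDevice();
    if (device != nullptr) {
        std::printf("GPU: %s\n", device->name()->utf8String());
        device->release();
    }
    return 0;
}
```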
 

jdb8167

macrumors 601
Nov 17, 2008
4,859
4,599
They don't. According to Asahi Linux developers, in theory it's perfectly possible to make eGPU drivers for Apple Silicon – by third parties, even. The only issue is that memory mapping is extremely hard even for seasoned developers.
Is this recent? The Asahi Linux developers previously stated that the Thunderbolt controller software was missing a required mode for eGPU use. Maybe they figured out how to reverse engineer the appropriate mode?


Edit: Looks like Hector Martin @marcan might have deleted his Twitter account, so I can't show what he posted about the PCIe Base Address Register (BAR) limitation that breaks eGPU drivers.
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,521
19,675
It doesn't seem unfeasible to me that Apple could simply write AMD drivers for ASi based Macs. That's all that's really missing from those Macs at any level. Doing so would allow for eGPUs, and allow for AMD GPUs in future Mac pros. It allows them to retain their SOC architecture and so on, while allowing users to upgrade their GPUs or to have multiple GPUs should their workflow support that need.

The biggest issue with this is that the programming model and optimisation strategies are rather different between Apple GPUs and AMD/Nvidia GPUs. With Apple Silicon, all Mac GPUs have the same unified set of features and capabilities that I can rely on as programmer. Add a third-party GPU into the mix, and that ecosystem advantage is gone. Now I have to develop/optimise for different GPUs. What's worse, this actually discourages me from using unique Apple features like the superior texture compression, programmable blending, or memory-persistent compute shader invocations, because none of them are supported by third-party hardware.
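To give one concrete example of those Apple-only features: with programmable blending, a fragment shader can read the pixel already in the render target and combine it with the new value in ordinary shader code. A rough MSL sketch (the function and struct names are invented) looks like this, and there is no equivalent path on AMD/Nvidia hardware under Metal:

```cpp
#include <metal_stdlib>
using namespace metal;

struct FragmentIn {
    float4 position [[position]];
    float4 color;
};

// Programmable blending, available on Apple GPUs only: the fragment function reads the
// value already sitting in the render target for this pixel via [[color(0)]] and blends
// in shader code, rather than relying on fixed-function blend state.
fragment float4 blend_in_shader(FragmentIn in          [[stage_in]],
                                float4     framebuffer [[color(0)]])
{
    // A hand-rolled source-over blend; arbitrary per-pixel logic is possible here.
    return in.color * in.color.a + framebuffer * (1.0f - in.color.a);
}
```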


It's entirely possible Apple will depart from the SOC style and break it down into modular components specifically for the modular Mac. That would support their cancellation of the M2 Extreme chip, because they realized that when you get to the high end market, your needs for more RAM, CPU cores, GPU power, etc don't scale linearly with each other. Sometimes you're a pro that needs 5x GPU power and standard CPU power, and sometimes the other way around. Why pay for systems you don't need, until you need it, when you're talking about the high end modular sector?


It is entirely possible that Apple will eventually start using more modular components, but they will still be integrated into a single package. If by "modular" you mean "user-replaceable", that's probably less likely. Apple has bet on high-bandwidth, low-latency accelerator connectivity. This is very far away from the traditional PC, with its components connected by comparatively narrow data links.
 

TechnoMonk

macrumors 68030
Oct 15, 2022
2,606
4,113
It doesn't seem unfeasible to me that Apple could simply write AMD drivers for ASi based Macs. That's all that's really missing from those Macs at any level. Doing so would allow for eGPUs, and allow for AMD GPUs in future Mac pros. It allows them to retain their SOC architecture and so on, while allowing users to upgrade their GPUs or to have multiple GPUs should their workflow support that need.
eGPUs are terrible. Provide PCIe 5.0 expansion slots instead.
 

SecuritySteve

macrumors 6502a
Jul 6, 2017
949
1,082
California
If by "modular" you mean "user-replaceable", that's probably less likely. Apple has bet on high-bandwidth, low-latency accelerator connectivity. This is very far away from the traditional PCs with its components connected with a narrow data link.
If it's not user replaceable in any way shape or form, then it's not the Mac Pro. If they really went that route, the high end market will either get by with Studios or move elsewhere.

I think it's awfully cynical to think Apple can't get something right after such loud and resounding feedback on exactly what the high end market wants.
 

t0mat0

macrumors 603
Aug 29, 2006
5,473
284
Home
I don't see how the video from the OP stops Apple from having eGPUs.
Have the RAM next to the M-series chip at 90°, or take some other measure to make room, and then allow the eGPU to connect up to the M-series chip. I'm pretty sure there were some patents for this kind of setup described at Patently Apple.
Something like asymmetric multi-core processing for the GPU.
They managed to keep UltraFusion hidden, after all, and they do use the latest and greatest methods to improve their silicon. A silicon interposer to allow for more GPU seems feasible.

We don't have too long to wait to find out, which is the best bit. It's not known for sure, it's new, and it shows Apple's intentions for future Macs, so it's an exciting aspect of Apple Silicon.
 

t0mat0

macrumors 603
Aug 29, 2006
5,473
284
Home
“Apple's patent entitled "memory system having combined high density, low bandwidth and low density, high bandwidth memories" describes various SoCs that use high-bandwidth cache DRAM as well as high-capacity main DRAM”



It's easy to cherry-pick patents, but from the looks of it they've given this plenty of thought. There's nothing stopping them from going with this kind of hybrid memory architecture.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
If it's not user replaceable in any way shape or form, then it's not the Mac Pro. If they really went that route, the high end market will either get by with Studios or move elsewhere.

It's not clear to me at all that the high-end market wants replaceable GPUs. I'd think the high-end market wants fast GPUs. And Apple's approach can potentially deliver tremendously fast GPUs with more RAM than anything on the market.

There is a good reason why Nvidia's upcoming datacenter computer is a tightly coupled system without replaceable RAM or accelerators.

I think it's awfully cynical to think Apple can't get something right after such loud and resounding feedback on exactly what the high end market wants.

Apple Silicon is already a tremendous success. I rather think it's weird to believe that Apple would throw years of R&D overboard and go back to a hardware model they have rejected. It would make more sense for them to abandon the high-end desktop altogether.
 

Philip Turner

macrumors regular
Dec 7, 2021
170
111
The push from others to weave OpenCL together with C++ is another factor in why Apple is walking away. There is a Swift (and previously Objective-C) agenda that gets pushed regardless of where the rest of the HPC market is going.
SYCL 2020 doesn't have to use OpenCL backends. This is especially good for hipSYCL, where I've been drafting a Metal backend. I don't have much time, but hope to finish it next spring break or summer.
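For context, a SYCL 2020 program is single-source C++ and doesn't name its backend anywhere; a minimal sketch (the data and kernel are invented for illustration) that an implementation like hipSYCL could in principle dispatch through Metal, CUDA, or OpenCL looks like this:

```cpp
#include <sycl/sycl.hpp>
#include <cstdio>
#include <vector>

int main()
{
    std::vector<float> data(1024, 1.0f);

    // The queue binds to whatever device/backend the SYCL implementation provides;
    // SYCL 2020 no longer requires that backend to be OpenCL.
    sycl::queue q;
    {
        sycl::buffer<float, 1> buf(data.data(), sycl::range<1>(data.size()));
        q.submit([&](sycl::handler& h) {
            sycl::accessor acc(buf, h, sycl::read_write);
            h.parallel_for(sycl::range<1>(data.size()),
                           [=](sycl::id<1> i) { acc[i] *= 2.0f; });
        });
    } // buffer goes out of scope: results are copied back into 'data'

    std::printf("data[0] = %.1f\n", data[0]);
    return 0;
}
```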
 
  • Like
  • Wow
Reactions: Romain_H and jmho

Philip Turner

macrumors regular
Dec 7, 2021
170
111
potentially deliver tremendously fast GPUs with more RAM than anything on the market.
Their M3 Extreme chip (hopefully it happens) would support 384 GB of RAM. What's the most RAM in a GPU that doesn't cost $25,000? Wait, an M3 Extreme with full memory would cost more than $10,000, but let's not talk about that.
 

Boil

macrumors 68040
Oct 23, 2018
3,477
3,173
Stargate Command
If it's not user replaceable in any way shape or form, then it's not the Mac Pro. If they really went that route, the high end market will either get by with Studios or move elsewhere.

So in a chassis that includes PCIe slots (used for more than just GPUs), non-user replaceable parts make it a no-go; but if it is a Mac Studio, a chassis without PCIe slots and also without any user replaceable parts, then it is okay...?

Kinda contradictory, yeah...?
 

SecuritySteve

macrumors 6502a
Jul 6, 2017
949
1,082
California
So in a chassis that includes PCIe slots (used for more than just GPUs), non-user replaceable parts make it a no-go; but if it is a Mac Studio, a chassis without PCIe slots and also without any user replaceable parts, then it is okay...?

Kinda contradictory, yeah...?
Other than media cards and port extensions, I see no usefulness to the PCIe slots if you can't use GPUs in those slots. At that point, why not just get by with a studio-like design, because you gain nothing from the PCIe slots? That's why I mentioned that users would either leave to other platforms, or they would begrudgingly go with Studios with the same chip as the Mac Pro but with (probably) a lower cost.
 

Boil

macrumors 68040
Oct 23, 2018
3,477
3,173
Stargate Command
Other than media cards and port extensions, I see no usefulness to the PCIe slots if you can't use GPUs in those slots. At that point, why not just get by with a studio-like design, because you gain nothing from the PCIe slots? That's why I mentioned that users would either leave to other platforms, or they would begrudgingly go with Studios with the same chip as the Mac Pro but with (probably) a lower cost.
RAID storage, networking, & video I/O cards would all like more bandwidth than a TB4 port can provide; so yeah, there are things to gain from PCIe slots that are not GPUs...

Really wish Apple Aperture was still being developed for Apple Silicon. That's the sister product of FCP.

I would say Shake would be more of a sister product to FCP; hey Timmy, where is Phenomenon...?!?
 

Yebubbleman

macrumors 603
Original poster
May 20, 2010
6,024
2,616
Los Angeles, CA
Not for me, I expect I'll have to give it up eventually unless a hardware solution becomes available for running x86 VM's.

Unless you (a) want to virtualize older x86 versions of macOS and/or (b) specifically need to do your virtualizing on a Mac, you'll probably always have better luck doing virtualization on a Windows or Linux PC. I have a MacBook Pro (16-inch, 2019) for satisfying both of those things, but even I know that my days of doing this are numbered.

There's just no advantage to me running it, even at home, if I can't do my work on it, and for that I need Windows. WoA really isn't good enough, so currently I have to remote into a Windows PC to do my work, and it's only a short step from there to just using the Windows PC to begin with. For the kind of money I spent on my Studio Max, I could have bought a REAL nice Windows machine. I currently use a Mac because I like them, but that only goes so far. I like Windows too. (Not so high on Linux, though!) I'm an outlier here, and I know it, so don't take what I say as doom and gloom for Apple. I'd much prefer if Macs and Windows machines both hang around so I can buy the best machine for whatever job I need at the time.

Honestly, this is going to be my future as a Mac user too. I'll get whatever machine sits where the current Apple Silicon 13-inch MacBook Pro sits in the Mac lineup and my uses will be light to moderate with the heavier tasks (which, for me are virtualization and gaming) to be handled by Windows PCs (that are, presumably still going to be x86-64). I'd be more bummed about it if I didn't see the writing on the wall for a few years now. It's not like my professional needs mandate high-powered Macs (just maybe my recreational for-*****-and-giggles creativity). For IT and consulting (my main bread and butter), a base model M1 Air will more than suffice.

Agreed, but I'm not seeing any rumors of expansion capabilities at all, other than maybe SSDs. I hope I'm wrong because, like you say, it's for people that need expansion. The RAM limits I've heard don't really even cover some of the stuff I do (VM-based developing/testing). I'm not the target of the Mac Pro and never have been, but I have bought high-end machines for work, so I kind of know what they buy and why.

PCIe slots. Internal SATA. Things the current Mac Pro has and that any one to come after would need, lest it be a Mac Studio. Not something you or I need in a Mac, but others surely do.


The Mac Pro served a lot of different customers in the past. From those that wanted a $2,500 Mac to a $50,000 Mac and all in between.

Today the Mac Studio is already serving a good 70% of the previous Mac Pro market. The Apple Silicon Mac Pro (which is essentially a 2x Mac Studio) will serve about 25% more.

I agree that it's a diminishing market. I also agree that the 27-inch iMac and the Mac Studio that replaced it took away a lot of that market. But I think there's enough of a market that still needs a Mac Pro tower and that, despite how much smaller that market is, Apple would be foolish to abandon it. The only things that seem safe bets to change are that RAM and graphics won't be upgradeable without replacing the SoC. But, if Apple sockets/slots that SoC and allows for such aftermarket upgrades, that'll be a trade-off considered to be acceptable by ENOUGH of that leftover market.

The negligible group of users who remain don't need to be serviced by Apple. Not if serving them is counter to the direction of the Mac itself, which it is.

I don't fully disagree here. But I don't really agree either. High end audio (where latencies make or break) still seems like something where the Mac platform completely dominates over Windows and Linux. Do those require more than an Ultra version Mac Studio with 128GB of RAM? I'm not an audio guy, so I can't say. But I'd imagine that there are use cases wherein even that kind of Mac Studio would fall short. Not many, but enough.

I do think that between 2010 and 2019, Apple did lose enough high end customers such that Windows in the high-end isn't as outlandish of an idea as it might've been in 2010.

But, I also think that Apple is so scared of losing people in any area of their walled off ecosystem that they want to make sure that, for example, someone doesn't end up having to use a Xeon-based Windows workstation and realize that maybe Windows isn't so bad (causing that same customer to not buy a MacBook Pro or MacBook Air for when they are not at that Xeon workstation).


It doesn't seem unfeasible to me that Apple could simply write AMD drivers for ASi based Macs.

Writing the drivers isn't the issue. The issue is that the platform isn't designed for those devices and for those drivers.

That's all that's really missing from those Macs at any level.

No. You're thinking of Apple Silicon macOS and Intel macOS as the same in this regard. They are not at all. x86/x86-64 versions of macOS accommodated those devices and those drivers. Apple Silicon versions of macOS don't work that way at all.

Doing so would allow for eGPUs, and allow for AMD GPUs in future Mac pros.

It wouldn't. The hardware architecture is not built the same way the Intel Mac hardware architecture was. Drivers are not the only missing element here.

It allows them to retain their SOC architecture and so on, while allowing users to upgrade their GPUs or to have multiple GPUs should their workflow support that need.

First off, the graphics subsystems on Apple Silicon do not function in such a way that a traditional GPU would be able to act as an assist. Secondly, even if they did, the graphics card's interconnect to the rest of the SoC would be so much slower that you wouldn't be helping all that much. Apple's GPU cores are always going to be faster given that they are literally on the same die as the rest of the system. Thirdly, Apple GPUs and traditional GPUs don't operate the same way at all. Getting one to work in the same way as the other (a) would be impossible and (b) would negate any performance benefits even if it were possible.

At best, you might have an expanded Afterburner product line wherein there are added compute modules for given workflows. But you're not going to see traditional GPU workloads.

On the RAM front, I'm not sure. Maybe it is doomed to have SOC only RAM. Maybe not.

Unless Apple drastically does a 180 from how they designed Apple Silicon as a Mac hardware platform, it's a safe bet that RAM will be tied to the SoC.

We can read the rumor mill to death, but ultimately we won't know what to expect until Apple releases their first Mac Pro and we see it ourselves.

No, it's not always this nebulous "we won't know anything until Apple announces something". In this case Apple HAS revealed how developers should be making their apps. For them to change this drastically from what they've done thus far would require developers to make massive changes in how they write their apps. Considering how slow app developers have been to port their apps to being Apple Silicon native to begin with, this would be such an obviously huge mistake for Apple. You can take it to the bank that third-party GPUs are still off the table and that RAM will still be unified and tied to the SoC.

It's entirely possible Apple will depart from the SOC style and break it down into modular components specifically for the modular Mac.

It's about as close to being impossible as you can get without it actually being impossible.

That would support their cancellation of the M2 Extreme chip, because they realized that when you get to the high end market, your needs for more RAM, CPU cores, GPU power, etc don't scale linearly with each other. Sometimes you're a pro that needs 5x GPU power and standard CPU power, and sometimes the other way around. Why pay for systems you don't need, until you need it, when you're talking about the high end modular sector?

Again, a socketed/slotted SoC would solve all of those things already. As it stands, each Apple Silicon SoC already has multiple GPU (and, in the case of the M1 Pro and M2 Pro, CPU) core and RAM combinations. You can buy an M1 Ultra with 128GB of RAM but only its base GPU, or the opposite, or an entirely base model, or one entirely maxed out. That's four possible M1 Ultra chip configurations. The M2 Max now has five different possible configurations. Socketed/slotted SoCs would be how that problem gets solved. It's not as much flexibility as you had in the Intel era, but it's been that way since the very first M1.

I'm pretty sure Apple knows all of that, and the struggle to build something that can check the modular requirement box is probably why the Mac Pro has been the last in the lineup to be updated. It's not that they can't just interconnect more chips, it's that interconnecting more chips doesn't always make sense.

Right. Doubling the M1 Ultra or M2 Ultra wouldn't make sense. You would get the right CPU core count, but you wouldn't get the right RAM support. The Mac Studio was very careful to provide the exact same maximum RAM capacity that the 2020 27-inch iMac had. Apple needs to at least get close to offering the same 1.5TB maximum capacity of the current Mac Pro (or, at the very least, the 768GB maximum RAM that you can get from a 16-core model). I'd imagine that the graphics options also need to be beefier than the M1 Ultra's highest-end 64-GPU-core configuration, or even the 76-GPU-core configuration of an M2 Ultra that doubled the M2 Max's highest-end GPU core configuration. Then again, it's also totally possible that you have a Mac Pro customer that is fine with the performance of an M2 Max and just needs PCIe slots. I'd imagine Apple will give us options when configuring such a machine.

From what I understand, any GPU drivers would need to deal with UEFI hooks for boot up when Apple silicon Macs don't have UEFI at all. I'm not sure how that would get resolved. Maybe it doesn't matter since the default boot display could be the internal GPU.

That's not even half of the obstacles. macOS on Apple Silicon is architected to not support third party GPUs at a very fundamental level. It's why you're not hooking up an eGPU to your iPad Pro with Thunderbolt.

They don't. According to Asahi Linux developers, in theory it's perfectly possible to make eGPU drivers for Apple Silicon – by third parties, even. The only issue is that memory mapping is extremely hard even for seasoned developers.
I would imagine, given how direct Apple has been on this topic, that there's a whole lot more stopping it from happening than that. It's my understanding that even if you connected the GPU to an Apple Silicon Mac, the Mac wouldn't know what to do with it, and for reasons well beyond it simply not having a driver.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
They don't. According to Asahi Linux developers, in theory it's perfectly possible to make eGPU drivers for Apple Silicon – by third parties, even. The only issue is that memory mapping is extremely hard even for seasoned developers.

Wasn't part of the problem that Apple doesn't support some types of device memory mapping that a GPU driver would need?

That's not even half of the obstacles. macOS on Apple Silicon is architected to not support third party GPUs at a very fundamental level. It's why you're not hooking up an eGPU to your iPad Pro with Thunderbolt.

I don't understand this statement. An eGPU is just a PCI device. Why would GPU devices be unsupported at a fundamental level? If you can write a PCI driver, you can write a GPU driver, no?
 

XboxEvolved

macrumors 6502a
Aug 22, 2004
870
1,118
I miss the good ole days when you could buy a PowerMac G3 or G4 for like $1,600 (the equivalent of about $2,700 today) vs. the insane $6k starting price now, and over time you could upgrade pretty much everything in it except maybe the logic board, and they seemed to last for a decent amount of time too. My Digital Audio G4 to this day holds the record for the longest I've ever used a single computer, about 8 years (granted, it was actually pretty slow towards the end).
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
I miss the good ole days when you could buy a PowerMac G3 or G4 for like $1,600 (the equivalent of about $2,700 today) vs. the insane $6k starting price now, and over time you could upgrade pretty much everything in it except maybe the logic board, and they seemed to last for a decent amount of time too. My Digital Audio G4 to this day holds the record for the longest I've ever used a single computer, about 8 years (granted, it was actually pretty slow towards the end).
It may be more expensive to make it 100% recyclable.
 

120FPS

macrumors regular
Oct 26, 2022
174
206
I feel like the Mac Pro users who have those requirements have generally shifted away to Windows; only the die-hards remain, the ones who may be tied to something like Apple's editing software.

The price of those components was just too high, which resulted in the absurd cost of the Mac Pro. Besides, general computing has caught up: smaller and cheaper devices now offer similar responsiveness at prices comparable to the early cheese-grater Mac Pros. You no longer need a Mac Pro to run Adobe software with good performance.

Apple could offer a great Mac Pro if they wanted to, but clearly they are more interested in maintaining the locked-down, soldered approach so they can charge more at the time of purchase, which makes sense for RAM but not for storage.
 

Confused-User

macrumors 6502a
Oct 14, 2014
852
987
Some of your points are reasonable guesses, but to claim repeatedly that they are facts is utter foolishness.

The notion that Apple can't create an AS SoC that supports expandable memory is laughable. Quoting their video from 2020 is equally laughable. It's what they said was true then. It's not a promise for the future, and even if it were it isn't binding. It's true that making memory expandable would present a number of interesting engineering challenges, primarily around latency, and (depending on the chosen solution) possibly the complication of a tiered memory system.

Now, ask me if I think they will actually implement expandable memory? I dunno. Probably less than 50-50 odds, because I think they think the market's probably not there to support the effort. But I don't pretend to know for sure, and no matter how much confidence you exude you don't know either.

As for GPUs... that's a whole lot of handwavy nonsense about "it's not just the drivers". Sure there's likely to be some other integration work that needs to be done, but it's 95% drivers. If Apple decided that they wanted to allow AMD (or even nVidia!) cards to be used, they would do that. There's very little in the way of engineering challenges here, comparatively speaking. There are consequences - giving up some level of certainty about homogeneity of the environment, which potentially adds a modest amount of complexity to some code down the road - but until all Intel Macs are desupported, you mostly have that anyway.

Again, this is different from the question "will they actually support other GPUs?", and again I don't know. It seems somewhat more likely to me than expandable memory, but I'm just guessing. As are you.

If I were really going to think hard about this, though, I'd be asking some different questions than the ones I've seen discussed endlessly here. Apple is generally VERY smart when it comes to their chip design. And they have a LOT of different products - no, not products, rather entire markets - to design for. If you really want to polish your crystal ball and ask questions, think about this: What kinds of products might be coming in the next 5-10 years, and what will they need to do, and where does Apple need to be pathfinding and building knowledge right now? And, where can that synergize with current product needs?

Some of the answers to those questions are not so hard to figure out, while others are vastly more challenging. Start by thinking about what sorts of hardware they'll need for massive high-end VR, AR, and AI applications.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
I miss the good ole days when you could buy a PowerMac G3 or G4 for like $1,600 (the equivalent of about $2,700 today) vs. the insane $6k starting price now, and over time you could upgrade pretty much everything in it except maybe the logic board, and they seemed to last for a decent amount of time too. My Digital Audio G4 to this day holds the record for the longest I've ever used a single computer, about 8 years (granted, it was actually pretty slow towards the end).

You are talking about times when pretty much the fastest GPU on the market (ATI 9800 Pro) was $400. Look how much an RTX 4090 costs today :)
 
  • Like
Reactions: AdamBuker

leman

macrumors Core
Oct 14, 2008
19,521
19,675
Some of your points are reasonable guesses, but to claim repeatedly that they are facts is utter foolishness.

The notion that Apple can't create an AS SoC that supports expandable memory is laughable. Quoting their video from 2020 is equally laughable. It's what they said was true then. It's not a promise for the future, and even if it were it isn't binding. It's true that making memory expandable would present a number of interesting engineering challenges, primarily around latency, and (depending on the chosen solution) possibly the complication of a tiered memory system.

Now, ask me if I think they will actually implement expandable memory? I dunno. Probably less than 50-50 odds, because I think they think the market's probably not there to support the effort. But I don't pretend to know for sure, and no matter how much confidence you exude you don't know either.

As for GPUs... that's a whole lot of handwavy nonsense about "it's not just the drivers". Sure there's likely to be some other integration work that needs to be done, but it's 95% drivers. If Apple decided that they wanted to allow AMD (or even nVidia!) cards to be used, they would do that. There's very little in the way of engineering challenges here, comparatively speaking. There are consequences - giving up some level of certainty about homogeneity of the environment, which potentially adds a modest amount of complexity to some code down the road - but until all Intel Macs are desupported, you mostly have that anyway.

Again, this is different from the question "will they actually support other GPUs?", and again I don't know. It seems somewhat more likely to me than expandable memory, but I'm just guessing. As are you.

If I were really going to think hard about this, though, I'd be asking some different questions than the ones I've seen discussed endlessly here. Apple is generally VERY smart when it comes to their chip design. And they have a LOT of different products - no, not products, rather entire markets - to design for. If you really want to polish your crystal ball and ask questions, think about this: What kinds of products might be coming in the next 5-10 years, and what will they need to do, and where does Apple need to be pathfinding and building knowledge right now? And, where can that synergize with current product needs?

Some of the answers to those questions are not so hard to figure out, while others are vastly more challenging. Start by thinking about what sorts of hardware they'll need for massive high-end VR, AR, and AI applications.

What an excellent writeup! Thanks for bringing some of the much needed common sense to this very confused thread.
 

XboxEvolved

macrumors 6502a
Aug 22, 2004
870
1,118
You are talking about times when pretty much the fastest GPU on the market (ATI 9800 Pro) was $400. Look how much an RTX 4090 costs today :)
I had a GeForce 3 that originally went for $500 but I think I ended up paying $150 for it. Yeah, I don't really comprehend graphics card prices anymore.
 

jmho

macrumors 6502a
Jun 11, 2021
502
996
Wasn't part of the problem that Apple doesn't support some types of device memory mapping that a GPU driver would need?

Here's a post from someone on Hacker News who seems to understand the issue:

I'm skipping over a few details here[0].
ARM has several different classes of memory access mappings; the normal one is called... well, Normal. I/O and storage is mapped using Device mappings, which don't cache reads and can be further divided into how little the CPU is allowed to try and optimize writes[1]. GPUs need Normal memory because applications written for modern graphics APIs expect to be able to map GPU memory into themselves and read and write to it like CPU memory.

The ARM spec for I/O is that you are always allowed to use whatever mapping type the device needs, and that less-strict mappings should, in the worst case, "fall back" to stricter ones. Apple handles this differently; the SoC fabric requires you use the specific device mapping that it expects for a particular device, and if you try to use something looser or stricter than what it wants, it will drop the transaction and raise an exception. And of course the SoC fabric will not allow Normal memory reads or writes to hit a PCIe device.

As far as the Asahi Linux team is aware, there isn't a way from the CPU to turn off this behavior. It's also not the only implementation of PCIe on ARM that locks out PCIe memory. Raspberry Pi 4's PCIe support[3] also has the same design flaw. If it was just a driver problem, someone would have ported AMDGPU to ARM and ran it on Asahi Linux by now, and we'd be posting cool benchmarks between the internal and external GPUs.

You don't notice this problem for I/O or storage because those never need to be mapped as Normal.

[0] And probably still skipping over more details, since I'm not an ARM expert. This is just what I've gleaned from reading other kernel developers' Mastodon and Twitter feeds.

[1] Which, BTW, the M1 also screws up. You're supposed to be able to pick posted writes[2] or non-posted writes; Apple Silicon specifically refuses transaction types that don't match what the hardware expects.

See https://lore.kernel.org/linux-arm-kernel/20210120132717.3958... for more info.

[2] Write instructions finish before the write is actually sent to the device.

[3] Yes, it has PCIe. One lane of it, used to drive an external USB 3 controller. You can of course repurpose it for other things.
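To make the app-side expectation in that post concrete, here's a tiny metal-cpp sketch (the sizes and values are arbitrary). On Apple Silicon this works trivially because the memory is unified; a driver for a discrete GPU would have to expose BAR memory with a Normal mapping to give applications the same direct-pointer behaviour, which is exactly what the SoC fabric reportedly refuses:

```cpp
#define NS_PRIVATE_IMPLEMENTATION
#define MTL_PRIVATE_IMPLEMENTATION
#include <Foundation/Foundation.hpp>
#include <Metal/Metal.hpp>

int main()
{
    MTL::Device* device = MTL::CreateSystemDefaultDevice();
    if (device == nullptr)
        return 1;

    // A shared-storage buffer hands the CPU an ordinary pointer into GPU-visible memory;
    // the app then reads and writes it like any other allocation.
    MTL::Buffer* buffer = device->newBuffer(4096, MTL::ResourceStorageModeShared);
    float* p = static_cast<float*>(buffer->contents());
    p[0] = 42.0f; // plain CPU store, no staging copy

    buffer->release();
    device->release();
    return 0;
}
```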
 
  • Like
Reactions: jdb8167