Right, but how many Mac Pro customers (whose workloads weren't already better suited by a Windows or Linux workstation with NVIDIA cards) NEEDED 2 AMD GPUs?

I'm not saying and never did say that the loss of being able to stuff in GPUs after the fact isn't a bummer. But I will completely challenge the notion that a computer NEEDS this ability in order to be considered a proper workstation. Especially since I'm sure that there were plenty of 2019 Mac Pro customers that only ever had one GPU in their Mac Pro (in which case M2 Ultra's GPU would outperform it).
Wow playing all the trashcan apology oldies... How does that line go again...because I don't need or want it, I want you to never be able to get it...

Next on the playlist, how many people really need more than 640k.... 🙄
 
If GPU expansion is the only reason you buy a Mac Pro, then why are you buying a Mac Pro to begin with? I'm not saying the loss of aftermarket upgrades isn't a bummer. But the Mac platform in general has always been hostile in this department. You're lamenting the loss of GPU expansion and you never even got to put in the best PCIe GPUs out there to begin with.

It’s an older post, but I just want to point out that this idea the Mac has been hostile to GPU upgrades is a myth, and largely a rewriting of history.

With the exception of the window from 2013 Mac Pro to the introduction of TB3 & eGPU (and even during that time there was official support for 2012 machines with rx580s), Apple has always supported user-upgrades for graphics on Professional level desktops. That support has narrowed at times, expanded at others, but the company has never been “hostile” to it until AS, where the decision is for the Mac to be a UI skin for bigger iPads.
 
Yes. And the results tell us that barring extremely high end multi-GPU configurations, it's faster across the board.

This isn't at all accurate. The M2 Ultra does not even match an RTX 2080 Ti from 2018 in graphics performance, let alone an RTX 3090 or RTX 4090 in compute.

These are all single-GPU cards, not multi-GPU. Take a look at the OpenCL benchmark alone comparing the M2 Ultra and some contemporary graphics cards:

[Image: OpenCL benchmark chart comparing the M2 Ultra with contemporary graphics cards]

Even the NVIDIA RTX 4060 Ti beats it, a card that has been panned by reviewers as a waste of sand. And for anyone looking at professional use, an RTX 4090 on its own is 2.5x faster.
 
It’s an older post, but I just want to point out that this idea the Mac has been hostile to GPU upgrades is a myth, and largely a rewriting of history.

With the exception of the window from 2013 Mac Pro to the introduction of TB3 & eGPU (and even during that time there was official support for 2012 machines with rx580s), Apple has always supported user-upgrades for graphics on Professional level desktops. That support has narrowed at times, expanded at others, but the company has never been “hostile” to it until AS, where the decision is for the Mac to be a UI skin for bigger iPads.

NVIDIA claims that the reason they didn't bring 20-series GPUs to the Mac is that Apple refused to sign their drivers.

Is that not hostile? This was in 2018, by the way. You can read all about it here, which includes NVIDIA's public statement on the matter: https://appleinsider.com/articles/1...in-macos-and-thats-a-bad-sign-for-the-mac-pro
 

Hostile to Nvidia, not hostile to GPUs. I don’t agree with Apple’s actions on this, but the justification from their end was that metrics were showing them that Nvidia’s drivers were disproportionately causal for kernel panics, and Nvidia wasn’t willing to either fix them or invest in Metal.

AMD was, so AMD got the contract and the support.

In the past, Nvidia was Apple’s GPU supplier of choice after AMD / ATI spoiled the release of the G4 Cube; Apple literally pulled cards from manufactured machines and replaced them with Nvidia options. Prior to that, ATI had the contract.

And back before that, we used to use GPUs by Radius, even when the machine contained a carded GPU from Apple.

History is deeper than 2018.
 

Please show a link that backs up the claim that NVIDIA's drivers were disproportionately causal for kernel panics, because this is literally the first I'm hearing about it on macOS.

The article I linked to has sources within Apple and they say it was all political. Furthermore, Metal is a software-side API that is largely similar to Vulkan, which runs just fine on NVIDIA GPUs. Apple could have put in the effort to make it work on NVIDIA cards but chose not to, again for political reasons.

NVIDIA and Apple have had turmoil ever since the 2008 laptop GPU failures and NVIDIA's unwillingness to take culpability.

When it comes to AMD card compatibility, AMD has never released drivers for their own GPUs on the Mac. There is no driver package you can go to their website, download, and install. You've always been stuck with whatever drivers Apple provided with their OS releases. This has meant many of the latest and greatest AMD cards did not work on the Mac.

Similarly, there are only two real dedicated GPU makers in town, AMD and NVIDIA (not including Intel, which just started). Apple chose not to sign NVIDIA's drivers instead of working out whatever differences they had, and that is 100% GPU-hostile. You may disagree, but when NVIDIA is willing to release drivers that worked flawlessly on previous macOS versions and Apple just ignores them, that is hostile.
 
Are those the same informative articles that praised the trashcan before Apple did an apology tour, or the ones recently touting how great the M2 Ultra Mac Pro is while failing to mention that a lowly laptop i9 spanks the M2 Ultra, that a 6900 XT spanks it in Apple’s own Metal scores, and that the 6 slots have to share 16 PCIe lanes that get used up by a single card?

You know, with genius “brave” articles like that to back up positions, I'm not sure how reality doesn’t just go and rewrite itself to fit all the sycophantic Mac press propaganda. /sarcasm
 

The reality is, NVIDIA says Apple refused to sign their drivers, Apple did not refute that claim, and all the sources spoken to by the press align with NVIDIA's assertion that Apple specifically chose not to sign their drivers on purpose.

If that's not anti-GPU then I don't know what is. If people want to live in the reality distortion field that's their business. When it comes to Apple Silicon Macs not supporting separate GPUs, I understand they never architected the chips to do that; that's a whole other situation, unrelated to them not signing drivers that could have been used on older Intel-based Macs.
 

I think you have a very fair point. I also think @mattspace has a fair telling of history. I also think good and fair-minded folks could disagree on the point you’re both debating.

My only point on your request for a citation: a cowardly, bankrupt Mac press is not persuasive to at least some others.
 
I'm asking him for any link of any kind that backs up what he said. From Apple, from a Mac rag, even from forum posts, anything.

Because right now he's the only source. We shouldn't forget that Apple shipped plenty of computers with NVIDIA GPUs. I had several and never had problems with them crashing. I don't think it's too much to ask that when people make a claim, they be able to provide a source for said claim.
 
I do remember a spate of Nvidia GPU problems. I don’t remember on what machine. Then again, I also remember a spate of AMD problems on both MacBook Pros and iMacs (but if I recall, that was more overheating). This was probably around 10 years ago with both.

I also vaguely recall some of his points being bandied about in these forums before, but of course I can’t recall the context or thread offhand. Something to do with a lot of consternation over “do I use the Apple-supplied Nvidia drivers or the Nvidia-released version” causing different issues for users. Can’t quite recall the rest of the context, but perhaps others can chime in.
 
A lot of the GPU failures, which started around 2008, were due to the removal of lead from the solder balls that go under the GPUs. It affected both AMD and NVIDIA, but NVIDIA was in most if not all of the Apple laptops at the time the problem began.

The tin-based solder used to meet governmental health regulations about lead in the environment cracks under heat cycles and breaks the joints, so eventually the GPUs fail. NVIDIA refused to accept culpability and Apple switched to using AMD exclusively after that. But AMD had the same issues with the solder; it just wasn't as prominent due to NVIDIA's market share amongst laptop makers at the time.

But that was a decade before the driver situation. At the time that happened, in 2018, Apple hadn't sold a Mac with a physical PCIe slot for 5 years, but there were many diehard fans of the 2006-2013 Mac Pro (cheese grater) that were still buying and upgrading them with modern GPUs.

Anyway, this one particular subject has gone on a long time. Apple is gonna Apple; if we don't like it, don't buy it. I wouldn't buy a Mac Pro, it doesn't offer what I need in a desktop computer, but I think the MacBook Pro is a great product, so I buy those instead.
 
Next on the playlist, how many people really need more than 640k....
It amuses me how that has turned into a meme that people think is ironic because we're now arguing about whether 192,000,000k is enough. It was originally a reference to the IBM PC only supporting 640k of RAM whereas the 8086 could support a whole megabyte of RAM. It wasn't even that stupid a decision in 1981 when 64k was a lot of RAM - no more "stupid" than the original Mac only supporting 128k - and only became an issue because IBM managed to lock the PC world into an already obsolete late-70s PC design with a stop-gap processor lacking up-to-date memory management. It really became irrelevant as soon as PCs were able to use protected mode (or, better, just not use x86 - like the Mac).

How does that line go again...because I don't need or want it, I want you to never be able to get it...
I don't need or want a unicorn - but you're welcome to have one provided you can find someone to create one for you.

Good luck with that.

The reason you can't have an Apple Silicon Mac with 1TB of RAM and 8 full-bandwidth PCIe slots with NVIDIA 4090 support (which not even the 2019 Mac Pro had - some slots still had to share) is that Apple Silicon doesn't do that.

Apple Silicon is hugely successful at supporting useful and distinctive products from the iPad and Vision Pro up to the Mac Studio Ultra with only two basic die designs that are themselves developments of mass-market iPhone core designs - and that's where Apple makes most of their (non-iPhone) money. Those applications take advantage of the all-in-one SoC, the efficiency of unified RAM and the high performance (by integrated GPU standards) of the integrated GPU, and it absolutely slays in anything that takes advantage of the media/neural engines. It is not designed to compete with Xeon-W - let alone Threadripper. Even the hypothetical M2 Extreme doesn't close that gap - and the mythical M3 isn't going to improve things by the required order of magnitude.

Apple could sink a billion or so into developing a new die with ARM ISA, 128 PCIe 5 lanes and support for 2TB of external ECC RAM, maybe they could even bury the hatchet with NVIDIA (who would be delighted to help Apple become a competitor in the ARM heavy metal market) so you could plug in a pair of 4090s... and they'd have... well, a Me Too workstation tower that performed about as well as any other system with dual 4090s and 2TB of ECC DDR5 (if NVIDIA did a good job of writing the drivers). Of course, it would have nice, cool-running, low-power ARM processor which you'd barely notice alongside the 1kW being kicked out of those GPUs. That's assuming that everybody was fine with not being able to run x86 code. They'd then have to recoup those development costs from a small and shrinking pool of people who need that level of power and can't change their MacOS-based workflows to take advantage of the far wider choice of workstation-class PC hardware that is already out there or just rent what they need in the cloud. It would cost a fortune - bear in mind that Intel can get away with charging $8000 for just the CPU that the higher-end 2019 Mac Pro uses, and they have a far larger market over which to spread their development costs.

...so, basically, it's simply not in Apple's economic interest to sell you your unicorn. Instead, they're offering you a pony with a pointy seashell glued to its head - AKA the 2023 Mac Pro - which I agree is a bit of a silly charade. There's definitely a niche for a Mac Studio Ultra with PCIe slots - and what they're offering is a considerable improvement over a PCIe enclosure (16 PCIe 4 lanes shared with 64 lanes worth of slots c.f. typically 4 PCIe 3 lanes shared between 32 lanes-worth of slots, plus less clutter) - and I'm sure the development costs were a fraction of what it would cost to design a new die just for the MP. It still seems ridiculously expensive - but economies of scale are a big thing with electronics and I don't see these flying out of the shops (plus, they've kept the grotesquely over-engineered 2019 case). OTOH, some people would pay $7000 for a handbag, so...
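For a rough sense of scale on that slot-bandwidth comparison (taking the post's lane counts at face value, and assuming the nominal per-lane rates of about 1.97 GB/s for PCIe 4.0 and 0.985 GB/s for PCIe 3.0):

\[
16 \times 1.97\ \text{GB/s} \approx 31.5\ \text{GB/s} \ \text{(2023 Mac Pro uplink)} \qquad \text{vs.} \qquad 4 \times 0.985\ \text{GB/s} \approx 3.9\ \text{GB/s} \ \text{(typical TB3 eGPU enclosure)}
\]

That's roughly 8x the shared uplink bandwidth before you even count the extra physical slots.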
 
(and even during that time there was official support for 2012 machines with rx580s)

The question is whether that was deliberate support from Apple for the RX 580 card, or simply a consequence of iMacs and MacBook Pros using AMD GPUs, so drivers that worked with them HAD to be part of macOS.

10.12.6 added a driver that supported the RX 580, which also coincides with the 2017 iMac, which has a Radeon Pro 580 with 8GB of VRAM, for instance.

The Radeon Pro 580 and RX 580 are essentially the same GPU; it's just that one is sold on retail cards and one is used in the iMac.

There were also eGPU solutions with the RX 580 supported, so macOS had to have support, and the 2019 Mac Pro launched with a 580-based module.

So was this a deliberate choice to support these GPUs in a 5,1? Or, like Windows booting on Intel, just a consequence of Apple's choice to use Intel CPUs and similar GPUs in other Macs, so the driver support was already present in macOS?

I ended up with a Sapphire Pulse RX 580, as it was found to work best in my hack. There was no actual official RX 580 Mac Edition card like there had been with the EVGA GTX 680 Mac and Sapphire 7950 Mac that Apple had to bring out for Metal support on the Mac Pro.

Yes, I am aware that the card appears on the supported list for 10.13.6 in a Mac Pro 5,1; that, along with the MSI Radeon RX 560 Gaming, were the only non-Mac cards listed. Other cards were listed as generic GPUs, with a note that they should work but may not, and if one doesn't work, pick one specifically listed. The Radeon Pro 560 was used in the 21-inch 2017 iMac, so a driver would have been present; as such, the GPU in them was already added to macOS.
 
The reality is, NVIDIA says Apple refused to sign their drivers, Apple did not refute that claim, and all the sources spoken to by the press align with NVIDIA's assertion that Apple specifically chose not to sign their drivers on purpose.

If that's not anti-GPU then I don't know what is. If people want to live in the reality distortion field that's their business. When it comes to Apple Silicon Macs not supporting separate GPUs, I understand they never architected the chips to do that; that's a whole other situation, unrelated to them not signing drivers that could have been used on older Intel-based Macs.
I did some research on “Apple nvidiagate” and it was the 8600M that was the problem. It affected more than Apple, as HP and Dell also got sued over it.
Nvidia apparently paid out $10m to Dell to cover legal costs, but left Apple out in the cold and would not accept any responsibility.

This was the beginning of the problem.

Yes, there was radeongate afterwards and the MacBook Pro went back to Nvidia for a while, so clearly the relationship wasn't broken.

From what I can find, Apple wanted to move people to the Metal API, whereas Nvidia wanted to keep CUDA support in its drivers and Apple wanted Metal to be the one developers used. So there was an impasse.

Apple then pulled Nvidia's ability to get drivers signed with Mojave, which froze Nvidia out, leaving no CUDA on the Mac and leaving the Metal API on macOS unchallenged.

Nvidia went public that Apple was not signing the drivers: they have them, but Apple won't sign them.

I suspect that if Nvidia had dropped CUDA from its Mac drivers, Apple would have continued signing them. Though by then the market for add-on cards was dwindling and Apple probably thought, stuff it, we're not going to lose much.

Apple probably also saw with the 7950/680 Mac Editions that many people just bought PC versions and flashed them (guilty as charged), so Nvidia wasn't going to sell many cards. Where is Apple's money in that? At that point only people with Mac Pros would be the market.

That's also likely why MPX was used on the 2019, to make it more difficult to use PC cards.

So not so much anti-Nvidia or anti-GPU, but pro-Metal and not wanting CUDA around. Apple's way or the highway. It is definitely Apple refusing to sign drivers, but why they didn't sign the drivers is important.

I'm not saying it's right or wrong, but if your customer (Apple) says Metal, not CUDA, and Nvidia says no, then, like people who want CUDA and don't buy Apple, the customer went elsewhere, in this case to AMD.

If people are interested, there was also the Microsoft Xbox / Nvidia spat, where Microsoft was not happy that the price it paid for the Nvidia chips in the Xbox didn't drop over time (much like Apple's issue with Nvidia not dropping prices over time) and froze Nvidia out of the Xbox 360.

Sony also had a spat with Nvidia over the PS3. Originally there was no separate GPU, Cell was to be used for graphics, but developers ran into difficulty working with Cell,
so Sony needed a GPU at the last minute and put in a custom 7800-derived GPU, but at a high price. Sony wanted to revisit the deal, and Nvidia was tough: that is the price signed in the contract.

So there has been no Nvidia in the Xbox or PlayStation since, and now that they're SoC-based and Microsoft wants the codebase the same for Xbox/Windows, it's x86 SoCs all the way.

So Nvidia does have a history of upsetting partners as well, so it's not as one-sided as it may initially appear. Though certainly neither side is blameless here.

While Nvidia insists on CUDA, I cannot see them getting back in.

Apple also, I believe, uses tile-based rendering, where the GPU works out which tiles are visible and only renders those, whereas AMD and Nvidia render everything and then display it. I believe that is also another difference on Apple Silicon for AMD or Nvidia to overcome to get back in.

Hopefully that's not Apple koolaid for you.
 
The question is whether that was deliberate support from Apple for the RX 580 card, or simply a consequence of iMacs and MacBook Pros using AMD GPUs, so drivers that worked with them HAD to be part of macOS.

No, Apple produced a specific support document about which RX 580s were supported in the 5,1, detailing as well that users installing them would have to forgo FileVault and boot screens.

It was pretty surprising at the time, because Apple will almost never suggest a significantly degraded user experience as a part of a solution.
 
I did some research on “Apple nvidiagate” and it was the 8600M that was the problem. It affected more than Apple, as HP and Dell also got sued over it.
Nvidia apparently paid out $10m to Dell to cover legal costs, but left Apple out in the cold and would not accept any responsibility.

This was the beginning of the problem.

Yes, there was radeongate afterwards and the MacBook Pro went back to Nvidia for a while, so clearly the relationship wasn't broken.

From what I can find, Apple wanted to move people to the Metal API, whereas Nvidia wanted to keep CUDA support in its drivers and Apple wanted Metal to be the one developers used. So there was an impasse.

Apple then pulled Nvidia's ability to get drivers signed with Mojave, which froze Nvidia out, leaving no CUDA on the Mac and leaving the Metal API on macOS unchallenged.

Nvidia went public that Apple was not signing the drivers: they have them, but Apple won't sign them.

I suspect that if Nvidia had dropped CUDA from its Mac drivers, Apple would have continued signing them. Though by then the market for add-on cards was dwindling and Apple probably thought, stuff it, we're not going to lose much.

Apple probably also saw with the 7950/680 Mac Editions that many people just bought PC versions and flashed them (guilty as charged), so Nvidia wasn't going to sell many cards. Where is Apple's money in that? At that point only people with Mac Pros would be the market.

That's also likely why MPX was used on the 2019, to make it more difficult to use PC cards.

So not so much anti-Nvidia or anti-GPU, but pro-Metal and not wanting CUDA around. Apple's way or the highway. It is definitely Apple refusing to sign drivers, but why they didn't sign the drivers is important.

I'm not saying it's right or wrong, but if your customer (Apple) says Metal, not CUDA, and Nvidia says no, then, like people who want CUDA and don't buy Apple, the customer went elsewhere, in this case to AMD.

If people are interested, there was also the Microsoft Xbox / Nvidia spat, where Microsoft was not happy that the price it paid for the Nvidia chips in the Xbox didn't drop over time (much like Apple's issue with Nvidia not dropping prices over time) and froze Nvidia out of the Xbox 360.

Sony also had a spat with Nvidia over the PS3. Originally there was no separate GPU, Cell was to be used for graphics, but developers ran into difficulty working with Cell,
so Sony needed a GPU at the last minute and put in a custom 7800-derived GPU, but at a high price. Sony wanted to revisit the deal, and Nvidia was tough: that is the price signed in the contract.

So there has been no Nvidia in the Xbox or PlayStation since, and now that they're SoC-based and Microsoft wants the codebase the same for Xbox/Windows, it's x86 SoCs all the way.

So Nvidia does have a history of upsetting partners as well, so it's not as one-sided as it may initially appear. Though certainly neither side is blameless here.

While Nvidia insists on CUDA, I cannot see them getting back in.

Apple also, I believe, uses tile-based rendering, where the GPU works out which tiles are visible and only renders those, whereas AMD and Nvidia render everything and then display it. I believe that is also another difference on Apple Silicon for AMD or Nvidia to overcome to get back in.

Hopefully that's not Apple koolaid for you.
This is all factual and I agree. However, NVIDIA has also used a tile-based rendering system since 2017. Also, it's important to note that the GTX 680 and newer cards did support Metal v1. As I mentioned in a previous post, it's just an abstraction layer for shader calls, no different to what DX12 and Vulkan do.

All of these layers are just abstraction layers to the hardware and thus maintained by the vendor that produces them, be that Microsoft with DX12 or Apple with Metal. The only thing that the card manufacturers need to do is make sure their cards have the features exposed that these abstraction layers need, be that certain shader support, tessellation accelerators, raytracing accelerators, etc.

In the case of Metal 2 and 3 that would likely include the need to do tile-based rendering, but again, NVIDIA has supported that since 2017 and it's their default method of shading pixels at an architecture level.

Realistically, if Apple wanted, they could get Metal 2 and probably Metal 3 working on current-generation NVIDIA GPUs inside of a few days, if they don't already have that done for a just-in-case scenario.
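To make the abstraction-layer point concrete, here is a minimal sketch of a Metal compute dispatch in Swift (the add_arrays kernel and the buffer sizes are made up for illustration). Nothing in the calling code names a GPU vendor; it simply runs on whatever driver backs the default MTLDevice:

```swift
import Metal

// A trivial compute kernel, compiled from source at runtime.
let source = """
#include <metal_stdlib>
using namespace metal;
kernel void add_arrays(device const float *a [[buffer(0)]],
                       device const float *b [[buffer(1)]],
                       device float *out [[buffer(2)]],
                       uint i [[thread_position_in_grid]]) {
    out[i] = a[i] + b[i];
}
"""

let device = MTLCreateSystemDefaultDevice()!   // whichever GPU/driver the OS provides
let library = try! device.makeLibrary(source: source, options: nil)
let pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "add_arrays")!)

let n = 1024
let a = [Float](repeating: 1, count: n)
let b = [Float](repeating: 2, count: n)
let byteCount = n * MemoryLayout<Float>.stride
let bufA = device.makeBuffer(bytes: a, length: byteCount)!
let bufB = device.makeBuffer(bytes: b, length: byteCount)!
let bufOut = device.makeBuffer(length: byteCount)!

let queue = device.makeCommandQueue()!
let cmd = queue.makeCommandBuffer()!
let enc = cmd.makeComputeCommandEncoder()!
enc.setComputePipelineState(pipeline)
enc.setBuffer(bufA, offset: 0, index: 0)
enc.setBuffer(bufB, offset: 0, index: 1)
enc.setBuffer(bufOut, offset: 0, index: 2)
// 1024 threads as 16 threadgroups of 64, which any Metal GPU family can handle.
enc.dispatchThreadgroups(MTLSize(width: n / 64, height: 1, depth: 1),
                         threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
enc.endEncoding()
cmd.commit()
cmd.waitUntilCompleted()

// Every element of bufOut is now 3.0, regardless of whose silicon ran the kernel.
print(bufOut.contents().bindMemory(to: Float.self, capacity: n)[0])
```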

I do fully agree with you that Apple did not want CUDA to become the prominent computing architecture. They wanted OpenCL to win, and later Metal. But nothing really stopped them from doing either of those on NVIDIA cards, and like it or not, NVIDIA has become the de facto standard. Everyone I know that does machine learning, including myself, would only consider NVIDIA GPUs at this point; the other hardware isn't capable enough, and to be honest, CUDA from a software standpoint is leagues ahead. It's so easy to integrate and, more importantly, debug with the IDE you're already used to.

I just don't think Apple cares about that niche, and well, why should they? They're a consumer electronics company; these fringe use cases are a drop in the ocean for their revenues. I'm not at all complaining, because I'm really happy with what other companies (like NVIDIA) are offering for my work. I have no complaints.
 
If that's not anti-GPU then I don't know what is.

Clearly.

https://support.apple.com/en-au/HT208544

So anti-GPU that there's a whole technology in macOS, part of the core OS, which runs even if you don't have Thunderbolt ports, and which exists exclusively for GPUs.

Just not Nvidia, because Nvidia refuses to become a faceless component supplier, and keeps acting like Nvidia users are CUDA users and Nvidia customers, for whom the computer and OS are the elastic, replaceable thing.

Apple wanted a component maker, not a platform builder. AMD have no interest in building software platforms. They are however very interested in being a contract component builder.
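As a small illustration of how that support surfaces to developers, here's a sketch (assuming macOS with Metal available; nothing in it is vendor-specific) that lists every GPU the OS is currently driving and flags the external ones:

```swift
import Metal

// macOS exposes every GPU it can drive, including eGPUs in Thunderbolt
// enclosures, through the same Metal device list.
for device in MTLCopyAllDevices() {
    // isRemovable is true for GPUs attached via an external (eGPU) enclosure.
    let kind = device.isRemovable ? "external (eGPU)" : "built-in"
    print("\(device.name): \(kind)")
}
```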

If people want to live in the reality distortion field that's their business.

Please, were you actually using these machines, and buying GPUs for Macs when this was all happening? Because, you’re talking about it as if it’s all deep history, that you have to look into historical archives to find, whereas I remember all this from when it happened. I remember reading the ATI press release that announced they were releasing a new GPU that would be featured in a new soon-to-be-announced unprecedented form factor G4 machine, and I remember the Macworld launch in the week or so following, where all the machines were Nvidia based, because Steve flipped out at them ruining his surprise.

If you want to find the source of the Nvidia crash issues story, search here, I’m pretty sure it has been mentioned in the past, probably in an argument between MVC and someone advocating for AMD / 2013 Mac Pro.
 
Clearly.

https://support.apple.com/en-au/HT208544

So anti-GPU that there's a whole technology in macOS, part of the core OS, which runs even if you don't have Thunderbolt ports, and which exists exclusively for GPUs.

Just not Nvidia, because Nvidia refuses to become a faceless component supplier, and keeps acting like Nvidia users are CUDA users and Nvidia customers, for whom the computer and OS are the elastic, replaceable thing.

Apple wanted a component maker, not a platform builder. AMD have no interest in building software platforms. They are however very interested in being a contract component builder.

To me, locking out 50% of the GPU market is anti-GPU, yeah. It doesn't matter to me what NVIDIA's ambitions were and how they clashed with Apple's. That's my opinion on that topic.

Please, were you actually using these machines, and buying GPUs for Macs when this was all happening? Because, you’re talking about it as if it’s all deep history, that you have to look into historical archives to find, whereas I remember all this from when it happened. I remember reading the ATI press release that announced they were releasing a new GPU that would be featured in a new soon-to-be-announced unprecedented form factor G4 machine, and I remember the Macworld launch in the week or so following, where all the machines were Nvidia based, because Steve flipped out at them ruining his surprise.

I remember the same press release and how Steve got pissed and pulled the deal. It's not at all news to me, I was there reading the articles at the same time as you.

I had dual 30" Cinema Displays, and I had to buy the NVIDIA 6800 Ultra with the dual-link DVI connectors to get 2560x1600 at 60Hz on them on a PowerMac G5. I also had a 12" iBook with, I want to say, an ATI Mobility Radeon 9200. I was there for all that. I don't know what this has to do with the topic, mind.

If you want to find the source of the Nvidia crash issues story, search here, I’m pretty sure it has been mentioned in the past, probably in an argument between MVC and someone advocating for AMD / 2013 Mac Pro.

I'm just asking for a source of the rationale you gave that Apple decided to stop signing NVIDIA's drivers because they caused systems to crash. I don't think the onus should be on me to prove your point for you.
 
To me, locking out 50% of the GPU market is anti-GPU, yeah. It doesn't matter to me what NVIDIA's ambitions were and how they clashed with Apple's. That's my opinion on that topic.

Well, so long as you’re only representing it as your opinion, that’s fine. Clearly, Apple was not “anti-GPU”, because they produced a whole technology stack specifically dedicated to user-upgraded retail GPUs, along with the documentation for them, WWDC presentations, etc.

They just didn’t invite Nvidia to the party.

You may as well say Apple are “anti CPU” because they didn’t bless AMD CPUs to run macOS, given they’re the other big CPU player.

I'm just asking for a source of the rationale you gave that Apple decided to stop signing NVIDIA's drivers because they caused systems to crash. I don't think the onus should be on me to prove your point for you.

Go search for references to Nvidia here (nowhere else on the Internet would be wasting time arguing over Nvidia Vs. Apple), it shouldn’t take more than a week to go through them - I’m sure you’ll find the thread, presuming it didn’t get pruned for degrading into a flame war.
 
Well, so long as you’re only representing it as your opinion, that’s fine. Clearly, Apple was not “anti-GPU”, because they produced a whole technology stack specifically dedicated to user-upgraded retail GPUs, along with the documentation for them, WWDC presentations, etc.

They just didn’t invite Nvidia to the party.

You may as well say Apple are “anti CPU” because they didn’t bless AMD CPUs to run macOS, given they’re the other big CPU player.

That's right, it's my opinion, as yours is your opinion.

Blocking half of the GPU market is anti-GPU. Writing a guide for people showing them how to install a third-party graphics card and use it with macOS with the least amount of issues is pro-GPU. Both things can be true, it's not always black and white.

Just like if Microsoft were to block a specific games company from releasing games on the Xbox: that would be anti-gaming, even though they allow all these other games from all these other developers and publishers, which would be considered pro-gaming. Both things can be true. Apple has not been a great pro-GPU citizen; they've more often than not been anti-GPU in their moves. That's my opinion on the matter.

Go search for references to Nvidia here (nowhere else on the Internet would be wasting time arguing over Nvidia Vs. Apple), it shouldn’t take more than a week to go through them - I’m sure you’ll find the thread, presuming it didn’t get pruned for degrading into a flame war.

Uhuh, okay.
 
It amuses me how that has turned into a meme that people think is ironic because we're now arguing about whether 192,000,000k is enough. It was originally a reference to the IBM PC only supporting 640k of RAM whereas the 8086 could support a whole megabyte of RAM. It wasn't even that stupid a decision in 1981 when 64k was a lot of RAM - no more "stupid" than the original Mac only supporting 128k - and only became an issue because IBM managed to lock the PC world into an already obsolete late-70s PC design with a stop-gap processor lacking up-to-date memory management. It really became irrelevant as soon as PCs were able to use protected mode (or, better, just not use x86 - like the Mac).


I don't need or want a unicorn - but you're welcome to have one provided you can find someone to create one for you.

Good luck with that.

The reason you can't have an Apple Silicon Mac with 1TB of RAM and 8 full-bandwidth PCIe slots with NVIDIA 4090 support (which not even the 2019 Mac Pro had - some slots still had to share) is that Apple Silicon doesn't do that.

Apple Silicon is hugely successful at supporting useful and distinctive products from the iPad and Vision Pro up to the Mac Studio Ultra with only two basic die designs that are themselves developments of mass-market iPhone core designs - and that's where Apple makes most of their (non-iPhone) money. Those applications take advantage of the all-in-one SoC, the efficiency of unified RAM and the high performance (by integrated GPU standards) of the integrated GPU, and it absolutely slays in anything that takes advantage of the media/neural engines. It is not designed to compete with Xeon-W - let alone Threadripper. Even the hypothetical M2 Extreme doesn't close that gap - and the mythical M3 isn't going to improve things by the required order of magnitude.

Apple could sink a billion or so into developing a new die with ARM ISA, 128 PCIe 5 lanes and support for 2TB of external ECC RAM, maybe they could even bury the hatchet with NVIDIA (who would be delighted to help Apple become a competitor in the ARM heavy metal market) so you could plug in a pair of 4090s... and they'd have... well, a Me Too workstation tower that performed about as well as any other system with dual 4090s and 2TB of ECC DDR5 (if NVIDIA did a good job of writing the drivers). Of course, it would have nice, cool-running, low-power ARM processor which you'd barely notice alongside the 1kW being kicked out of those GPUs. That's assuming that everybody was fine with not being able to run x86 code. They'd then have to recoup those development costs from a small and shrinking pool of people who need that level of power and can't change their MacOS-based workflows to take advantage of the far wider choice of workstation-class PC hardware that is already out there or just rent what they need in the cloud. It would cost a fortune - bear in mind that Intel can get away with charging $8000 for just the CPU that the higher-end 2019 Mac Pro uses, and they have a far larger market over which to spread their development costs.

...so, basically, it's simply not in Apple's economic interest to sell you your unicorn. Instead, they're offering you a pony with a pointy seashell glued to its head - AKA the 2023 Mac Pro - which I agree is a bit of a silly charade. There's definitely a niche for a Mac Studio Ultra with PCIe slots - and what they're offering is a considerable improvement over a PCIe enclosure (16 PCIe 4 lanes shared with 64 lanes worth of slots c.f. typically 4 PCIe 3 lanes shared between 32 lanes-worth of slots, plus less clutter) - and I'm sure the development costs were a fraction of what it would cost to design a new die just for the MP. It still seems ridiculously expensive - but economies of scale are a big thing with electronics and I don't see these flying out of the shops (plus, they've kept the grotesquely over-engineered 2019 case). OTOH, some people would pay $7000 for a handbag, so...
 
It is just so incredibly stupid that Apple couldn't add some DDR5 slots, much slower than the on-package SoC RAM, for extended memory. This isn't an unsolvable problem like the lack of PCIe GPU support; it's just something they refused to do, despite it being the ONE thing some audio pros really need, and the one thing that would have made this new Mac an actual "pro" machine.

PCIe slots can be replaced by Thunderbolt devices, despite what so many people claim: audio hardware has largely moved on to Thunderbolt solutions, and there are always external Thunderbolt PCIe cages if you want to keep using non-Thunderbolt devices. But RAM for sample libraries is actually something that's important for some people, and there's no easy way around that. Some computers since the '80s have had different banks of RAM that function at different speeds or have more direct access than other memory... this really is just negligence and denial on the part of Apple, and it's going to lead to the Mac Pro line eventually being discontinued because it won't be taken seriously anymore.

Crying about the lack of Nvidia cards is a waste of time, as that ship sailed LONG ago... the RAM situation is the real problem, and it would have been so simple for that not to be the case. What a joke.
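To put rough numbers on "much slower" (using Apple's advertised 800 GB/s of unified-memory bandwidth for the M2 Ultra, and assuming, purely as an example, a dual-channel DDR5-4800 configuration at about 38.4 GB/s per channel):

\[
\frac{800\ \text{GB/s}}{2 \times 38.4\ \text{GB/s}} \approx 10\times
\]

A second memory tier that's an order of magnitude slower would arguably still be fine for streaming sample libraries, where capacity matters far more than bandwidth.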
 