
Lord Blackadder

macrumors P6
May 7, 2004
15,678
5,511
Sod off
I find myself constantly going back and forth on my assessment of Apple's pro tower line and where they're taking it... I recently retired my Mac Pro 5,1 - I simply don't have the time to upgrade the firmware and GPU and run through the hassles of keeping it going without it being a security risk. After 4 decades of having an Apple as my primary day-to-day machine, I am building a PC. Ugh - mostly this was driven by professional requirements. But I've been using both Windows and Mac OS since the 90s, so it's not like I'm 'switching.'

But the bottom line for me is that upgradeable Mac towers used to be within my reach; now they are too expensive. I don't think I'll ever be able to afford a new Mac desktop with upgradeable internals, IF Apple even keeps building them. And Apple end-of-lifing my 5,1 was a kick in the teeth I won't soon forget. I had a lot of money in that box.

The counterargument to that, of course, is that Pro Macs have always been very expensive, and my 5,1 was actually a tremendous deal....the Mac IIci I still fire up and play with occasionally cost over $6,000 in 1989 (not adjusting for inflation), after all....that's solid used car money even today.
 
Last edited:

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
Not familiar with that. What was the exact offer and is there a reference of sorts?
I doubt Apple or Nvidia ever publicly commented, but my assumption has always been that Apple doesn’t want CUDA on macOS, as pro software developers would default to using that over Metal. Both killing Metal adoption AND making Apple perpetually dependent on Nvidia GPUs. Nvidia refused to leave CUDA out of their drivers, and there we are. And without CUDA, Nvidia GPUs lose much of their advantage anyway.

I’m sure Nvidia are douches, but I doubt that’s ultimately relevant to the ‘rift’. Doesn’t hurt that AMD are likely much easier to strong-arm into volume discounts, either.
 
  • Like
Reactions: maikerukun

Lord Blackadder

macrumors P6
May 7, 2004
15,678
5,511
Sod off
I’m sure Nvidia are douches, but I doubt that’s ultimately relevant to the ‘rift’. Doesn’t hurt that AMD are likely much easier to strong-arm into volume discounts, either.

Nobody comes out of this looking good. But ultimately Nvidia's current dominance is bad for the consumer. We need at least two (preferably more) roughly equal GPU options at all performance levels, or else the dominant player starts becoming a tyrant.
 
  • Like
Reactions: maikerukun

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
Not familiar with that. What was the exact offer and is there a reference of sorts?

You can get a telling of the story here:


Salient part is this:

Adobe embraced CUDA even on macOS and thus earned CUDA a favored position among creative professionals, especially those using the Adobe Suite. Adobe went as far as to build CUDA specific applications for Nvidia GPUs.

...

Many Mac professionals invested in Nvidia hardware as AMD's offerings generally paled against Nvidia at the higher end, and CUDA offered a lot more performance in Adobe video applications like Premiere Pro and After Effects.

...

CUDA represented a significant problem for Metal adoption. In order to get professional applications on board with Metal, they had to cut out CUDA, and my guess is that NVIDIA was not willing to give up CUDA in its driver.

So the offer is clearly "knife CUDA for macOS, and only ship an OpenCL / Metal driver for your cards on macOS". Given Nvidia had publicly stated they had a working Metal driver, but Apple refused to authorise their developer certificate from 10.14 onward, it's not a stretch to make the connection that Apple believed Nvidia could always offer CUDA as an aftermarket download, or make a CUDA library something developers could implement directly within their apps.

The simple fact is Metal wasn't good enough to stand on its own merits, and as is Apple's standard operating procedure, when they have a new tech they want used, their first action is to shield it from competitive pressures, to juice its adoption curve*, by hobbling its competition.

*This is why you don't get security updates to older iOS devices that are capable of running newer versions of iOS.
 

AndreeOnline

macrumors 6502a
Aug 15, 2014
704
495
Zürich
I doubt Apple or Nvidia ever publicly commented, but my assumption has always been that Apple doesn’t want CUDA on macOS, as pro software developers would default to using that over Metal.
The train for CUDA left the station a looooong time ago, but we did have support for it.

Metal is what Macs will use going forward, so there is no room for CUDA today. Now it's more a question of Apple or AMD developing dedicated hardware to close the gap that Nvidia has created—using Metal.
 
  • Like
Reactions: maikerukun

AndreeOnline

macrumors 6502a
Aug 15, 2014
704
495
Zürich
So the offer is clearly "knife CUDA for macOS, and only ship an OpenCL / Metal driver for your cards on macOS", given Nvidia had publicly stated they had a working Metal driver.
It seems like an outside view with personal anecdotes and speculation. That's all fine—many have given their own take.

I don't think "the offer is clearly..." since there simply is no real evidence or information.

But the period when Nvidia went public and said they had a driver ready... that was waaay later, long after things had gone sour in the talks between them and Apple.

Personally I think it's all down to personal issues among high level representatives (perhaps even the highest-ups in Apple) after Nvidia delivered semi-faulty components and later wouldn't own up to it.

There's something in all of this that gives me the feeling that Nvidia isn't a company that's "all about doing the right thing". But that is just my speculation.
 

edanuff

macrumors 6502a
Oct 30, 2008
578
259
Meanwhile they're secure in the knowledge that people using Nvidia GPUs in their machines are Nvidia customers, not HP, Lenovo, Puget etc.

And that’s probably the main reason why Apple will never put in Nvidia. They have no interest in just being the most expensive case you can put your Nvidia card into.
 
  • Like
Reactions: maikerukun

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
And that’s probably the main reason why Apple will never put in Nvidia. They have no interest in just being the most expensive case you can put your Nvidia card into.

Correction - no interest in competing to be the most performant operating system to put your Nvidia card into.
 
Last edited:
  • Like
Reactions: mode11

AndreeOnline

macrumors 6502a
Aug 15, 2014
704
495
Zürich
And thanks to this forum I easily upgraded to Windows 11. It runs flawlessly – Win 11 is super smooth.
It's been a while since I last tried Windows and maybe I should just install it again to see what's what.

Last time I thought about it was after Windows 11 was out, but the recommendation from many was to go with Windows 10. Has that changed now?

Is Windows 11 considered to be the way to go? Asking specifically for use on a Mac Pro 2019 in this case.
 
  • Like
Reactions: maikerukun

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
There's something in all of this that gives me the feeling that Nvidia isn't a company that's "all about doing the right thing". But that is just my speculation.

Sure, but if you look at the bigger picture, it's pretty clear that Apple's goal was to make Nvidia a dumb pipe to Metal, so they could be seamlessly replaced by AMD whenever Apple plays the two off against each other as anonymous component suppliers. That's a crap business to get into for the dominant player in the GPU industry, especially when it requires them to focus on a second-rate software stack, when they can do better themselves.
 

AndreeOnline

macrumors 6502a
Aug 15, 2014
704
495
Zürich
Sure, but if you look at the bigger picture, it's pretty clear that Apple's goal was to make Nvidia a dumb pipe to Metal, so they could be seamlessly replaced by AMD when Apple plays them off against each other, as just being anonymous component suppliers.
I can't say that I see that. What would lead to that conclusion, apart from speculation from others who think that?

I can see that Apple wanted to have Nvidia GPUs in their computers up to a point and then things went sideways.
 

smckenzie

macrumors member
May 7, 2022
97
106
It's been a while since I last tried Windows and maybe I should just install it again to see what's what.

Last time I thought about it was after Windows 11 was out, but the recommendation from many was to go with Windows 10. Has that changed now?

Is Windows 11 considered to be the way to go? Asking specifically for use on a Mac Pro 2019 in this case.
Prior to getting my 7,1 a year ago I hadn't touched Windows since XP, I think. When I tried Win 10 via Boot Camp I can't say I loved it - seemed like not much had changed, and I still hated Windows Explorer with a passion - but I did notice how "snappy" it was. I didn't think Win 11 was possible on a 7,1 until I found a post on here with a link to a utility to install it unattended. The install went without a hitch and I don't mind the OS, honestly.

That being said I spend 90% of my time in a single app with some venturing out into Explorer and web browser. No email or anything like that but your mileage may vary :)
 

smckenzie

macrumors member
May 7, 2022
97
106
Mind you, I think I'll have to find a keyboard re-mapper, as using Ctrl instead of Command might be a bridge too far ;)
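For anyone curious, and not endorsing any particular remapper: Windows has a built-in, machine-wide remapping mechanism via the "Scancode Map" registry value, which is what many of those utilities write for you under the hood. Below is a minimal Python sketch of the idea - my own illustration, not a tool from this thread - that swaps Left Ctrl and Left Win so the key in the Command position acts as Ctrl. It assumes you run it as administrator, and it only takes effect after a sign-out or reboot.

```python
import struct
import winreg  # Windows-only standard library module

# Scancodes: Left Ctrl = 0x001D, Left Win = 0xE05B (an "extended" key).
LCTRL, LWIN = 0x001D, 0xE05B

# Scancode Map layout: version DWORD, flags DWORD, entry-count DWORD
# (number of mappings + 1 for the null terminator), then one DWORD per
# mapping: low word = scancode to emit, high word = physical key to remap.
mappings = [
    (LCTRL, LWIN),   # physical Left Win now behaves as Ctrl (Command position)
    (LWIN, LCTRL),   # physical Left Ctrl now behaves as the Win key
]
blob = struct.pack("<3I", 0, 0, len(mappings) + 1)
for emit, physical in mappings:
    blob += struct.pack("<2H", emit, physical)
blob += struct.pack("<I", 0)  # null terminator entry

key_path = r"SYSTEM\CurrentControlSet\Control\Keyboard Layout"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path, 0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "Scancode Map", 0, winreg.REG_BINARY, blob)

print("Scancode Map written - sign out or reboot for it to take effect.")
```

Deleting the "Scancode Map" value (and rebooting) undoes the swap.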
 

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
Tell me about it. I'm constantly hitting the Windows key when using PCs at work. It seems to be just where I expect the Alt key to be when using Maya.
 

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
I can't say that I see that. What would lead to that conclusion, apart from speculation from others that thinks that?
I guess it ultimately doesn't matter. As Mac users, we have no choice anyway and must take what Apple give us (alas, we're not "computer users"). AMD likely just want the business more, and cut a better deal.

One thing is for certain - Apple didn't choose AMD because they make the best GPUs.
 

smckenzie

macrumors member
May 7, 2022
97
106
Tell me about it. I'm constantly hitting the Windows key when using PCs at work. It seems to be just where I expect the Alt key to be when using Maya.
Like Ctrl + E or Ctrl + R is way too much finger gymnastics for me.
 

innerproduct

macrumors regular
Jun 21, 2021
222
353
Just watched a test and overclock of the new Xeons.
So, 400 watts to 1,000 watts. Those are insane numbers. Remember that this is without any graphics cards etc.
I wonder when the 96-core Threadripper 7000 series comes along to annihilate this platform 😂.
 
  • Sad
Reactions: PineappleCake

PineappleCake

Suspended
Feb 18, 2023
96
252
OK, OK, now that the reviews are out I am having doubts. Peak 1,000 watts for the 56-core CPU is insane, ugh. If only Apple would support AMD cards on the 8,1. Their CPUs are super efficient.
 
Last edited:

PineappleCake

Suspended
Feb 18, 2023
96
252
Wow, the new Xeons are bad!!!

140 W idle for the 56-core Xeon vs 47 W idle for the 64-core AMD TR Pro, and the worst part is the AMD TR Pro is faster, cooler and cheaper than the new Xeons. Yeah, I am gonna say it: Intel dropped the ball, huge.

Keep in mind that the AMD TR Pro is 2 years old.


[Attached charts: CPU package power, Xeon w7-3465 vs W-3365 vs Threadripper Pro 5975WX; V-Ray results, Xeon W-3400; Blender results, Xeon W-3400]

Intel is truly DEAD. Now looking forward to Zen 4 TR Pro.




Now kinda glad that Apple ditched Intel.
 
Last edited:
  • Like
Reactions: ZombiePhysicist

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
I can't say that I see that. What would lead to that conclusion, apart from speculation from others that thinks that?

Because that's the way Apple treats ALL its parts suppliers. When they were using multiple suppliers of baseband radios in the iPhone, there was a huge hullabaloo over whether you got the (IIRC) Intel or Qualcomm version, because (again IIRC) the Intel version had significantly worse wireless performance.

I can see that Apple wanted to have Nvidia GPUs in their computers up to a point and then things went sideways.

Right, but unless Apple has the emotional maturity of a Taylor Swift song (and there's certainly a good argument to be made from Apple's behaviour that they're perhaps not even that emotionally mature), things going sideways is just "we stop shipping your product in our product". It's not "we banish your product from our product ecosystem, and make sure none of our customers can use it either".

Clearly, CUDA was a strategic threat to OpenCL, and then to Metal (and at a higher level, that made Nvidia a threat to Apple directly, as people using Nvidia GPUs tend to be more loyal to Nvidia than to their computer supplier). And Apple has never been able to compete on technological merits; fundamentally they are a cowardly & insecure company whose response to competition is to neutralise it with (as is increasingly being established) illegal anticompetitive tactics, rather than let the competition play out on the merits.
 
  • Like
Reactions: Oculus Mentis