
ZombiePhysicist

Suspended
May 22, 2014
2,884
2,794
https://www.macrumors.com/2023/06/11/apple-exec-discusses-mac-pro-lack-of-egpu-support/

I think the answers to our questions are in this interview. And the future is not so promising and bright for those of us who have relied on Apple solutions for professional needs. Coming events cast their shadows before them, and I think we are too blind to see what's going on at Apple's side.

I kind of agree with this but kind of do not. The apology tour for the trashcan was pretty intense. The 7,1 was pretty expandable, as if they had learned their lesson. Either the 8,1 is a sardonic reply of 'here are your slop cards, we're going to pretend this is what you want but know it isn't', or, perhaps better, 'we crapped the bed with the M2 Extreme and this is a stopgap; we're feverishly trying to fix these things with the next iteration'.

Of course I'm praying it's the latter case. But you may be right. Tough to tell. Their answer, basically 'sorry, go f' yourselves on renting cloud space', was really tone-deaf and outright antagonistic rather than apologetic. And this should be the "all apologies" Mac, because one thing it ain't is a Mac Pro.
 
  • Like
Reactions: backtopoints

backtopoints

macrumors newbie
Dec 9, 2022
18
40
I kind of agree with this but kind of do not. The apology tour for the trashcan was pretty intense. The 7,1 was pretty expandable, as if they had learned their lesson. Either the 8,1 is a sardonic reply of 'here are your slop cards, we're going to pretend this is what you want but know it isn't', or, perhaps better, 'we crapped the bed with the M2 Extreme and this is a stopgap; we're feverishly trying to fix these things with the next iteration'.

Of course I'm praying it's the latter case. But you may be right. Tough to tell. Their answer, basically 'sorry, go f' yourselves on renting cloud space', was really tone-deaf and outright antagonistic rather than apologetic. And this should be the "all apologies" Mac, because one thing it ain't is a Mac Pro.
Unfortunately it is all clear, dear ZombiePhysicist. They are simply saying "Take it or leave it!". They know very well that the GPU side of the SoC is no match for a 4080/4090 on most GPU-based rendering tasks. I don't think this is different with the M2 Ultra. No need to talk about the RAM; you know how much RAM we need for simulation-based tasks. They all know that better than us, and now they are saying WE DO NOT CARE ABOUT YOU! and go and build a workstation for yourself. Unfortunately this is the end of the road.
 
  • Wow
Reactions: ZombiePhysicist

jmho

macrumors 6502a
Jun 11, 2021
502
996
It's times like this that Apple's secrecy is incredibly annoying.

We don't know if their "give me SoCs or give me death" attitude is because they truly believe that it's the future and that they're playing the long game where they'll eventually be able to dominate the GPU space, or if they literally just don't care about pros and think that the future of computing is going to be strapped to your face.
 
  • Haha
Reactions: OS6-OSX

Longplays

Suspended
May 30, 2023
1,308
1,158
It's times like this that Apple's secrecy is incredibly annoying.

We don't know if their "give me SoCs or give me death" attitude is because they truly believe that it's the future and that they're playing the long game where they'll eventually be able to dominate the GPU space, or if they literally just don't care about pros and think that the future of computing is going to be strapped to your face.
Economies of scale and better margins.

Apple's leveraging their SoC tech and applying it anywhere it can be used.

Would you want to hazard a guess how many Mac Pros get shipped annually?
 

jmho

macrumors 6502a
Jun 11, 2021
502
996
Yeah that's all well and good, but I'm talking about how the relative GPU performance in these SoCs is going to evolve over time.

Currently Apple is sort of able to cover the lower half of nVidia's card spectrum with their M/Pro/Max/Ultra range.

My question is, when we're on say the M5 and Apple are still only putting out M/Pro/Max/Ultras, does Apple expect us to still be getting "kinda good laptop GPU performance", or do they expect that by M5 the Ultra is actually going to be able to destroy XX90 cards in real world use and we're all going to be lording it over PC users with our SoC based computers?
 

Longplays

Suspended
May 30, 2023
1,308
1,158
Yeah that's all well and good, but I'm talking about how the relative GPU performance in these SoCs is going to evolve over time.

Currently Apple is sort of able to cover the lower half of nVidia's card spectrum with their M/Pro/Max/Ultra range.

My question is, when we're on say the M5 and Apple are still only putting out M/Pro/Max/Ultras, does Apple expect us to still be getting "kinda good laptop GPU performance", or do they expect that by M5 the Ultra is actually going to be able to destroy XX90 cards in real world use and we're all going to be lording it over PC users with our SoC based computers?

Just wait for the non-synthetic benchmarks.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
Yeah that's all well and good, but I'm talking about how the relative GPU performance in these SoCs is going to evolve over time.

Currently Apple is sort of able to cover the lower half of nVidia's card spectrum with their M/Pro/Max/Ultra range.

My question is, when we're on say the M5 and Apple are still only putting out M/Pro/Max/Ultras, does Apple expect us to still be getting "kinda good laptop GPU performance", or do they expect that by M5 the Ultra is actually going to be able to destroy XX90 cards in real world use and we're all going to be lording it over PC users with our SoC based computers?

What's really worrying about the Talk Show Live video is the part where they talk about GPU performance, and Joz basically says (and I'm paraphrasing here) that they aren't trying to beat Nvidia.

Beating Nvidia is the entire workstation game right now. With everything going GPU - you have to beat, or at least compete with Nvidia. You can try with cards, you can try with SoCs, you can try with hamsters doing long division inside the computer... but that's where almost everything is. And AI is going to make that worse.

So I'm hoping that's just marketing covering Apple's ass (because again, it seems like there were engineering difficulties with this machine.) Because otherwise that's really concerning. And that's not even a card vs SoC thing.

It's also concerning because so much of Apple Silicon has been about performance. So it's bizarre that now you get to the top end and Apple is suddenly seemingly saying "We're not in this for performance."
 

ZombiePhysicist

Suspended
May 22, 2014
2,884
2,794
What's really worrying about the Talk Show Live video is the part where they talk about GPU performance, and Joz basically says (and I'm paraphrasing here) that they aren't trying to beat Nvidia.

Beating Nvidia is the entire workstation game right now. With everything going GPU - you have to beat, or at least compete with Nvidia. You can try with cards, you can try with SoCs, you can try with hamsters doing long division inside the computer... but that's where almost everything is. And AI is going to make that worse.

So I'm hoping that's just marketing covering Apple's ass (because again, it seems like there were engineering difficulties with this machine.) Because otherwise that's really concerning. And that's not even a card vs SoC thing.

It's also concerning because so much of Apple Silicon has been about performance. So it's bizarre that now you get to the top end and Apple is suddenly seemingly saying "We're not in this for performance."

MacRumors covered the first statement:
"Fundamentally, we've built our architecture around this shared memory model and that optimization, and so it's not entirely clear to me how you'd bring in another GPU and do so in a way that is optimized for our systems," Ternus told Gruber. "It hasn't been a direction that we wanted to pursue."

But I find the second statement worse. It was basically 'f' off and go cloud yourself, we don't give a f'', starting around the 22:40 marker:

Gruber: "Is the idea they're best off renting GPUs on the cloud..."
Joz: "We have our strengths, they have theirs. They're doing a good job. You know, great for them, but we have stuff no one else has, too."

Yea, like 7 slots where if you put in 1 card there is no more bandwidth left for the other 6. Like the lack of 'exotic' error-correction memory. The inability to employ any graphics cards in your PCI slots. Processors that cannot compete with mid-level gaming PCs. Graphics performance that is last-gen and not upgradable. RAM that is not upgradable. "your strengths"... :rolleyes:
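To put rough numbers on the slot complaint - the per-lane figure is from the PCIe 4.0 spec, but the idea that the slots share roughly one x16 link of uplink to the SoC is my assumption for illustration, not a published Apple number:

Code:
// PCIe 4.0 back-of-the-envelope (per-lane rate is spec; uplink width is an assumption)
let gen4PerLane  = 1.97           // GB/s per lane, per direction (PCIe 4.0, 16 GT/s)
let uplinkLanes  = 16.0           // ASSUMPTION: slots share roughly one x16 link to the SoC
let sharedUplink = gen4PerLane * uplinkLanes   // ~31.5 GB/s for all slots together

let oneX16Card = gen4PerLane * 16.0            // ~31.5 GB/s demand from a single busy x16 card
print("Shared uplink:", sharedUplink, "GB/s; one busy x16 card:", oneX16Card, "GB/s")
// Under that assumption, one saturating x16 card can consume essentially the
// whole uplink, leaving the other slots to contend for leftovers.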
 

theluggage

macrumors G3
Jul 29, 2011
8,015
8,448
Beating Nvidia is the entire workstation game right now.
People here seem to be struggling with denial: Apple aren't interested in the workstation game any more.

The new MP is for people who need a Mac Studio that can take specialist storage, networking, digitisers etc. that need x8 or x16 PCIe slots. All but a few Apple die-hards who need NVIDIA-class GPUs have already left the building.

Apple doesn't care if you train your AIs and pre-render your 3D graphics on AMD/NVIDIA big iron in the server room or on the cloud (which is where that sort of thing is heading, and where Apple doesn't have a horse in the race against NVIDIA, AWS et al.) because where all those services will have to be delivered is to mobile devices, laptops and SFF computers that need to be small, light and low power - which is what Apple Silicon was designed for, and where Apple do actually have an edge over the competition.

I'm slightly skeptical as to whether, in a year's time, we'll all be walking around wearing Apple Goggles - but that's Apple's bet for the next iPhone - and people sure as heck aren't going to be walking around with freaking 4090s or Grace Hoppers strapped to their faces! Meanwhile, people most certainly are buying iPhones, iPads and Macs to access cloud services which haven't been hosted by Apple kit since... ever.
 

theluggage

macrumors G3
Jul 29, 2011
8,015
8,448
But I find the second statement worse. It was basically 'f' off and go cloud yourself, we don't give a f'', starting around the 22:40
People aren't going to f off to the cloud because Apple tells them to. They're going to f off to the cloud (if they haven't already) because it's going to rapidly become the better solution for future developments (...such as training AI where all the training data already lives in the cloud, or where you're collaborating with a dozen colleagues working from home, or where you only need the computing power 3 months of the year, or...)
 
  • Like
Reactions: Adult80HD

theluggage

macrumors G3
Jul 29, 2011
8,015
8,448
Yea, like 7 slots where if you put in 1 card there is no more bandwidth left for the other 6.
Like the lack of 'exotic' error-correction memory. The inability to employ any graphics cards in your PCI slots. Processors that cannot compete with mid-level gaming PCs. Graphics performance that is last-gen and not upgradable. RAM that is not upgradable.
...fixing which would require Apple to either (a) sink gigabucks into developing a dedicated processor for just their smallest-selling model of computer or (b) keep supporting an x86/PCIe graphics system which would be exactly as powerful as whatever Intel/AMD/NVIDIA were selling for the competition to put in cheap tin boxes.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
People here seem to be struggling with denial: Apple aren't interested in the workstation game any more.
In case I didn't make it clear in my first comment - there are implications beyond the workstation market. If I follow Apple's logic to its extreme - I shouldn't buy anything but a 13" MacBook Air. If I should just move to the cloud for any performance tasks, why buy Macs based on performance at all?

It even makes their Mac gaming push a little weird. You have part of the company pushing gaming locally on Macs. And then you have Joz basically going "Well we're not here to compete with Nvidia's streaming services." So... they're not going to compete with GeForce Now? Why buy Mac games when they're already ceding that ground to GeForce Now?

(And I know the Mac Pro isn't a gaming machine, I just mean in general.)

The whole "we're not here for performance" thing is super bizarre.

Maybe the nicest way out is they are talking about performance for specific workflows and specialized silicon. It's just... super weird.
 

Fnowjohn

macrumors newbie
Jun 11, 2023
3
2
Apple could have thrown out some TB controllers from the die and put bigger PCIe v4 ones in, but they'd have fewer than 8 TB ports.

Two x16 PCIe v4 controllers is going to be more die area, not the area they've got now for the TB subsystems.

There are no 'spare' Thunderbolt connections!

You are trying to throw away information you already have. TB controllers are consumed, so they are not it.
I have been suggesting this solution since the Mac Studio was first announced: most of us would be happy to sacrifice a Thunderbolt port (or two) to gain a fully fledged PCIe gen 5 slot, or dual industry-standard M.2 slots.

Apple seems intent on wholly kneecapping the Apple silicon lineup. What is the point of an 800GB/s bus when the solitary internal SSD can only handle 7GB/s at best, and the Thunderbolt ports only half that each?
There is no way to keep the CPU and GPU fully saturated with data. I suppose that is why Apple silicon only clocks out at a relatively slow rate, and at such low temps. The board architecture fails to deliver the raw data those chips need to really perform.
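Rough numbers for that mismatch (ballpark public figures, not measurements; the 192GB working set is just an example):

Code:
// Rough feeding-the-SoC arithmetic (approximate public figures, not measurements)
let memoryBW = 800.0   // GB/s - M2 Ultra unified memory (spec-sheet ballpark)
let ssdBW    = 7.0     // GB/s - internal SSD, best case
let tb4BW    = 3.0     // GB/s - usable data rate of one Thunderbolt 4 port

let workingSetGB = 192.0  // example: a working set the size of max RAM

print("From RAM-speed source:", workingSetGB / memoryBW, "s")  // ~0.24 s
print("From internal SSD:    ", workingSetGB / ssdBW, "s")     // ~27 s
print("From one TB4 device:  ", workingSetGB / tb4BW, "s")     // ~64 s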
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
It's times like this that Apple's secrecy is incredibly annoying.

We don't know if their "give me SoCs or give me death" attitude is because they truly believe that it's the future and that they're playing the long game where they'll eventually be able to dominate the GPU space, or if they literally just don't care about pros

Apple is playing the long game. That really, really, really shouldn't be a mystery. The hardware guy in charge of Vision Pro was on Gruber's 'WWDC meet the executives' show this year. He was also on the show in 2018, when the big updates to ARKit were emphasized at WWDC 2018.

Apple killed off 32-bit apps in preparation for the transition (a year, or two?, before) - a side effect being the nuking of lots of non-Metal OpenGL/CL apps.
The T2 was a 'transition' chip for multiple years in Apple's last-gasp Intel products (2017-2020).
Apple deprecated OpenGL/OpenCL and won't touch Vulkan with a ten-foot pole. 'Metal or nothing' within a few years was basically announced.
Day one of the transition, the new Metal features session's GPU coverage scorecard read: macOS on Intel 2 (Intel, AMD), macOS on Arm 1 (just the Apple GPU, all by its lonesome).


It really isn't about not caring about pros. It is more about running the platform like the gaming consoles: offer an environment with only a limited subset of GPUs to target, so developers can spend more time optimizing rather than porting to minor forks. Metal is a thinner-layer API (relative to OpenGL). To get the most out of the calls, the application needs to know something about the GPU.
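A tiny sketch of what that 'know the GPU' targeting looks like in practice - the Metal calls are real API, the tuning branches are hypothetical:

Code:
import Metal

// Console-style targeting: with only a handful of GPU families to care about,
// an app can branch on the family and ship tuned paths instead of a
// lowest-common-denominator abstraction. (The branches are hypothetical.)
let device = MTLCreateSystemDefaultDevice()!

if device.supportsFamily(.apple8) {          // M2-generation GPUs
    print("M2-class tuned path")
} else if device.supportsFamily(.apple7) {   // M1-generation GPUs
    print("M1-class tuned path")
} else {
    print("Intel-era fallback (Intel/AMD GPU)")
}
print("Unified memory:", device.hasUnifiedMemory)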

It is like the old boss quote about the "don't bother optimizing on PC ... those users will just throw more hardware at it" mindset. Apple isn't doing that. That is really orthogonal to their attitude toward pro end users. (It is really more about trying to drag the pro application developers somewhere than the end users of those apps.)




and think that the future of computing is going to be strapped to your face.

I don't think Apple thinks iPhone, iPad, and Macs are going to implode into just the Vision Pro at all.
I think Apple does want synergy across those platforms, though. That is a bigger win for them than chasing after a narrow niche that has forked way off the road from the rest.

1 billion Apple silicon users.
Way less than 100M Intel macOS dGPU users. (The vast bulk of Intel Macs have an Intel GPU, not Nvidia or AMD.)
 

ZombiePhysicist

Suspended
May 22, 2014
2,884
2,794
...fixing which would require Apple to either (a) sink gigabucks into developing a dedicated processor for just their smallest-selling model of computer or (b) keep supporting an x86/PCIe graphics system which would be exactly as powerful as whatever Intel/AMD/NVIDIA were selling for the competition to put in cheap tin boxes.

wrong.
 

theluggage

macrumors G3
Jul 29, 2011
8,015
8,448
In case I didn't make it clear in my first comment - there are implications beyond the workstation market. If I follow Apple's logic to its extreme - I shouldn't buy anything but a 13" MacBook Air. If I should just move to the cloud for any performance tasks, why buy Macs based on performance at all?

Well, yes - that's exactly the way it is moving - personal computers as thin clients for the cloud. Still, you need a certain amount of local power for a nice fluid graphical UI - and there are applications in audio production/performance and video editing that need to run in close-to-real-time, still don't play nicely over a network, and won't be moving into the cloud next week - but those things are pretty much the sweet spot for Apple Silicon, while high-end workstation workflows that need buckets of RAM and specialist GPUs are not the best use of the technology.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
I'm slightly skeptical as to whether, in a year's time, we'll all be walking around wearing Apple Goggles - but that's Apple's bet for the next iPhone

Even folks drinking the Cupertino kool-aid can't be that high. A $3K device with 2-hour battery life is not going to be a direct substitute for an $800 one with 10-20 hrs. Just not going to happen.

Vision Pro has a chance to cut into all those purchases of XDRs by folks who really didn't need a 'close to' reference monitor in the first place. But iPhones? No. Vision Pro even has the advantage of not needing a $999 stand. :)
[Also, I'm pretty dubious about the long-term ergonomics of using this for 6-8 hours a day, even while plugged in. That it comes with an over-the-head strap suggests the weight is a little high for good ergonomics. And there are impacts on eye-focus distance shifting (or the lack thereof).]



Folks thinking that Apple is going to whittle down future versions so the Vision is just as light as normal glasses ... blah blah blah. Same boat as whittling down the Mac Pro so it is a 12" MacBook. Just not going to happen while keeping the same product coverage.
 

seek3r

macrumors 68030
Aug 16, 2010
2,561
3,772
Even folks drinking the Cupertino kool-aid can't be that high. A $3K device with 2-hour battery life is not going to be a direct substitute for an $800 one with 10-20 hrs. Just not going to happen.

Vision Pro has a chance to cut into all those purchases of XDRs by folks who really didn't need a 'close to' reference monitor in the first place. But iPhones? No. Vision Pro even has the advantage of not needing a $999 stand. :)
[Also, I'm pretty dubious about the long-term ergonomics of using this for 6-8 hours a day, even while plugged in. That it comes with an over-the-head strap suggests the weight is a little high for good ergonomics. And there are impacts on eye-focus distance shifting (or the lack thereof).]

Folks thinking that Apple is going to whittle down future versions so the Vision is just as light as normal glasses ... blah blah blah. Same boat as whittling down the Mac Pro so it is a 12" MacBook. Just not going to happen while keeping the same product coverage.
An M2 Air can outperform a 2019 MP in tasks that aren't memory-size-constrained or (depending on the workload) GPU-bound. Another generation or two and it'll surpass most of those too, for anyone not using like 1TB of RAM (which the current MP can't match either). Isn't that basically whittling down a Mac Pro into the smallest MacBook available?
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
What's really worrying about the Talk Show Live video is the part where they talk about GPU performance, and Joz basically says (and I'm paraphrasing here) that they aren't trying to beat Nvidia.

They aren't trying to beat Nvidia at AI training. That is not the same as not trying to compete with Nvidia on anything Nvidia does. Just where Nvidia has a giant 'moat' around their product, they are not.

Which isn't 'new'. Apple stopped signing Nvidia drivers years ago at this point. In part because Nvidia was far more intent on building that moat than on doing what Apple wanted them to do.

The workstation market isn't all Nvidia any more than Bitcoin is all of finance. Nvidia's AI business is a trendy, 'sexy hot' topic, but not the single cornerstone. And it isn't workstations.

" ... TSMC reportedly pledged to process an extra 10,000 CoWoS wafers for Nvidia throughout the duration of 2023. Given Nvidia gets about 60-ish A100/H100 GPUs per wafer (the H100 is only slightly smaller), that would mean an additional ~600,000 top-end data center GPUs. ... "

Those 600,000 H100s are mostly not going into workstations. Most H100 modules are OAM standard (or Nvidia's proprietary variation of OAM). Those are not PCIe cards.

Apple's run rate on Mac Pros probably isn't even 1/3 of that. Apple talked about AI/ML in that session, in the Craig F. portion of the talk. Apple is leaning way more into inference than training. And there Nvidia isn't really the exclusive player. Lots of folks are doing inference without Nvidia. That segment is only growing, not shrinking.




It's also concerning because so much of Apple Silicon has been about performance. So it's bizarre that now you get to the top end and Apple is suddenly seemingly saying "We're not in this for performance."

Stepping up to PCIe v4 isn't a performance increase over the last MP? It is still performance.

At one point (in the 90's, I think) Jobs said something along the lines of 'we have to get past the notion that for Apple to win, Microsoft has to lose'. Folks going down 'rabbit holes' on Nvidia are doing the same thing.

Apple is taking on Nvidia + Intel + AMD + Qualcomm + Microsoft ... they can't afford to get drawn into a "land war in Asia" like in the game of Risk, where you're trying to do everything for everybody. Apple isn't making luggable desktop-replacement laptops ... are they doomed? Nope. Same thing here. Smartly pick your battles where you have a tactical advantage ... and leave the "do everything for everybody" thing to other folks.

Chasing Nvidia because they caught a sizzle stock-price bump is the same kind of reasoning that got Apple into the "Apple Car", which is likely a bridge to nowhere. It isn't 'create a useful product' focused.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
An M2 Air can outperform a 2019 MP in tasks that aren't memory-size-constrained or (depending on the workload) GPU-bound.

Whoopty do ... kicking sand at a 4-year-old product. AMD is supposed to release details this week about the MI300 (A, X, and C models). When the M4/M5 Airs are whipping those, then Apple can talk lots of smack.

It isn't like other folks are not also on the same path of cutting down on superfluous copying overhead and using the latest TSMC nodes.
 

seek3r

macrumors 68030
Aug 16, 2010
2,561
3,772
Whoopty do ... kicking sand at a 4-year-old product. AMD is supposed to release details this week about the MI300 (A, X, and C models). When the M4/M5 Airs are whipping those, then Apple can talk lots of smack.

It isn't like other folks are not also on the same path of cutting down on superfluous copying overhead and using the latest TSMC nodes.
I think you may have completely missed my point…
 
  • Like
Reactions: Rnd-chars

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
In case I didn't make it clear in my first comment - there are implications beyond the workstation market. If I follow Apple's logic to it's extreme - I shouldn't buy anything by a 13" MacBook Air. If I should just move to the cloud for any performance tasks why buy Macs based on performance at all?

If folks followed Apple's logic to the extreme, Mango would have been right about the return of the 'tube' Mac Pro, and the 2017-2019 crowd spinning yarns about the 'lego' Mac Pro would have been right also. 'Apple's logic in the extreme' more often turns out to be the logic of folks outside Apple, far more so than Apple's.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
It even makes their Mac gaming push a little weird. You have part of the company pushing gaming locally on Macs. And then you have Joz basically going "Well we're not here to compete with Nvidia's streaming services." So... they're not going to compete with GeForce Now? Why buy Mac games when they're already ceding that ground to GeForce Now?

Errr, that is largely out of context. Gruber asked a question about AI and ML (not gaming!!!). Apple doesn't compete with Nvidia's cloud AI-training servers. Duh? The sky is blue, and there is absolutely NOTHING new there at all. Apple never claimed to be a general-compute cloud service operator even in the Intel era (they are not, no matter what the component parts), nor to be trying to be 'king' of the AI-training world (by trying to catch the hot sizzle ChatGPT sparked).

Apple didn't run off and chase cryptocurrency mining either when it was the 'rage'. Did Apple run off and build the best mining rigs ever? No.

Essentially 'same thing, different day' going on here.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
MacRumors covered the first statement:

But I find the second statement worse. It was basically 'f' off and go cloud yourself, we don't give a f'', starting around the 22:40 marker:

Gruber: "Is the idea they're best off renting GPUs on the cloud..."
Joz: "We have our strengths, they have theirs. They're doing a good job. You know, great for them, but we have stuff no one else has, too."

Again, out of context. Gruber made a fallacious statement of effectively "GPU == AI training processing". Really a rather shallow read on the topic, which eventually gets outlined later by all the layers of AI going on in the Vision Pro.
So the "renting GPUs on the cloud" is really "renting GPUs on the cloud to do AI". Nvidia offers AI cloud compute. Apple doesn't. Apple didn't in the Intel era. Apple isn't in the M-series era. No deep mystery there unless drifting off into alternative universes.

Apple isn't in the AWS/Azure/Google Cloud business at all. Apple has some modest iCloud services for your devices that largely do not involve 'heavy compute services'. And they have Xcode Cloud ... which is not particularly different from what folks like MacStadium/miniColoc/etc. have offered for years: "rent a Mac at a remote location". (Xcode Cloud has specifics built into Xcode to make all that remote build/test/etc. process easier/cleaner. But it is really renting headless remote Macs.)

Folks are smoking a lot of strong something if they think Apple is going to get into the business of renting out computer subcomponents on the Internet. It is really not what they do.
 