
goMac

macrumors 604
Apr 15, 2004
7,663
1,694
No. I'm saying that the degree to which the GPU options on the 2023 Mac Pro are worse than the 2019 Mac Pro is much smaller than the degree to which both Mac Pros (let alone all Mac Pros) are poor choices for those prioritizing GPU performance above all else.

Again, you're taking a multi dimensional choice and folding it down to a single dimension: GPU benchmarks.

There are a lot of reasons people bought the 2019 Mac Pro for GPU work. Heck - Radeons even sell on the PC side. Saying the Mac Pro was a bad choice because it had Radeons continues to be nonsense.

Are there reasons someone might choose a GeForce based computer? Yes. But that doesn't make the Mac Pro a bad choice.

Yes. There are already SEVERAL of these threads out there. And no one is contributing anything new to them. It's the same five of you complaining about the same things over and over, ad nauseam.

Great. No one is forcing you to be here.

Relative to the Mac, sure. It was a great option. Relative to the rest of the industry, and for the cost? It's not competitive. Do recall that the original comment I replied to of yours was asserting that the 2019 Mac Pro was a competitive machine.

You keep saying this - and as I keep pointing out - competitive does not mean it's the best in every performance axis.

There was even a lot of utility in having upgradable GPUs year over year. A lot of users didn't care about raw performance but needed a machine where they could keep up with new versions of Metal.

Oh yeah? Give me numbers to back that up. I'm pretty sure that, for workstation-caliber graphics, NVIDIA vastly outnumbers AMD's marketshare and by a wide margin. Those cards are out of line for "the industry" of workstation computer graphics because most workstations ship with NVIDIA.

I'm kind of scratching my head here. That's both true and misses the point. A vast majority of CPUs sold are Intel. Does that mean AMD is non competitive?

You must not follow Apple news closely then.

Hah. Yes. I'm intimately familiar with everything in that video.

Apple spells it out for you and the entire rest of the world what they intended to do with GPUs across the ENTIRE Mac lineup. And that video predates the 2023 Mac Pro announcement by three years.

Sure. You can go back to my posts discussing this on this forum in 2020 when that video was posted. If not I'll re-summarize:
- That architecture technically does not rule out swappable GPUs. (Shared address space can work across removable GPUs.)
- That video does not rule out a competitive GPU (say like a rumored M2 Extreme that seems like it was cancelled.)

The fact that you didn't pay attention to it doesn't mean that it didn't happen.

I don't think your "facts" are quite what you think they are.

Apple's priorities are not YOUR priorities nor MY priorities nor the priorities of anyone else in this or any thread. This has always been the case. Why it's suddenly a problem for you and everyone else complaining about this specific machine as though this specific machine is the start of that systemic problem is beyond me. Some of us have been on the wrong end of these kinds of decisions made by Apple for decades.

Again - you're conflating things. Apple could have architected a machine that would have been competitive (M2 Extreme). They chose not to. Longer reach - but they could have worked on removable GPUs that still had a unified address space. They didn't.

Apple Silicon is not incompatible with these things. Even within the guardrails of what they outlined in the 2020 video there are still options.

Again, you made the claim that the 2019 Mac Pro was feature for feature, component for component comparable to the other contemporary Xeon workstations out there and have been moving the goalpost since.

I don't think this goalpost has moved at all. I mentioned that you could even put the same Nvidia cards in a Mac Pro and you freaked out.

Apple even mentions installed Nvidia cards in a tech doc here:

It is for most of the people that have complained in this and every other forum post concerning the 2023 Mac Pro. Frankly, I agree with your statement here. I'm sure that the 2023 Mac Pro WILL serve plenty of people just as well and I don't doubt that the 2019 Mac Pro did and does as well. But that doesn't negate the fact that these were never the most optimal machines for GPU related tasks.

You keep saying "optimal" without defining optimal. The 2019 Mac Pro was a good choice for a lot of GPU workloads.

Was it a good choice for bitcoin mining or AI training? No. But there's a bigger world beyond that.

You'd hope that a 4-GPU configuration from 2019 would still clobber a single-GPU configuration from 2023. That's not saying a whole lot.

I'm saying Apple's 2019 top end can still clobber Apple's 2023 top end. That's a problem. The 2023 Mac Pro got slower - not faster.

It's not really that the top end config got cheaper. It's that Apple cut the top end config. Literally. The top end config was supposed to be M2 Extreme, which never shipped. The top end config is missing.

If M2 Extreme had shipped, I think we'd be having a different version of this conversation - maybe around upgradability. But looking at the numbers they would have had top tier GPU performance.
 

Yebubbleman

macrumors 603
May 20, 2010
6,024
2,616
Los Angeles, CA
The Mac Pro’s goal is to replace Windows workstations. Not supplement them.

...what? This makes my brain hurt.

The Mac Pro was literally a Xeon class workstation that could run Windows as a secondary operating system. It was, tier for tier, component to component an equal to a PC workstation. And if you really weren't convinced you could load Windows onto the exact same hardware and run it as a Windows workstation.

Please note the above in bold. These are your original goalposts. These are the points that I've been debating here.


Again, you're taking a multi dimensional choice and folding it down to a single dimension: GPU benchmarks.

No. I'm saying that if GPU performance is the top priority (over the device in question being a Mac), then the Mac Pro isn't the best choice and never really was. If the device in question being a Mac is the top priority (over the device having performant graphics), then that's an entirely different point to be made. I don't understand what's so hard to grasp about this nor why you feel compelled to keep debating it like it isn't an obvious fact.

I only keep belaboring this because you keep fighting me on it. Your original point, as quoted above, is that these Macs are tier for tier, component for component, comparable and competitive, and they're not. Incidentally, this is a thread full of people realizing that Apple has abandoned them, never mind that the writing was on the wall for years. I'm not saying that makes it any better, nor that anyone in this boat doesn't have my sympathy. Y'all definitely have my sympathy, for what little that's worth. But acting like it's a surprise or like it couldn't have been predicted demonstrates a lack of attention to the obvious.

There are a lot of reasons people bought the 2019 Mac Pro for GPU work. Heck - Radeons even sell on the PC side. Saying the Mac Pro was a bad choice because it had Radeons continues to be nonsense.

Incidentally, I never said it was a bad choice. Just that there are substantially better choices IF MAXIMIZING GPU PERFORMANCE IS ONE'S TOP PRIORITY IN A WORKSTATION COMPUTER.

Are there reasons someone might choose a GeForce based computer? Yes. But that doesn't make the Mac Pro a bad choice.

You keep putting words in my mouth without buying me dinner first and I'm not very appreciative. I never said that it was a "bad" choice. I said that it's not a "competitive" or "optimal" choice. If you want to turn that into "bad", then that's on you.


You keep saying this - and as I keep pointing out - competitive does not mean it's the best in every performance axis.


Again, when you use verbiage like "tier for tier" and "component to component equal", that would directly imply that it's at least comparable in every performance axis.

There was even a lot of utility in having upgradable GPUs year over year.

Let's make one thing CRYSTAL clear: I'm not arguing against upgradability. What I am arguing is that the Mac Pro pales in comparison to pretty much any other workstation tower when it comes to upgradeability and that's far from limited to GPU selection.

A lot of users didn't care about raw performance but needed a machine where they could keep up with new versions of Metal.

...And if one NEEDS Metal, then fine. But, Metal still pales in comparison to CUDA which goes back to my original point that if raw GPU performance is more important than the device being a Mac, the Mac Pro is the wrong device. But if those priorities are flipped, that's irrelevant. A point that, again, I have no idea why you keep debating when it's otherwise obvious.

I'm kind of scratching my head here. That's both true and misses the point. A vast majority of CPUs sold are Intel. Does that mean AMD is non competitive?

AMD offers performance on par with (and in some cases exceeding) that of Intel. That makes them competitive.

AMD does not offer performance on par with NVIDIA and neither does Apple. That's the difference.

Hah. Yes. I'm intimately familiar with everything in that video.

Clearly not! There are things in that video that you are arguing against like that video wasn't straight from the horse's mouth! You are a Mac user, you are on a Mac forum. Like it or not, Apple makes the rules! I'm not saying I like it (for what it's worth, I don't like it any more than anyone else in this or any of the dozen other anti-2023-Mac-Pro threads on here). But hell if I'm going to deny that when that's obviously how the Apple ecosystem works. They decide how things are going to go, and it's up to us to either take it or leave it. I'm not saying I like or condone it (again, I really don't), but denying that is to deny reality at this point.

Sure. You can go back to my posts discussing this on this forum in 2020 when that video was posted.

It's becoming hard enough to justify the expenditure of time to debate you here; I'm not exactly certain where there's added justification to dig up three year old posts of yours.

If not I'll re-summarize:
- That architecture technically does not rule out swappable GPUs. (Shared address space can work across removable GPUs.)

Technologically no. But Apple has stated that this is not what they're ever going to do. That is not their system architecture; they are not going to open things up to either (a) third party GPUs or (b) discrete GPUs. That is not how they've structured Apple Silicon as a hardware platform. That is deliberate. That is not going to change without a substantial announcement being made at a WWDC (and even that is an extremely unlikely course correction [seriously, you will see a flying pig outside your window before they change course on this]). So, your point of "well, they could do it" is utterly moot.

- That video does not rule out a competitive GPU (say like a rumored M2 Extreme that seems like it was cancelled.)

That video does not rule out a higher-end SoC. That much is true. You may very well get one in the M3 generation. Or, what's way more likely, the PCIe ecosystem on the Mac will be relegated to Thunderbolt breakout boxes, PCIe storage, and high-speed networking interfaces, while those for whom this Mac Pro (and M2 Ultra, by extension) doesn't suit (which, statistically is likely not even the vast majority of Mac Pro customers), will either gravitate toward the Mac Studio or some other workstation. I'm hoping against hope for the former, but I'm not fooling myself about how utterly unrealistic it is given Apple's obvious priorities and lack thereof in the workstation space.

I don't think your "facts" are quite what you think they are.

At this point, I have no choice but to take your word for it.


they could have worked on removable GPUs that still had a unified address space. They didn't.

Please explain how that's even remotely possible given Apple's stated design objectives. Because I'm fairly sure that it isn't, but I don't want to make the assumption that you don't have any sort of background in SoC design.

Apple Silicon is not incompatible with these things. Even within the guardrails of what they outlined in the 2020 video there are still options.

M2 Extreme is the only option you stated that was actually a POSSIBLE option, given Apple's stated goals and the laws of physics. And clearly, Apple made a calculation as to whether or not that would be worth doing and decided it wasn't. I'm not going to say that they made this calculation correctly (as I don't believe that, myself). But that's clearly how the cookie crumbled.

I don't think this goalpost has moved at all.

See above.

I mentioned that you could even put the same Nvidia cards in a Mac Pro and you freaked out.

Hyperbole much?

Apple even mentions installed Nvidia cards in a tech doc here:

I actually didn't know that they supported that. That's actually very cool. Thanks for sharing that (seriously). Doesn't change the point I'm making nor my original counter argument to your "component to component" rhetoric. But it's rad that Apple would at least allow NVIDIA on there in that capacity.

You keep saying "optimal" without defining optimal.

"Bang for buck" or "Best tool for the job".

The 2019 Mac Pro was a good choice for a lot of GPU workloads.

If those workloads HAD to be Metal or the work HAD to be done on macOS, then sure. Otherwise, it's a terrible value proposition.

Was it a good choice for bitcoin mining or AI training? No. But there's a bigger world beyond that.

If that's all you think an NVIDIA card or Windows Workstation is good for then this debate is truly pointless.

I'm saying Apple's 2019 top end can still clobber Apple's 2023 top end. That's a problem. The 2023 Mac Pro got slower - not faster.

It sounds like this is not the Mac for you, then. Maybe buy something else? A 2019 Mac Pro, if that one suited your needs so well. Or, dare I say it, a different workstation that isn't hindered by Apple Silicon?

Incidentally, the comparison I was replying to was that of four AMD GPUs compared to a single NVIDIA GPU, not M2 Ultra compared to anything.

It's not really that the top end config got cheaper. It's that Apple cut the top end config. Literally. The top end config was suppose to be M2 Extreme which never shipped. The top end config is missing.

If M2 Extreme had shipped, I think we'd be having a different version of this conversation - maybe around upgradability. But looking at the numbers they would have had top tier GPU performance.
I get that. It's undeniable that upgradeability and customization got shafted with the 2023 Mac Pro and that the higher-end configurations were ultimately eliminated. It is undeniably a bummer. Truly. But this is a point that is now two months old. It is no longer novel. And there's nothing any of us can do about it other than figure out what to buy instead. The OP started this thread on the pretense of the 2023 Mac Pro being the end of the line. Belaboring what we've already established in the other dozen threads is a waste of time and energy.
 

jimmy_john

macrumors member
Jun 28, 2023
74
109
I'm saying Apple's 2019 top end can still clobber Apple's 2023 top end. That's a problem. The 2023 Mac Pro got slower - not faster.

I switched from a 2019 Mac Pro to an M1 Ultra Studio and then to an M2 Ultra Mac Pro because after a week+ of side-by-side testing in each case, the upgrade was faster.
 

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
I switched from a 2019 Mac Pro to an M1 Ultra Studio and then to an M2 Ultra Mac Pro because after a week+ of side-by-side testing in each case, the upgrade was faster.
For your workflows.

Note @goMac says can clobber. Not always will in every workflow. That’s the fundamental problem, the new top end is not universally faster than the old top end.

This has never happened for the stock configs Apple has offered for the Mac Pro.
 

ZombiePhysicist

Suspended
May 22, 2014
2,884
2,794
For your workflows.

Note @goMac says can clobber. Not always will in every workflow. That’s the fundamental problem, the new top end is not universally faster than the old top end.

This has never happened for the stock configs Apple has offered for the Mac Pro.

It’s slower, and way slower for many workflows, e.g. 3D. Think how pathetic that is when you’re comparing it to a 4 year old machine. Not sure which is more pathetic: that, or that the “speed” people are “marveling” at for their M2 Ultra workflows can be outpaced by a laptop-level i9, much less obliterated by many cheap AMD offerings.
 

jimmy_john

macrumors member
Jun 28, 2023
74
109
For your workflows.

Note @goMac says can clobber. Not always will in every workflow. That’s the fundamental problem, the new top end is not universally faster than the old top end.

This has never happened for the stock configs Apple has offered for the Mac Pro.

Careful man, if you suggest that the M2 Ultra is not a complete and total dumpster fire in every measurable way compared to the obsolete-upon-arrival 2019 you may get banned.
 

jimmy_john

macrumors member
Jun 28, 2023
74
109
It’s slower and way slower for many workflows. eg 3d. Think how pathetic that is when you’re comparing it to a 4 year old machine. Not sure if that is more pathetic or the “speed” people are “marveling” at for their M2ultra workflows can be outpaced by a laptop level i9, much less obliterated by many cheap AMD offerings.

Why would anyone use any Mac for 3d? madness
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
Note @goMac says can clobber. Not always will in every workflow.

Yep. I specifically said can.

I think, for the most part, anything GPU centric is not going to see the 2023 Mac Pro come out the winner. Especially if it's multi GPU compatible.

But - Apple did load M2 up with a lot of transcoding units. And the CPU performance is a big jump over a >4 year old Xeon. So if you're doing a lot of video editing or Final Cut - probably a pretty great machine.

Is it worth buying over a Mac Studio for those use cases? Probably not unless you really need those PCIe cards.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
No. I'm saying that if GPU performance is the top priority (over the device in question being a Mac), then the Mac Pro isn't the best choice and never really was. If the device in question being a Mac is the top priority (over the device having performant graphics), then that's an entirely different point to be made. I don't understand what's so hard to grasp about this nor why you feel compelled to keep debating it like it isn't an obvious fact.

Again - you're going down a single performance avenue.

If you needed the absolute best single GPU performance and you were not bound to macOS - sure, maybe it made sense to buy a PC.

If you had a macOS based workflow already, you might be willing to take the performance hit. A lot of people have *NIX workflows and didn't want to make the jump to Linux. Apple for a long time dominated the professional *NIX workstation market.

If you were doing something like Redshift or Octane that was multi GPU aware - the Mac Pro seemed like a pretty good choice.

I only keep belaboring this because you keep fighting me on it. Your original point, again as quoted again above is that these Macs are tier for tier, component for component, comparable and competitive and they're not.

You're not really making a point here besides insisting it's true, and then arguing with people when they try to explain their use case and telling everyone they're wrong. I'm trying to explain to you why the Mac Pro was a workable, competitive solution for a lot of folks and you're just telling me "no."

Incidentally, I never said it was a bad choice. Just that there are substantially better choices IF MAXIMIZING GPU PERFORMANCE IS ONE'S TOP PRIORITY IN A WORKSTATION COMPUTER.

Gee, it's like I've repeatedly said now over a few posts you're using one metric to simplify something down unrealistically. I've never said a W6900X is the fastest GPU on the market. I've just said there are reasons beyond pure-down-to-the-benchmark-single-GPU-performance one might pick a Mac Pro.

Most of those reasons are harder to hold up with the 2023 now that it's even slower.

You keep putting words in my mouth without buying me dinner first and I'm not very appreciative. I never said that it was a "bad" choice. I said that it's not a "competitive" or "optimal" choice. If you want to turn that into "bad", then that's on you.

Again optimal for what? You keep saying "optimal" and "competitive" and then when I outline the scenarios in which a 2019 was actually a sensible choice you just keep typing "optimal" and "competitive."

Again, when you use verbiage like "tier for tier" and "component to component equal", that would directly imply that it's at least comparable in every performance axis.

Yes - it was component to component comparable to a Xeon workstation because it was a Xeon workstation.

It wasn't just comparable. It was an x86 workstation.

Let's make one thing CRYSTAL clear: I'm not arguing against upgradability. What I am arguing is that the Mac Pro pales in comparison to pretty much any other workstation tower when it comes to upgradeability and that's far from limited to GPU selection.

I think everyone would have liked more GPU selection for macOS. I think people would have liked Nvidia GPU drivers for macOS.

There is a difference between that and "don't buy!" and "not competitive!" Every machine has trade offs. I could even pick at the things a Dell Precision does really badly.

...And if one NEEDS Metal, then fine. But, Metal still pales in comparison to CUDA which goes back to my original point that if raw GPU performance is more important than the device being a Mac, the Mac Pro is the wrong device.

Explain. I'm very interested in the technical details of why Metal pales in comparison to CUDA.

AMD offers performance on par with (and in some cases exceeding) that of Intel. That makes them competitive.

You used sales benchmarks as your definition of competitive. Not me.

AMD does not offer performance on par with NVIDIA and neither does Apple. That's the difference.

Again, I think you're confusing "competitive" and "the very best." AMD is behind Nvidia, but it's not like 50% or 80% behind.

Clearly not! There are things in that video that you are arguing against like that video wasn't straight from the horse's mouth! You are a Mac user, you are on a Mac forum. Like it or not, Apple makes the rules!

I suppose we'll just have Tim Cook write everyone's posts from now on, since no one is allowed to question Apple or have an opinion.

It's becoming hard enough to justify the expenditure of time to debate you here; I'm not exactly certain where there's added justification to dig up three year old posts of yours.

I just figured that since you know me and know what videos I have seen as a "fact," clearly you're aware of my post history.

Technologically no. But Apple has stated that this is not what they're ever going to do. That is not their system architecture; they are not going to open things up to either (a) third party GPUs or (b) discrete GPUs. That is not how they've structured Apple Silicon as a hardware platform. That is deliberate. That is not going to change without a substantial announcement being made at a WWDC (and even that is an extremely unlikely course correction [seriously, you will see a flying pig outside your window before they change course on this]). So, your point of "well, they could do it" is utterly moot.

So - third party GPUs, probably not. Apple Silicon GPUs use TBDR, which is patent encumbered. You can't get a Radeon or GeForce card with TBDR, which is a holdup. And Apple has effectively guaranteed TBDR rendering on Apple Silicon Macs. Metal and the architecture on Apple Silicon can actually still support non-TBDR third party GPUs if there were drivers. But since Apple has told developers to code assuming TBDR exists, that horse has already left the barn.

Discrete? It is possible to do an Apple Silicon GPU model with discrete GPUs. A discrete GPU can do TBDR. A discrete GPU can do a shared memory space. (AMD GPUs can do a shared memory space even though they are discrete.)

I think there is a meta conversation about the technical complexity and performance cost of having discrete GPUs with a unified memory space. But it's not impossible - like I said, AMD has done it. That may be more complexity than Apple wants to invest in a Mac Pro. But it's not completely implausible.
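For what it's worth, Metal already exposes both sides of this split to developers today, which is part of why the architecture itself doesn't strictly rule out immediate-mode or discrete GPUs. A minimal sketch (assuming only the standard `MTLDevice` API on macOS; the branch bodies are placeholders, not a real renderer):

```swift
import Metal

guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device available")
}

// True on Apple Silicon, where the CPU and GPU share one address space;
// false on a discrete GPU such as a W6900X in a 2019 Mac Pro.
let unifiedMemory = device.hasUnifiedMemory

// The .apple* GPU families imply TBDR capabilities (tile shaders,
// memoryless render targets); .mac2 covers immediate-mode Intel/AMD GPUs.
let isTBDR = device.supportsFamily(.apple3)

if isTBDR && unifiedMemory {
    // TBDR path: use tile memory and memoryless attachments,
    // and skip explicit staging copies between CPU and GPU.
} else {
    // Immediate-mode path: conventional render passes and explicit blits.
}
```

The point of the sketch is that apps which branch on these queries keep working either way; the problem goMac describes is that much shipping code now assumes the TBDR/unified-memory branch without checking.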

That video does not rule out a higher-end SoC. That much is true. You may very well get one in the M3 generation. Or, what's way more likely, the PCIe ecosystem on the Mac will be relegated to Thunderbolt breakout boxes, PCIe storage, and high-speed networking interfaces, while those for whom this Mac Pro (and M2 Ultra, by extension) doesn't suit (which, statistically is likely not even the vast majority of Mac Pro customers), will either gravitate toward the Mac Studio or some other workstation. I'm hoping against hope for the former, but I'm not fooling myself about how utterly unrealistic it is given Apple's obvious priorities and lack thereof in the workstation space.

As I said above - does not have to be. PCIe Apple Silicon GPUs or some sort of custom slot is not completely implausible and is technically possible. While maintaining all promises Apple has made about how Apple Silicon works with compatibility.

Do I think it's likely at this point when they couldn't even get M2 Extreme out the door? No. But if AMD can do it Apple can do it.

Please explain how that's even remotely possible given Apple's stated design objectives. Because I'm fairly sure that it isn't, but I don't want to make the assumption that you don't have any sort of background in SoC design.

As I explained above - it is.

Apple didn't invent this architecture. Plenty of other companies, such as AMD and Nvidia, have played with the same concepts including a discrete card model. My experience is mostly with Apple Silicon - so I don't want to wade too much into an AMD stack that I don't have experience with. But they've been working on things like Infinity Fabric to support this model.

M2 Extreme is the only option you stated that was actually a POSSIBLE option, given Apple's stated goals and the laws of physics.

No. Easiest option - yes. Apple may not want to dabble in a complicated, expensive custom architecture for the Mac Pro.

If those workloads HAD to be Metal or the work HAD to be done on macOS, then sure. Otherwise, it's a terrible value proposition.

Again - the Mac Pro being a great *NIX workstation was its own value.

I think Apple will begin more steadily losing that to Linux now (which started a while ago) - but macOS held out for a while as a UNIX workstation with plenty of commercial app support.

If that's all you think an NVIDIA card or Windows Workstation is good for then this debate is truly pointless.

Nope. I have a Windows workstation! But I also have a Mac Pro. There's a reason I have both.

It sounds like this is not the Mac for you, then. Maybe buy something else? A 2019 Mac Pro, if that one suited your needs so well. Or, dare I say it, a different workstation that isn't hindered by Apple Silicon?

Way ahead of you.

I get that. It's undeniable that upgradeability and customization got shafted with the 2023 Mac Pro and that the higher-end configurations were ultimately eliminated. It is undeniably a bummer. Truly. But this is a point that is now two months old. It is no longer novel. And there's nothing any of us can do about it other than figure out what to buy instead. The OP started this thread on the pretense of the 2023 Mac Pro being the end of the line. Belaboring what we've already established in the other dozen threads is a waste of time and energy.

I don't think OP is wrong. M3 Extreme could happen but - realistically the Mac Pro offers very niche reasons to exist over the Mac Studio. Apple has alienated everyone again by pushing MPX and then pulling it back. And the performance for the price point is way, way worse than the 2019.

If you don't like talking about the Mac Pro being the end of the line - you don't have to be here. But what's happening right now is the forum is (mostly) full of people who love buying Mac Pros and are scratching their heads at this one. We're the customer base and we're not sure what is going on here.
 

chfilm

macrumors 68040
Nov 15, 2012
3,425
2,109
Berlin
Yep. I specifically said can.

I think, for the most part, anything GPU centric is not going to see the 2023 Mac Pro come out the winner. Especially if it's multi GPU compatible.

But - Apple did load M2 up with a lot of transcoding units. And the CPU performance is a big jump over a >4 year old Xeon. So if you're doing a lot of video editing or Final Cut - probably a pretty great machine.

Is it worth buying over a Mac Studio for those use cases? Probably not unless you really need those PCIe cards.
The M2 Ultra Mac Studio/Pro beats my 2019 Mac Pro with two Vega IIs by a 30-35% margin in GPU-heavy tasks like denoising or retiming effects in DaVinci Resolve.
 

vsc

macrumors member
May 8, 2014
74
33
It's Apple Hubris for them to think they could surpass the performance of chipmakers who have been in the game decades longer. Instead we got below-par performance with the added "feature" of non-upgradability.
I'm not disagreeing, but the x86 instruction set architecture only survives because we are able to throw a mind-boggling number of transistors at getting the CPI into the superscalar region. As someone who was sold on RISC architectures back from the work at Berkeley, I'm rather disappointed that implementations remain focused largely on portable/mobile computing (e.g. Apple's M silicon).
 

Yebubbleman

macrumors 603
May 20, 2010
6,024
2,616
Los Angeles, CA
You're not really making a point here besides insisting it's true, and then arguing with people when they try to explain their use case and telling everyone they're wrong. I'm trying to explain to you why the Mac Pro was a workable, competitive solution for a lot of folks and you're just telling me "no."

Actually, that's not what's happening here at all. You're using words like "competitive" and "component to component" incorrectly to justify an argument that is wrong.


Yes - it was component to component comparable to a Xeon workstation because it was a Xeon workstation.

Incidentally, there are other components than the processor. The notion that the processor is the only thing that makes it component to component comparable proves that you are using "component to component" and "comparable" in ways that are counter to what those words actually mean. Also that you don't understand hardware not made by Apple.

It wasn't just comparable. It was an x86 workstation.


Incidentally, x86 workstations come in all sorts of shapes and sizes and with MANY DIFFERENT KINDS OF COMPONENTS!

There is a difference between that and "don't buy!" and "not competitive!"

There you go putting words in my mouth again without my consent...

Every machine has trade offs. I could even pick at the things a Dell Precision does really badly.

It can't run macOS. Past that, and it's just another Xeon workstation that is feature for feature, component to component competitive with and comparable to the Mac Pro. At least, using your definitions of those words. ;)


Explain. I'm very interested in the technical details of why Metal pales in comparison to CUDA.

No you're not. That's textbook sarcasm that you're using to try to expose a weakness in my argument to deflect from the several weaknesses in yours. Not playing ball.

You used sales benchmarks as your definition of competitive. Not me.

Incidentally, specs and feature differences are the basis for "competitive," unless one uses whatever your definition du jour happens to be.

Again, I think you're confusing "competitive" and "the very best." AMD is behind Nvidia, but it's not like 50% or 80% behind.

"Competitive" at least when used correctly, implies that the two things being compared are at least within the same ballpark in terms of features and/or value.

AMD is behind NVIDIA enough that it is not seen as a viable replacement in many spaces (the workstation space being among them).

Similarly, circa December 2019 to May 2023, the only people picking up Mac Pros were people who required macOS in their workloads.

Otherwise, NVIDIA graphics were better and Dell and HP provided substantially better options with better upgradeability at an ultimately better value than the Mac Pro did.

The only people eying Mac Pros were Mac users or UNIX users who didn't want to deal with Linux. No one wanting to buy a "Xeon Workstation" to run Windows or Linux was buying a Mac Pro unless macOS was a priority. It stands to reason that if the Mac Pro was "component to component" "feature for feature" identical and also "competitive" when compared to those other workstations, that Windows users who don't need macOS would've purchased the Mac Pro to run Windows primarily, if not exclusively.

Again, I do not know what about that compels you to argue or debate.

I suppose we'll just have Tim Cook write everyone's posts from now on, since no one is allowed to question Apple or have an opinion.

You are allowed to question Apple and repeatedly broadcast your opinion (however informed or uninformed it is) and I'm allowed to point out the futility in doing so. These are not mutually exclusive.


I just figured that since you know me and know what videos I have seen as "fact," you're clearly aware of my post history.

That's quite the stretch...

So - third party GPUs, probably not. Apple Silicon GPUs are TBDR (tile-based deferred renderers), which is patent-encumbered territory. You can't get a Radeon or GeForce card with TBDR, which is a holdup, and Apple has effectively guaranteed TBDR rendering on Apple Silicon Macs. Metal and the architecture on Apple Silicon could actually still support non-TBDR third party GPUs if there were drivers. But since Apple has told developers to code assuming TBDR exists, that horse has already left the barn.
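To make the TBDR distinction concrete, here's a toy sketch of the binning phase that separates a tile-based deferred renderer from an immediate-mode one. This is purely illustrative: the tile size, the rectangle "primitives," and the two-phase split are my own simplification of what the hardware actually does, not any vendor's implementation.

```python
# Toy illustration of tile-based deferred rendering (TBDR).
# Phase 1 bins primitives into per-tile lists BEFORE any shading happens;
# phase 2 (not shown) would then shade each tile once in fast on-chip
# memory, touching DRAM only to write the finished tile out. An
# immediate-mode GPU instead shades each primitive against the full
# framebuffer as it arrives.

TILE = 4                # tile edge in pixels (illustrative value)
WIDTH, HEIGHT = 16, 8   # tiny framebuffer for the example

def tiles_overlapping(bbox):
    """Yield the (tx, ty) tile coordinates a primitive's bounding box touches."""
    x0, y0, x1, y1 = bbox
    for ty in range(y0 // TILE, y1 // TILE + 1):
        for tx in range(x0 // TILE, x1 // TILE + 1):
            yield tx, ty

def bin_primitives(prims):
    """TBDR phase 1: sort primitives into per-tile lists."""
    bins = {}
    for prim_id, bbox in prims:
        for tile in tiles_overlapping(bbox):
            bins.setdefault(tile, []).append(prim_id)
    return bins

# Two rectangles as (id, (x0, y0, x1, y1)): one covers the whole
# framebuffer, the other only the top-left corner.
prims = [(0, (0, 0, 15, 7)), (1, (0, 0, 3, 3))]
bins = bin_primitives(prims)

print(len(bins))      # 8 tiles for a 16x8 framebuffer with 4px tiles
print(bins[(0, 0)])   # both primitives land in the top-left tile
print(bins[(1, 0)])   # only the full-screen primitive lands elsewhere
```

The point of the exercise: because shading is deferred until binning is complete, the GPU only ever needs one tile's worth of color/depth data resident at a time, which is exactly the property Metal encourages developers to assume on Apple Silicon and which a conventional immediate-mode Radeon or GeForce card doesn't provide.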

Discrete? It is possible to do an Apple Silicon GPU model with discrete GPUs. A discrete GPU can do TBDR. A discrete GPU can do a shared memory space. (AMD GPUs can do a shared memory space even though they are discrete.)

I think there is a meta conversation about the technical complexity and performance loss over having discrete GPUs with a unified memory space. But it's not impossible - like I said, AMD has done it. That may be more complexity than Apple wants to invest in a Mac Pro. But it's not completely implausible.



As I said above - does not have to be. PCIe Apple Silicon GPUs or some sort of custom slot is not completely implausible and is technically possible.


Whether it's technically plausible or not is 10000% irrelevant. I don't know what about that you're having such issues with. Yes, it's ABSOLUTELY technically plausible. There are several things that Apple can implement that are technically plausible that Apple has zero intention of ever allowing to happen. Here are a few examples:

- The ability to downgrade the operating system on an iPhone or iPad to an earlier version
- Being able to install macOS on non-Apple branded hardware
- Allowing for third party GPUs on Apple Silicon Mac hardware
- Allowing for GPUs on Apple Silicon hardware that are not integrated with the SoC

So, yeah, totally plausible. But never going to happen, thereby rendering the technological plausibility entirely moot.

While maintaining all promises Apple has made about how Apple Silicon works with compatibility.

Wrong again. Apple was explicitly clear about the graphics being in the SoC. If they are on a card, then they're not on an SoC. Similarly, Apple is explicit about the shared memory pool ALSO being on the SoC. If you remove these things from the SoC, you introduce latency. Apple was explicit about Apple Silicon removing this exact kind of latency from the equation. And you'd have known all of this if you had watched the videos you claimed you had watched.

Do I think it's likely at this point when they couldn't even get M2 Extreme out the door? No. But if AMD can do it, Apple can do it.



As I explained above - it is.

Apple didn't invent this architecture. Plenty of other companies, such as AMD and Nvidia, have played with the same concepts including a discrete card model. My experience is mostly with Apple Silicon - so I don't want to wade too much into an AMD stack that I don't have experience with. But they've been working on things like Infinity Fabric to support this model.


Your two sentences in bold completely contradict each other. My advice is to just stop talking about stuff you're not knowledgeable in. There's nothing wrong with not knowing something. But perpetuating a debate over it is needless.

If you don't like talking about the Mac Pro being the end of the line - you don't have to be here.

I have no problem talking about the Mac Pro being the end of the line. However, that's not the debate you have me locked into. The debate you have me locked into revolves around your circular, inconsistent, and utterly incorrect uses of the words "competitive," "feature for feature," and "component to component."

I'd MUCH rather talk about what people are planning on doing if this is, in fact, the end of the line. Whether or not it actually is the end of the line. And what someone shopping for a workstation is likely buying instead of this 2023 model. That's not what you have me engaged in.

But what's happening right now is the forum is (mostly) full of people who love buying Mac Pros and are scratching their heads at this one. We're the customer base and we're not sure what is going on here.
Incidentally, I'd much rather be entertaining that discussion rather than your ill-informed notions of what the 2019 Mac Pro was relative to anything else on the market.
 

jimmy_john

macrumors member
Jun 28, 2023
74
109
Again, I think you're confusing "competitive" and "the very best." AMD is behind Nvidia, but it's not like 50% or 80% behind.

There are many workloads where they are more than 80% behind or are not even an option.

GPU-based 3D rendering? Nvidia is the only game in town unless you're willing to suffer through Redshift or Octane out of some personal jihad against CUDA.
 
  • Like
Reactions: Yebubbleman