Perhaps they should just give their customers what they want. If they need any hints, they can just visit the Puget website. The complication lies in finding an overlap (if any) between what their customers want and what Apple would like to make.



Presumably it would still be ARM based though? You're talking about a massive ARM CPU, with loads of PCIe lanes for GPUs?





Therein lies the problem.



Well technically, I had already made both posts before you replied to the first ;)



But you're a self-admitted Apple super-fan. You're certainly not the only one, but are there enough to make the Mac Pro a product Apple is particularly interested in making? The evidence would suggest not. The iPad was on M4 whilst the Mac Pro was (and still is) on M2.



I don't have a 2019, but I think the MP should have upgradable GPUs and RAM, and sensibly priced SSD modules. But those all seem to be dealbreakers for Apple. It should have considerable headroom over a Studio Ultra, otherwise what's the point?



The PC workstation market gets Threadripper and Xeon platform development for 'free', as these CPUs are sold in their hundreds of thousands for server use. Apple doesn't make server hardware, so it would be like financing Threadripper development out of their own pocket, for however many Mac Pro units they sell. They'd have to be convinced that technological supremacy at the high end casts a worthwhile halo over the whole Mac range. Otherwise, they'd be better off doing something more profitable, like releasing Space Grey AirPods.



Quite. Where's the Extreme?



The 2019 was supposed to be the chosen one, to bring trust to the pro Mac market. Now they need to regain the trust they only just regained? It's too much drama. Easier to just use Windows, and only consider a switch back once Apple have demonstrated several generations of sustained interest in tower computing.
You're talking about a massive ARM CPU, with loads of PCIe lanes for GPUs...


YEP. 100%. Also, the last Hail Mary about this whole thing...you brought up a very valid point about whether they'd be convinced that technological supremacy at the high end casts a worthwhile halo over the whole Mac range. My thought is...if they build it, they will come LOL. Seriously...everyone in the industry already loves and uses Apple for everything other than 3D animation and VFX...if they built a machine that was even on par with the 5090, I'm 100% convinced it would sell through the roof. But as you said, it's Apple that needs to be convinced.
 
Yeah, but Apple's definition of "jaw-drop incredible" is often mediocre and completely not what the people who want that sort of thing actually want.

Exhibit A: Apple Vision Pro.
You know what's wild about the AVP? It's actually an awesome system. In fact, it's the best headset for VR and AR hands down...however...THEY NERFED TF out of it by not having a physical control system. I get they wanted to revolutionize how we can control these things via the body...but controllers should've STILL been an option for gaming. I'm hoping they learned their lesson about this as I know they are currently developing the next gen of the AVP and supposedly it will have controllers of some sort...
 
It's a bit ironic considering Apple's founding. The whole point of having an Apple II on your desk was that you could run and write your own software and use the full power of your computer without needing to pay for time-sharing on a remote mainframe. Cloud computing has its place and will possibly become more important in the future, but pushing such a thing for normal workstation (let alone consumer) use cases strikes me as a giant step backwards. I don't want to pay a subscription for things I can and should be able to run on my own computers. Also, for latency-sensitive use cases, such a thing would not even make sense. If Apple went this route, I would look to migrate my workflow to some variant of Linux.



The Vision Pro seems akin to OpenDoc in that there is clearly some neat and interesting technology there, but it's not clear what problem the tech is/was trying to solve.


If I remember correctly, there was a rumor that the next Ultra chip was going to be a monolithic design rather than 2x Mx Max chips glued together. I wonder if they will make use of chiplet/tiling/die-stacking tech to accomplish this, and then use an UltraFusion interconnect to connect to another die consisting mainly of additional GPU cores. Like take the design of the M4 Max, remove the GPU cores, replace them with additional CPU performance cores and display controllers, and add ungodly PCIe bandwidth. Then the GPU would be a pre-configured tiling of 64, 128, 192, or 256 GPU cores. SoC RAM would go to 256 GB, but I would include a pool of regular RAM slots that could be expanded as high as 4TB and could function either as a hidden volume for swap files or as main memory, if the user prefers more capacity over the wider bandwidth and lower latency of the SoC RAM.
I absolutely love this idea and would buy it in a heartbeat.
 
The Mac Pro is in a weird spot.

For the workloads people use Macs for, it just isn't really required. And for the workloads that would justify a high-end machine with multiple TB of RAM and SoCs on cards (which is what I think they should do: the mainboard just links multiple M4 Max cards together into one system - but I digress...), they need to target a specific market to justify the hardware, because video editing just isn't it any more. You can do that on a MacBook Air these days, and the rendering is done on render farms.
100%, in both the config you are pitching and also needing to figure out who to target and how to target them. For me, an ad with just the Mac Pro in an infinite white room sitting on a desk with a monitor. Cut to a closeup of the monitor running Crysis on ultra settings, then cutting to a white screen that just says "Yep". Then the all new Mac Pro. Instant buy from me. This could be a series. In the next one, just show about 100 AAA games running in Ultra resolutions at 100+ fps this time with quick cuts and the focus stays on the settings in the top left corner. For a third ad in the series I would show Unreal Engine 5 with a massively heavy scene running in realtime like it was nothing, OR show Octane or Redshift running a massively heavy scene in realtime. All of these would end with the white screen that says "Yep" then cutting to the final screen saying "The all new Mac Pro.".

Instant purchase for anybody that's in the target audience.
 
I keep getting bans for making political comments outside of the politics forum, so I'll just say the EU are wrong on many levels for their intervention.

Apologies in advance to the mods if I bust the rules yet again...it is IMPOSSIBLE to talk about BS EU intervention in private business without being political, because that move is itself political.

I think Apple are in their own way working towards their own GPU tech. nVidia, as good as they are, aren't the only game in town, and as far as I can see, Apple are only just getting started. The M-series has been the "can we do this?" test-run. I think there is much more coming in the decade ahead.

Intel are floundering; AMD is the current king of the PC desktop, and nVidia are getting stuck into AI compute.

nVidia said recently that the PC gaming market is not their focus, either, and the 50xx series cards are going to be very expensive. Intel are trying, but will not really make a success of their second attempt at breaking into the GPU market for the same reasons their CPUs are no longer the best.
I agree with a lot of this. However, Nvidia officially announced the RTX 50 series the other day and...the prices are absolutely fair. In fact, the bottom tier is only $580 or something like that and it's supposedly equivalent to a 4090, while the top tier I think was $2500 which, again, is fair.

But I definitely agree, and I believe Apple has been experimenting and building their own GPU solution for the better part of a decade at this point. I truly believe they will NOT announce a new Mac Pro until it is something that will be a full-blown marvel to everyone. When they finally do it, it's going to be a game changer.
 
However, Nvidia officially announced the RTX 50 series the other day and...the prices are absolutely fair. In fact, the bottom tier is only $580 or something like that and it's supposedly equivalent to a 4090, while the top tier I think was $2500 which, again, is fair.

The RTX 5090 is very hard to ignore. It's around AUD$4000 (compared with AUD$9000 for the W6900X MPX or AUD$7500 for the 6800 Duo MPX), so it's very well priced for what is on offer.

It makes a strong case to move away from Apple. I'm already scoping out building a machine myself.
 
The Vision Pro seems akin to OpenDoc in that there is clearly some neat and interesting technology there, but it's not clear what problem the tech is/was trying to solve.

It has always been one of Apple's weaknesses that they almost instinctively refuse to solve a problem the way the customer wants it solved. Or that they're uninterested in the customer's actual problem, but think it can be used as a pretext to advance something similar to (but not really) a solution, which Apple actually wants to make.
 
You know what's wild about the AVP? It's actually an awesome system. In fact, it's the best headset for VR and AR hands down...however...THEY NERFED TF out of it by not having a physical control system. I get they wanted to revolutionize how we can control these things via the body...but controllers should've STILL been an option for gaming. I'm hoping they learned their lesson about this as I know they are currently developing the next gen of the AVP and supposedly it will have controllers of some sort...

Yeah, you see, I don't buy that "best headset" thing, when you have companies like Somnium doing headsets with higher resolution, better passthrough cameras, better hand tracking, eye tracking, SteamVR lighthouse tracking, made in the EU rather than China, and able to use a proper GPU, rather than being cement-shoed to an obsolete, elderly pensioner GPU like the iPad-level M2.

The things Apple prioritises with the AVP are not the things people who *need* headsets for doing inherently three dimensional tasks require, and for anyone who doesn't have inherently three dimensional tasks to do, *any* headset is a pointless burden.
 
100%, in both the config you are pitching and also needing to figure out who to target and how to target them. For me, an ad with just the Mac Pro in an infinite white room sitting on a desk with a monitor. Cut to a closeup of the monitor running Crysis on ultra settings, then cutting to a white screen that just says "Yep". Then the all new Mac Pro. Instant buy from me.

lol. Is Crysis still the benchmark? It probably runs on an iPad these days. I’d have thought Cyberpunk at 4K with full path tracing would be a sterner test.

This could be a series. In the next one, just show about 100 AAA games running in Ultra resolutions at 100+ fps this time with quick cuts and the focus stays on the settings in the top left corner.

Macs have never been sold as games machines. I suppose reaching 100 AAA games on the Mac would be a milestone worth celebrating.

For a third ad in the series I would show Unreal Engine 5 with a massively heavy scene running in realtime like it was nothing, OR show Octane or Redshift running a massively heavy scene in realtime. All of these would end with the white screen that says "Yep" then cutting to the final screen saying "The all new Mac Pro.".

This is more like it. But not if it's twice the price of an equivalent PC. macOS isn't that much of a draw.
 
The RTX 5090 is very hard to ignore. It's around AUD$4000 (compared with AUD$9000 for the W6900X MPX or AUD$7500 for the 6800 Duo MPX), so it's very well priced for what is on offer.

It makes a strong case to move away from Apple. I'm already scoping out building a machine myself.
You know me, fam, I'm as loyal as it gets for Apple, and we all know what I'm currently working on...At this point, if Apple loses all of us, that's of their own doing...
 
It has always been one of Apple's weaknesses that they almost instinctively refuse to solve a problem the way the customer wants it solved. Or that they're uninterested in the customer's actual problem, but think it can be used as a pretext to advance something similar to (but not really) a solution, which Apple actually wants to make.
Facts. Sorry for the late reply. We were evacuated for the fires and it was a fairly stressful week. But yep, I agree with this.
 
Yeah, you see, I don't buy that "best headset" thing, when you have companies like Somnium doing headsets with higher resolution, better passthrough cameras, better hand tracking, eye tracking, SteamVR lighthouse tracking, made in the EU rather than China, and able to use a proper GPU, rather than being cement-shoed to an obsolete, elderly pensioner GPU like the iPad-level M2.

The things Apple prioritises with the AVP are not the things people who *need* headsets for doing inherently three dimensional tasks require, and for anyone who doesn't have inherently three dimensional tasks to do, *any* headset is a pointless burden.
To be fair, I don't think anybody NEEDS any of this outside of work lol. But I get your point. However, if Somnium headsets can't be 100% integrated into the Apple ecosystem, then it's a no-go for me. On the lower end, sub-$500 headsets, sure, but anything above that needs to be integrated with Apple for me.
 
lol. Is Crysis still the benchmark? It probably runs on an iPad these days. I’d have thought Cyberpunk at 4K with full path tracing would be a sterner test.



Macs have never been sold as games machines. I suppose reaching 100 AAA games on the Mac would be a milestone worth celebrating.



This is more like it. But not if it's twice the price of an equivalent PC. macOS isn't that much of a draw.
I only said Crysis because it would be a funny throwback inside joke for people. It would cause that "I know that reference" feeling amongst people :p

Yep, exactly. Showing a few frames of literally about 100 AAA games running on the Mac would definitely let the world know they've arrived to be taken seriously in gaming.

True...however, it would absolutely keep their core audience around instead of having us throw money at Puget Systems lol.
 
To be fair, I don't think anybody NEEDS any of this outside of work lol. But I get your point. However, if Somnium headsets can't be 100% integrated into the Apple ecosystem, then it's a no-go for me. On the lower end, sub-$500 headsets, sure, but anything above that needs to be integrated with Apple for me.

Point being, Apple is charging "professional Mac" money for an iPad with a novelty goggle UI. The AVP should have been Quest-priced - should have been iPad-priced - because nothing it does in terms of immersive three-dimensionality is "professional Mac" quality, lacking SteamVR, precision tracking, and better-than-iPad graphics. It's a novelty way to work with Apple's 2D iCloud-enabled apps, not a 3D content authoring peripheral.

But Apple is unlikely to ever bring headset computing to the Mac beyond remote screens, because when you see how VR is used in the real world, the OS platform is meaningless - the apps furnish the user experience. There's nothing for Apple to value-add to VR. We saw what happened last time - Apple had SteamVR, and HTC headsets, and they used it for... spherical video.

BTW hope you and your peeps are safe with the fires.
 
Point being, Apple is charging "professional Mac" money for an iPad with a novelty goggle UI. The AVP should have been Quest-priced - should have been iPad-priced - because nothing it does in terms of immersive three-dimensionality is "professional Mac" quality, lacking SteamVR, precision tracking, and better-than-iPad graphics. It's a novelty way to work with Apple's 2D iCloud-enabled apps, not a 3D content authoring peripheral.

But Apple is unlikely to ever bring headset computing to the Mac beyond remote screens, because when you see how VR is used in the real world, the OS platform is meaningless - the apps furnish the user experience. There's nothing for Apple to value-add to VR. We saw what happened last time - Apple had SteamVR, and HTC headsets, and they used it for... spherical video.

BTW hope you and your peeps are safe with the fires.
I hear ya, and understand your points. Which is why I think we will see that lower price point with the announcement of the AVP2 or AVP mini or whatever they will end up calling it. My understanding is they're gunning for sub-$1k? That's what I heard anyway.

As for the fires, yeah, we are safe now. I lost my Palisades office, but my home here in the valley is still standing, we are back in our neighborhood, and all of us are safe now. Power's back on and life is back to normal (as normal as it can be after everything). Thank you.
 
As for the fires, yeah, we are safe now. I lost my Palisades office, but my home here in the valley is still standing, we are back in our neighborhood, and all of us are safe now. Power's back on and life is back to normal (as normal as it can be after everything). Thank you.

Sorry to hear about your office, but glad you and yours are in one piece. Those fires were crazy.
 
The RTX 5090 is very hard to ignore. It's around AUD$4000 (compared with AUD$9000 for the W6900X MPX or AUD$7500 for the 6800 Duo MPX), so it's very well priced for what is on offer.

It makes a strong case to move away from Apple. I'm already scoping out building a machine myself.

Not really. It depends on the problem set - the workload you’re running.

The 5080 is great for workloads that fit into its VRAM. Like games.

Larger than that? Performance will tank and even an M2 Ultra with sufficient unified memory will slaughter it. Nvidia do make high end cards but go price check some of their datacenter/engineering workstation cards.

This is partially why the Mac Pro is in a weird spot right now. Those workloads exist, and Apple could go after them, but it isn't really their market yet (?).
 
Not really. It depends on the problem set - the workload you’re running.

The 5080 is great for workloads that fit into its VRAM. Like games.

Larger than that? Performance will tank and even an M2 Ultra with sufficient unified memory will slaughter it. Nvidia do make high end cards but go price check some of their datacenter/engineering workstation cards.

This is partially why the Mac Pro is in a weird spot right now. Those workloads exist, and Apple could go after them, but it isn't really their market yet (?).
Do you think there's a world where it becomes their market?
 
The 5080 is great for workloads that fit into its VRAM. Like games.

Larger than that? Performance will tank and even an M2 Ultra with sufficient unified memory will slaughter it.

Will it, though? I recall it being noted a while back that the M2 Ultra's memory bandwidth *effectively* limits it to 32GB of VRAM. So the M2u is a markedly slower GPU, with effectively no more memory.

Like most things Apple, the efficiency gains that are supposed to justify the flexibility sacrificed by integrating things never materialise before the "older" paradigm has iterated to be better than the (stalled) new, or before the "new" has been obsolesced by the "even newer".
 
Will it, though? I recall it being noted a while back that the M2 Ultra's memory bandwidth *effectively* limits it to 32GB of VRAM. So the M2u is a markedly slower GPU, with effectively no more memory.
Replace M2 Ultra with M4 Max then. My 64 GB Max can use up to 48 GB as VRAM. A 128 GB machine can use 96 GB or 112 GB, I think?
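If you want to sanity-check that on a particular machine, Metal will report the ceiling directly - a minimal sketch, assuming macOS and Swift; the exact figure varies by configuration and OS version:

```swift
import Foundation
import Metal

// Query how much memory the default GPU is allowed to wire down on this Mac.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device found")
}

let limitGB = Double(device.recommendedMaxWorkingSetSize) / 1_073_741_824
print("Unified memory: \(device.hasUnifiedMemory)")
print(String(format: "Recommended max GPU working set: %.0f GB", limitGB))
```

That recommendedMaxWorkingSetSize value is presumably where numbers like ~48 GB on a 64 GB machine come from.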

Either way, the point is that the performance of those cards is workload-dependent, just like the unified memory advantages in what Apple is putting out.

If you need a GPU with access to more than 32 GB of VRAM right now, you can get it in, say, a 64 or 128 GB MacBook Pro M4 Max.

Or you go for something like an Nvidia H200 at $32,000 US per card.

Yeah, the H100/H200 will slaughter the MacBook, but again... it depends what you need and what your price point is.

The unified memory architecture in the M series has some real-world scenarios where it is unbeatable (price to performance, not outright performance, of course) right now (e.g., running BIG LLMs on an affordable machine). But they're not really Apple's traditional markets at the moment.

I mean, let's say you need 80GB of VRAM. Your options right now are an M3 Max, M4 Max, or something like this (random hit from Google, no affiliation, etc.):


Then of course you need to build a machine to put it in....
 
The 5080 is great for workloads that fit into its VRAM. Like games.

Avro was talking about the 5090. 32GB is good for more than playing games.

Larger than that? Performance will tank and even an M2 Ultra with sufficient unified memory will slaughter it.

How useful is 192GB of RAM when paired with a ~RTX3080-level GPU though? And aren't most deep learning tools optimised for / require Nvidia anyway?
 
Avro was talking about the 5090. 32GB is good for more than playing games.



How useful is 192GB of RAM when paired with a ~RTX3080-level GPU though? And aren't most deep learning tools optimised for / require Nvidia anyway?
I mistyped - I meant the 5090.

Everything I mentioned above applies to both anyway.

With regards to the RAM vs. GPU - if the model you're working on won't fit into VRAM then it doesn't matter how powerful the GPU is. And no, there are Metal-optimised runtimes/models.

It's not just LLMs either.
 
With regards to the RAM vs. GPU - if the model you're working on won't fit into VRAM then it doesn't matter how powerful the GPU is.

Presumably there are models that don't fit into 192GB (or whatever's available after OS, app use etc.) either. Is there some mechanism / approach for dealing with models that exceed VRAM? And how would the GPU approach compare to using e.g. an EPYC with 2TB of RAM for this type of work?

Genuine questions - don't claim to know anything about ML.
 
Is there some mechanism / approach for dealing with models that exceed VRAM?
Swap, just like anything else, but it is an order of magnitude slower, so normally not practical.

For the Mac, with unified memory (or any other machine with unified memory) swap is easier.

To swap out GPU memory is more of a multi-stage process: it has to go over the PCIe bus into system RAM and then get swapped out - which takes time to copy to system RAM, time to swap, etc. Swapping back in likewise means reading into system RAM and then copying over the bus to the GPU.

The Mac just reads/writes into system memory which is unified - done.

Really high-end datacenter GPUs are probably (not sure these days) using large amounts of HBM as a cache for large amounts of system memory (like, terabytes), but ultimately people will end up doing unified memory - or there's been talk of moving the processors into the RAM (or vice versa; Apple, for example, effectively moved both RAM and storage very close to the processor).

Nvidia have even put out papers to the effect of "compute is almost free (vs transfer cost), data transfer is expensive". This is why there is so much focus on GPU memory bandwidth. But the flip side is to cut down the amount of data movement you need to do in the first place (e.g. unified memory) - and why Apple are making all the unified memory run at VRAM(ish) speed by using 2-4x as many memory channels as a PC. Which is another reason the RAM is on-package - routing that many memory channels is both space-consuming and expensive. Memory bus trace length is also now impacting the ability to clock memory at higher speeds.

This is why the M4 Pro/Max run LPDDR5X at 8533 MT/s across 4/8 channels, for example - and why you won't find anything in PC land right now doing that in a laptop.

EPYC datacenter CPUs will do 8- or even 12-channel memory, but they won't run it anywhere near those 8533-class speeds.
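To put rough numbers on the width/speed point - a back-of-the-envelope sketch; the bus widths and transfer rates below are illustrative assumptions, not spec-sheet figures:

```swift
// Peak theoretical bandwidth ≈ bus width in bytes × transfer rate.
func peakGBps(busWidthBits: Double, megaTransfersPerSec: Double) -> Double {
    (busWidthBits / 8.0) * megaTransfersPerSec / 1000.0
}

let desktopDDR5 = peakGBps(busWidthBits: 128, megaTransfersPerSec: 5600) // typical 2-channel desktop ≈ 90 GB/s
let m4ProClass  = peakGBps(busWidthBits: 256, megaTransfersPerSec: 8533) // ≈ 273 GB/s
let m4MaxClass  = peakGBps(busWidthBits: 512, megaTransfersPerSec: 8533) // ≈ 546 GB/s
let epycClass   = peakGBps(busWidthBits: 768, megaTransfersPerSec: 4800) // 12-channel DDR5-4800 ≈ 460 GB/s
print(desktopDDR5, m4ProClass, m4MaxClass, epycClass)
```

Same arithmetic either way - the wide, on-package bus is what buys the bandwidth.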
 
Larger than that? Performance will tank and even an M2 Ultra with sufficient unified memory will slaughter it. Nvidia do make high end cards but go price check some of their datacenter/engineering workstation cards.

This is partially why the Mac Pro is in a weird spot right now. Those workloads exist, and Apple could go after them, but it isn't really their market yet (?).

 