
Joe The Dragon

macrumors 65816
Jul 26, 2006
1,031
524
NVIDIA are now making CPUs, and they sure aren't x86s for running legacy workflows:


There are two ways of looking at this: it's what Apple could do with Apple Silicon if they felt like investing a fortune in new silicon and breaking into the data centre market (the parallels with Apple Silicon are obvious...)... but they're not going to be able to compete at that level by super-gluing four M2 Pros together.

The question is, where does a $10k-$50k personal super-workstation like the 2019 Mac Pro fit in the marketplace when a $3k laptop can do what your Mac Pro was doing a few years ago and, if you need more, you can just rent an order of magnitude more computing power in the cloud?
Just as long as they drop the storage being RAID 0 only with no hot swapping.
And they need to add an IPMI (one that can also do DFU mode, and that has the full image in it as well)
 

leman

macrumors Core
Oct 14, 2008
19,521
19,678
No Apple laptop, nor any cloud solution, can do today what a workstation desktop can do (or even what a gaming PC can do, and could do years ago): drive a high-definition, high-frame-rate XR headset.

Apple GPUs have been designed with XR in mind. They have realtime framebuffer compression, adaptive rasterization, and state-of-the-art sparse texturing support. In combination, these features can massively reduce the amount of work and memory bandwidth required to drive a high-res headset.
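As a rough illustration of why those bandwidth reductions matter: a naive, uncompressed dual-panel framebuffer budget can be estimated directly, and assumed compression/foveation factors (the 2:1 and 4x below are illustrative guesses, not Apple specs) show how quickly it shrinks.

```python
# Back-of-envelope framebuffer bandwidth for a dual-display headset.
# All factors below are illustrative assumptions, not measured values.
def fb_bandwidth_gbs(w, h, fps, bytes_per_px=4, eyes=2):
    """Raw GB/s needed just to scan out uncompressed frames."""
    return w * h * bytes_per_px * fps * eyes / 1e9

raw = fb_bandwidth_gbs(3840, 2160, 120)  # two 4K panels at 120 Hz: ~8 GB/s
compressed = raw / 2                     # assume ~2:1 lossless framebuffer compression
foveated = compressed / 4                # assume ~4x savings from adaptive/foveated shading
```

Even these toy numbers show why on-the-fly compression and adaptive rasterization are the difference between "impossible on a mobile memory bus" and "feasible".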
 

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
Apple GPUs have been designed with XR in mind. They have realtime framebuffer compression, adaptive rasterization, and state-of-the-art sparse texturing support. In combination, these features can massively reduce the amount of work and memory bandwidth required to drive a high-res headset.

Prepping for WWDC, reading Snow Crash & watching Johnny Mnemonic...

Hey, y'all think Apple might push more over-priced utilitarian products (Apple Cloth) at us with the release of the headset...?

Apple iDrops, amiright...? ;^p
 

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
Apple GPUs have been designed with XR in mind. They have realtime framebuffer compression, adaptive rasterization, and state-of-the-art sparse texturing support. In combination, these features can massively reduce the amount of work and memory bandwidth required to drive a high-res headset.

And they can't drive competitive high resolution 3D at high frame rates on one display, let alone two simultaneously.
 
  • Like
Reactions: prefuse07 and GianL

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
That's because Apple silicon/macOS/Metal has been designed from day one around fully-synced driving of two displays; right now it "stumbles" with no second synced display to render out to; that is what is making ASi GPU stuff appear "slow"...

TrUe StOrY, bRuV... ;^p
 

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
Less than a week to go...

Will we see, at the minimum, a preview of the ASi Mac Pro...?

Will we get just the M2 Ultra, or might we get 3nm M3 Ultra/Extreme variants...?

With hardware ray-tracing... ;^p

Will we get something different (Think Different) for the ASi Mac Pro; maybe a collection of chiplets like AMD, maybe a SuperChip like Nvidia, who knows...?

Or maybe, just maybe, we will get nothing at all...

Keynote ends with Tim Cook kicking a can across the stage...
 

theluggage

macrumors G3
Jul 29, 2011
8,015
8,449
No Apple laptop, nor any cloud solution, can do today what a workstation desktop can do (or even what a gaming PC can do, and could do years ago): drive a high-definition, high-frame-rate XR headset.
It depends on what you mean by "high definition", "high frame rate" and "years ago". Current iPhones and iPads can very much do what gaming PCs were doing "years ago" and they're not getting worse. Sure, you can do better with a workstation, but the money and potential growth for a company like Apple would come from fixing that and making something that consumers will want to use day-to-day. Which means cloud-driven and in an appliance the size of a phone.

VR tethered to a desktop workstation isn't going anywhere - figuratively or literally. It's a niche market which NVIDIA, AMD and Intel have pretty much sewn up, and even Apple's existing Intel/AMD Mac Pro isn't exactly the system of choice.

Maybe Apple will pull something out of their hat - possibly along the lines of those NVIDIA compute modules shown in the video - but finding some way to put a M2 Ultra in a tower and have it drive a 4090 (or AMD's equivalent) is unlikely to impress.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,678
Maybe Apple will pull something out of their hat - possibly along the lines of those NVIDIA compute modules shown in the video - but finding some way to put a M2 Ultra in a tower and have it drive a 4090 (or AMD's equivalent) is unlikely to impress.

Apple's main challenge with respect to the GPU is chip area. The M2 Max is roughly the same size as the AD102 (the chip powering the 4090), but Nvidia can pack 4x as many GPU compute clusters into it. To overcome this disadvantage, Apple either needs to glue multiple dies together or resort to some other trickery, neither of which comes cheap.

And this becomes particularly interesting when we look at Nvidia's Grace Hopper. This is pretty much the same paradigm as Apple Silicon, only they use heterogeneous RAM for the CPU and GPU dies plus a fast interconnect, and of course, everything is bigger. The GPU component is essentially a modification of AD102 and features a similar number of SMs. Now for the fun part: a single Grace Hopper board will probably cost well over $100k. I mean, the carrier board alone is $200k.
 

theluggage

macrumors G3
Jul 29, 2011
8,015
8,449
Will we see, at the minimum, a preview of the ASi Mac Pro...?

If not, I think that would just about wrap it up for the Mac Pro, which has already been in limbo for 3 years. These things aren't impulse buys - customers have to build them into plans and project proposals.

Will we get something different (Think Different) for the ASi Mac Pro; maybe a collection of chiplets like AMD, maybe a SuperChip like Nvidia, who knows...?
Well, NVIDIA, Amazon, Ampere, etc. are all working on ARM-based superchips for the data centre, and Apple tends not to make "me too" products (well, apart from the Watch). They opted out of the data centre years ago when they dropped the Xserve.

A good product for Apple would put the Mx Max/Ultra that they already have to good use.

Or maybe, just maybe, we will get nothing at all...
If the M2 Ultra Mac Studio appears then it might as well be the new Mac Pro even if they don't call it that. As a successor to the Trashcan Mac Pro it ticks most of the boxes. Or, they could make a 1U rack version of the Studio (so you could rack it up with storage and PCIe enclosures) and call that the Mac Pro.

I wouldn't rule out the rumored 2019-style box with a M2 Ultra and a few PCIe slots - but I think it would be a damp squib, and I suspect that leakers have just been seeing a test platform for new processors.

OK, wild mad guess time: I'm gonna go with Mx Ultra "compute units" packaged as MPX-like cards plugging into a Cheesegrater-like chassis & clustering over a PCIe bus. Possibly even backwards-compatible with the 2019 MP.
 
  • Like
Reactions: Adult80HD

leman

macrumors Core
Oct 14, 2008
19,521
19,678
OK, wild mad guess time: I'm gonna go with Mx Ultra "compute units" packaged as MPX-like cards plugging into a Cheesegrater-like chassis & clustering over a PCIe bus.

This has been my favorite concept for quite some time now. It would fit very well into the Apple Silicon paradigm, allow expandability (even RAM, potentially!), and it would be a very Apple-like solution.

The main problem with this: the programming model. NUMA is notoriously difficult and requires extra effort on the software side.
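To make the NUMA point concrete, here's a toy cost model (the latency numbers are made up for illustration): the same four tasks pay twice the access cost under a placement that ignores where their data lives, which is exactly the kind of thing schedulers and allocators have to be taught about.

```python
# Toy NUMA cost model. Latencies are illustrative, not measured:
# a remote-node memory access is assumed ~3x slower than a local one.
LOCAL_NS, REMOTE_NS = 100, 300

def total_latency(task_nodes, data_nodes):
    """Sum access cost: local if a task runs on the node holding its data."""
    return sum(LOCAL_NS if t == d else REMOTE_NS
               for t, d in zip(task_nodes, data_nodes))

data_placement = [0, 0, 1, 1]                         # node holding each task's data
naive = total_latency([0, 1, 0, 1], data_placement)   # round-robin scheduler
aware = total_latency(data_placement, data_placement) # NUMA-aware scheduler
```

Real systems expose this through placement APIs (e.g. `numactl`/libnuma on Linux); the point of the sketch is only that software has to opt in to locality — the hardware can't hide it.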
 

theluggage

macrumors G3
Jul 29, 2011
8,015
8,449
Now for the fun part: a single Grace Hopper board will probably cost well over $100k. I mean, the carrier board alone is $200k.
Sure, but they're designed for the data centre where they'll serve dozens of users, if not cloud service providers where they'll bring in money 24/7.

It's not like you get that many single-user Mac Pros for half a million.

...when you've got wide area network infrastructure that can deliver multiple streams of 4k video to domestic users, good enough latency that people are already gaming over it, and a workforce that is increasingly demanding to be able to work from home, the days of $50k single-user workstations are probably coming to an end.
 
  • Like
Reactions: novagamer

leman

macrumors Core
Oct 14, 2008
19,521
19,678
Sure, but they're designed for the data centre where they'll serve dozens of users, if not cloud service providers where they'll bring in money 24/7.

It's not like you get that many single-user Mac Pros for half a million.

Exactly, my comment was more about the cost of this type of technology and the resulting viability for the consumer market.
 

novagamer

macrumors regular
May 13, 2006
234
315
Sure, but they're designed for the data centre where they'll serve dozens of users, if not cloud service providers where they'll bring in money 24/7.

It's not like you get that many single-user Mac Pros for half a million.

...when you've got wide area network infrastructure that can deliver multiple streams of 4k video to domestic users, good enough latency that people are already gaming over it, and a workforce that is increasingly demanding to be able to work from home, the days of $50k single-user workstations are probably coming to an end.
Popping back in quickly to say that I agree with this, and in fact would go even further: ~5 years from now, I doubt any home user will be able to buy a modern compute-capable (as in, near state-of-the-art) card for their normal-ish PC. Nvidia is very clearly pivoting to compute-on-demand; if you watch their long presentation about "digital twin", this is obvious IMO.

They will still sell gaming cards, but they're going to be incredibly neutered, much like how FP64 has been for the past 3-4 generations on Nvidia and the past 2 generations on AMD.

Ironically I think Apple could, if they choose to, be the only remaining entrant in the "do it all yourself" HPC local-cloud world in 5+ years, since it's going to make a lot more money and sense for everyone else to rent that functionality out, plus gather valuable data on the non-government accounts using their tooling. If they do this I think it'll take a long time since Apple is slow to identify these niches and will probably wait for the industry to leave a gaping hole before they possibly decide to fill it.

...

The Mac Pro comments continue to astound me. Your 2019 Mac Pro is not getting an Apple Silicon card, Christ. Whoever mentioned the Mac Pro coming off leases is correct: companies that purchase $40k workstations write them down over 3 years and are already into their next cycle of replacements, and many are not even using Macs at this point.

The people saying there's no point to an M2 Ultra Mac Pro - not true: audio pros want a Mac with PCIe slots. I think it would be an enormous mistake to release an M2 Mac Pro of any flavor, but stranger things have happened. They may do a pre-announcement of an M3 Mac Pro, release M3 MBAs this summer, then MacBook Pros in the fall (remember, the M2 Max MBP was 3-4 months late), then the Mac Pro toward the end of the year or early 2024. This is the best case I think any of us can hope for.

I also would bet 50/50 that we do get AMD GPUs with the Mac Pro since Apple Silicon cannot keep up with the highest-end, and if that VR headset supports tethering optionally it will be a good use case, especially for developers.

If the M3 turns out to be late then I'd expect an M2 Ultra Mac Studio at WWDC, and no Mac Pro until 2024 which would be fine with me.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Just as long as they drop the storage being RAID 0 only with no hot swapping.


Errr, are you referring to the Mac Pro 2019 having two SSD modules (not drives; modules)? Apple is doing nothing there that every SSD with write times about as short as its read times isn't also doing. That isn't "RAID 0"... that is basically how modern fast SSDs work. It is going to be ONLY that. It is ONLY that on mainstream Windows PC workstation boxes that boot off an SSD, too.

Those modules are NOT drives. They are subcomponents of a single drive. You cannot have RAID 0 (in any meaningful sense of the term) with a single drive.

And they need to add an IPMI (one that can also do DFU mode, and that has the full image in it as well)

A second processor, so that a remote user on another computer can blow up the whole security chain of the host machine without the local user knowing? Probably not coming.

Does Apple need better MDM in the Mac space? Yes. Is it going to be old-school IPMI tools? No.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,678
Ironically I think Apple could, if they choose to, be the only remaining entrant in the "do it all yourself" HPC local-cloud world in 5+ years, since it's going to make a lot more money and sense for everyone else to rent that functionality out, plus gather valuable data on the non-government accounts using their tooling. If they do this I think it'll take a long time since Apple is slow to identify these niches and will probably wait for the industry to leave a gaping hole before they possibly decide to fill it.

I think this is the strategy Apple has been following for some time now. I don't know whether they'd want to target HPC in the narrow sense (that's a very special market, after all), but Apple Silicon has a strong focus on local computation.

They will still sell gaming cards but they're going to be incredibly neutered, much like how fp64 has been for the past 3-4 generations on nVidia and past 2 generations on AMD.

FP64 on GPUs is disappearing simply because it's a bad investment of die area. I'd rather have FP32-only SIMD units, maybe with some auxiliary instructions that help me implement faster extended precision if I need it. That would be faster than the 1/64-rate hardware FP64 unit in modern GPUs anyway.
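For what it's worth, the software route alluded to here usually builds on error-free transformations like Knuth's two-sum, which recovers the exact rounding error of a floating-point add; pairs of values carried this way are the building block of "double-double"-style extended precision. A minimal sketch in Python, with float64 standing in for the GPU's native format:

```python
# Knuth's two-sum: given floats a and b, return s = fl(a + b) and e,
# the exact rounding error, so that s + e == a + b holds exactly.
def two_sum(a: float, b: float) -> tuple[float, float]:
    s = a + b
    bb = s - a                      # the part of b that made it into s
    e = (a - (s - bb)) + (b - bb)   # what rounding threw away
    return s, e

# 2**-60 is far below the precision of 1.0, so it is lost in s
# but recovered exactly in e.
s, e = two_sum(1.0, 2**-60)
```

The hardware "auxiliary instructions" in question would accelerate exactly these few dependent adds, which is much cheaper in die area than full-width FP64 ALUs.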

If you want to look for an example where gaming cards are gimped, well, that's memory interfaces. We now have narrow, very highly clocked RAM that uses too much power and can't provide sufficient bandwidth for many larger compute problems. For graphics it's ok, since it has a lot of spatial locality and can be served from caches.
 
  • Like
Reactions: novagamer

leman

macrumors Core
Oct 14, 2008
19,521
19,678
That isn't 'RAID 0'... that is basically how modern fast SSD drives work.

I think it's a useful analogy, though, that helps one understand that SSDs improve speed by data striping. RAID 0 is often used synonymously with striping, which is of course not technically correct, but you know...

Of course, claiming that the use of this technology makes Apple's SSDs less reliable than other SSDs that do exactly the same thing is a bit... weird. So here I fully agree with you.
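The striping idea itself is simple enough to sketch: split the data into fixed-size chunks and deal them round-robin across independent flash channels, so transfers can proceed in parallel. (Channel count and chunk size here are arbitrary toy values, not anything Apple-specific.)

```python
# Toy model of striping: deal fixed-size chunks round-robin across
# independent channels, the way an SSD controller spreads a write
# over its NAND channels so they can work in parallel.
def stripe(data: bytes, channels: int, chunk: int = 4) -> list[list[bytes]]:
    lanes: list[list[bytes]] = [[] for _ in range(channels)]
    chunks = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    for i, c in enumerate(chunks):
        lanes[i % channels].append(c)
    return lanes

lanes = stripe(b"0123456789abcdef", 2)
```

This is why "it's RAID 0" is a fair analogy for throughput, and also why it says nothing about reliability: there's still one controller and one logical drive.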
 

innerproduct

macrumors regular
Jun 21, 2021
222
353
Is it out of the question that Apple might have been fiddling around with things closer to what Nvidia and other HPC actors have been doing with ARM? Even Amazon and Meta are creating their own chips now, so to me it wouldn't be that far-fetched for Apple to also make an Apple Silicon version for workstations and servers. No rumours, though...
 
  • Like
Reactions: dgdosen

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Or it could be used to solve their problem of increasing manufacturing costs on cutting edge nodes. They could put compute clusters on 3nm while keeping cache/display controller/memory controllers on 5nm. This application is heavily hinted in the patent.

But you are right that I initially mistook it for a way to connect two symmetric dies (like two Max-chips). Apple is explicitly talking about splitting SoC functionality across dies.

The first of the related-document links on the graphic you linked is to

"High bandwidth die to die interconnect with package area reduction"

The "with package area reduction" is really the major point of this patent, more so than the "high bandwidth die to die interconnect". The new-ish part here is really the vertical interposer: that is what gets the major die overlap being leveraged here to shrink the size of the overall package.

Package size reduction really isn't a major problem for the MBP 16", Studio, or Mac Pro (or a large-screen iMac, if they bring that back). I doubt Apple wants to grow the footprint of the Max and Ultra much bigger; putting more into the space they have is more important. This patent doesn't do much of anything to "solve" the size problem an "Extreme" would have. It "buys" a package size reduction by adding a thermal constraint. [Back to painting themselves into a thermal corner, like the Mac Pro 2013, with the extra thermal coupling being "cute".]

For the XR headset they may have some package size constraints. The iPhone and size-limited iPads have them too (if Apple goes "crazy" and does a 14" iPad, then they're getting into laptop-logic-board-size land).

This could show up first on the XR headset, but it is likely not coming for most of the M-series. Longer term, splitting the dies? Yes. As I said in other threads, Apple needs to actually do a good chiplet design (not "twins" of the same thing... an actual chiplet design that does a good function decomposition). The extra-long side trip through the vertical interposer for higher-end high-performance computing? Probably not.
 

novagamer

macrumors regular
May 13, 2006
234
315
I think this is the strategy Apple has been following for some time now. I don't know whether they'd will want to target HPC in the narrow sense (that's a very special market after all), but Apple Silicon has a strong focus on local computation.

Agreed, it is a strong differentiator. I may be off by a couple of years, considering Apple has contributed to more open-source projects lately that benefit from this, but I don't consider them to have fully adopted this approach until their own tooling supports it, which right now is not really the case vs. other IDEs etc. Maybe in a year or two, or they'll shock us with some massively revamped Xcode at WWDC this year (I doubt it).

FP64 on GPUs is disappearing simply because it's a bad investment of die area. I'd rather have FP32 only SIMD units, maybe with some auxiliary instructions that help me implement faster extended precision if I need it. Will be faster than the 1/64 hardware FP64 unit in modern GPUs anyway.
FP64 has been artificially limited on many cards throughout the last few years at the hardware / firmware level, not for die reasons. Radeon VII was the last card with any reasonable FP64 performance for a decent price, and for a while they were selling for a lot of money due to this (not just crypto) but then people discovered – as I did – how much ROCm completely blows.

For the newest GPUs it might be a design consideration but I was using this more as an example of how the GPU Compute industry is stratifying away from the "Workstation under your desk" to compute server you connect to and "Gamer PC that can do some much slower compute" under your desk. They have done that artificially in the past through hardware/software lockouts and may be now designing the cards in this way, definitely.

It's why I'm extremely skeptical the PC Market is ever going to get another consumer Titan card, which sucks, because I liked them.

If you want to look for an example where gaming cards are gimped, well, that's memory interfaces. We now have narrow, very highly clocked RAM that uses too much power and can't provide sufficient bandwidth for many larger compute problems. For graphics it's ok, since it has a lot of spatial locality and can be served from caches.
Agreed 100% with this, good points.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,678
Package size reduction really isn't a major problem for the MBP 16", Studio, or Mac Pro (or a large-screen iMac, if they bring that back). I doubt Apple wants to grow the footprint of the Max and Ultra much bigger; putting more into the space they have is more important. This patent doesn't do much of anything to "solve" the size problem an "Extreme" would have. It "buys" a package size reduction by adding a thermal constraint. [Back to painting themselves into a thermal corner, like the Mac Pro 2013, with the extra thermal coupling being "cute".]

However, it might soon become one if they want to improve performance. The dies are already getting very large. And they have to deal with bad density scaling for SRAM. Breaking the SoC into, say, a higher-density compute die and a lower-density memory/aux die could be one way to continue increasing compute cluster sizes without incurring high manufacturing cost. And package size reduction kind of goes hand in hand with this — as you say, these chips are already big, 2D stacking would make them even bigger.

And yes, you are right, this is solving a different problem than an alleged Extreme. I see this more as a way to continue making fast base SoCs. Whether the thermal constraint you mention is going to be substantial remains to be seen (I would guess that SRAM generates less heat than logic, but no idea). At any rate, Apple has a lot of headroom there. Other companies ship the same die sizes that have to dissipate almost 10x as much power, after all :)

FP64 has been artificially limited on many cards throughout the last few years at the hardware / firmware level, not for die reasons. Radeon VII was the last card with any reasonable FP64 performance for a decent price, and for a while they were selling for a lot of money due to this (not just crypto) but then people discovered – as I did – how much ROCm completely blows.

For the newest GPUs it might be a design consideration but I was using this more as an example of how the GPU Compute industry is stratifying away from the "Workstation under your desk" to compute server you connect to and "Gamer PC that can do some much slower compute" under your desk. They have done that artificially in the past through hardware/software lockouts and may be now designing the cards in this way, definitely.

Good point. Yes, for a while things like FP64 and wireframe rendering were artificially limited as a differentiator between "consumer" and "pro" cards. But recent GPUs indeed dropped almost all FP64 hardware, usually packing one or two ALUs in the "special unit", if at all. With the desire to extract more FP32 compute and the asynchronous SMT program execution on the GPU, supporting high performance FP64 becomes an expensive goal as it complicates instruction scheduling...
 
  • Like
Reactions: novagamer

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
Is it out of the question that Apple might have been fiddling around with things closer to what Nvidia and other HPC actors have been doing with ARM? Even Amazon and Meta are creating their own chips now, so to me it wouldn't be that far-fetched for Apple to also make an Apple Silicon version for workstations and servers. No rumours, though...

This is the question; assuming we get an ASi Mac Pro, what form will the CPU/GPU come in...?
  • AMD-style CPU & GPU chiplets with a larger (process) I/O die/memory controllers/whatnot...?
  • Nvidia-style SuperChip...?
  • Two Mn Ultra daughtercards on a proprietary high-speed backplane...?
  • Mn Extreme (quad SoC configuration) on new interface/packaging technology...?
I like the SuperChip, because it is similar to the asymmetrical SoC concept; one "regular" Mn Max SoC UltraFusioned to a "GPU-specific" SoC; for a CPU-cores to GPU-cores ratio that favors the GPU-core count...

Now take that and have two on the aforementioned high-speed proprietary backplane in a Mac Pro Cube...! ;^p
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Fair enough, though if the intention was to remind us that an ASi Mac Pro is in the works, why say "we believe strongly that Apple Silicon can power and transform experiences from the MacBook Air to all the way up to the Mac Studio"? Why not openly say 'up to the Mac Pro'? Mentioning it would reveal nothing about its configuration or release date.

Because Apple's standard corporate policy is to not talk about future products. When Apple said "Mac Pro later" they really were not talking about the Mac Pro at all. There is no descriptive adjective on the Mac Pro there; "later" refers more to some extremely nebulous event than to the "Mac Pro".

Saying the Mac Pro provides some "transform experience" would attach characteristic features to the Mac Pro. The whole point is to point off to some nebulous event in the future. Whatever "power and transform" activity is going on is reserved for products he can actually talk about.

He would also be opening the door for the reporter to ask follow-up questions about the Mac Pro, trying to pull out even small tidbits and walk him right up to the very edge of what could be said (which is basically nothing substantial). By leaving the Mac Pro out, he is basically directing the follow-ups toward what he can talk about.

If the reporter wanted to discuss how the MP 2019 had changed from 2019 to 2021 with the addition of W6800X MPX modules ... he could have covered that because it is a released product.

When, and if, a "sneak peek" at the Mac Pro does come, it will be a highly crafted, highly scripted event where Apple almost completely drives the discussion content. Not some "road show" in India.

It was also a statement about the potential of Apple Silicon - "we believe". It wasn't a comment about the products they already sell, e.g. "Apple Silicon is doing a great job, from the MacBook Air to the Mac Studio".

The potential for someone who hasn't bought one yet. They are selling Macs here in this interview; that's their primary job. If it doesn't exist openly as a product, you cannot buy it. If you can't buy it, "potential" isn't really something to talk about.

He really isn't talking about M3, M4, and/or M5 series potential here either. And again those are future products.



Or did he just misspeak, and I'm reading too much into it (quite possible)?

He is trying not to say something that could get him fired. He is putting a healthy boundary around a topic he doesn't want to discuss.
 