
Disappointed with Mac Pro 2023?


  • Total voters
    534

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
And internal SATA.
Doesn't the internal SATA seem odd? Where's the SATA controller? It seems hard to believe that the M2 Ultra would include a SATA controller when, as far as I know, no Apple Silicon product has ever had one. Are they using an x1 PCIe lane for a discrete SATA controller?

(And yes, the spec sheet does say SATA)
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
Doesn't the internal SATA seem odd? Where's the SATA controller? It seems hard to believe that the M2 Ultra would include a SATA controller when, as far as I know, no Apple Silicon product has ever had one. Are they using an x1 PCIe lane for a discrete SATA controller?

(And yes, the spec sheet does say SATA)

Most likely a separate controller somewhere, fed through the same switch as the rest of the PCIe slots.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
This. Building a PC, or using a tower at all, is exclusively the realm of gamers. And even they hate the state of the hardware business as it dies off to low volume. By 2030 even most gaming "towers" will be based on SoCs; there will be no way to make a system with discrete RAM, CPU and GPU competitive with what a single high-bandwidth, low-latency package can do. It's not even that iGPUs will catch up to dGPU performance, it's that the concept of a dGPU will hit a performance ceiling and only putting the GPU, CPU and RAM on package will allow the envelope to be pushed further.

No matter your budget in time, money and space, you could not build a modern PC out of vacuum tubes. Or standalone transistors. Or simple ICs. And soon, there will be no way a GPU or RAM module connected by a hundred or so pins over a ~12" copper trace can compete with what can be done by putting everything nanometers apart. RAM stacked directly over the die, CPU and GPU cores separated only by nm-scale memory controllers, etc.

This has always been coming. Hardware always evolves into more tightly integrated components as the miniaturization of manufacturing allows.
What you describe is beauty.

SoC is for the win.

x86 desktop towers and modularization will still have a market as large and as relevant as mainframes.

Windows 11 on ARM laptops are coming at the end of the year from Qualcomm and the other Android SoC vendors, who together ship more than a billion smartphone SoCs annually. They will be more attractive to consumers than any x86 PC laptop, much less any desktop.

Apple's efficiency edge comes mainly from three things: the node they use, their PDN (power delivery network) technology, and their packaging.


Their ARM cores are actually more complex than their x86 competitors': significantly wider, with larger resources for out-of-order execution and speculation. Most people assume there is some kind of "magic" that makes ARM better than x86, but that is not the case. The ISA has little impact on overall power consumption given the same microarchitectural resources.


Apple uses their larger, more complex cores to their advantage by running them at a slower clock rate while doing more work per clock cycle. This lets them operate at the frequency/power sweet spot for their process. Note that power consumption rises far more than linearly with frequency.
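That superlinear curve can be sketched with the classic dynamic-power relation P ≈ C·V²·f, where voltage has to rise with frequency past the sweet spot. A toy model, with all constants made up for illustration (they are not measured values for any real chip):

```python
# Toy dynamic-power model: P = C * V^2 * f.
# Past the efficiency sweet spot, voltage must rise roughly with frequency,
# so power grows close to f^3. All constants here are illustrative.

def dynamic_power(freq_ghz, c_eff=1.0, v_at_1ghz=0.7, v_slope=0.25):
    voltage = v_at_1ghz + v_slope * (freq_ghz - 1.0)  # crude V/f curve
    return c_eff * voltage ** 2 * freq_ghz

for f in (2.0, 3.0, 4.0, 5.0):
    print(f"{f:.1f} GHz -> {dynamic_power(f):.2f} (arbitrary units)")
# In this model, going from 3 GHz to 5 GHz (~1.7x the clock) costs
# more than 3x the power -- why a wide core at low clocks wins on efficiency.
```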


Here is where PDN technology comes into play. Apple uses the most advanced technology available to distribute power and keep all the functional units fed, which requires supplying a lot of instantaneous power. To do so, Apple uses a 3D-stacked architecture of two dies: one for the logic, and another on top (or bottom, depending on how you look at it) to distribute the power. In contrast, almost everyone else has to use the same die for both logic and power distribution.


The irony is that a simpler/smaller ARM core would have to be clocked faster in order to compete with Intel/AMD cores. And it would end up consuming the same high power.


Apple also has a very good SoC design, meaning they integrate most of the system on a single die: the CPUs, the GPU, the NPU (AI accelerator), the codecs (video processing), the camera block, the I/O (USB, WiFi, Ethernet, PCIe/TB, etc.), and the memory controller.


For some workloads, like AI and video encoding, having custom silicon handle them is far more efficient than running them on a general-purpose core.


Lastly, it also comes down to packaging. Apple not only integrates the SoC on a single die, it also puts the memory chips on the same package. This allows them to use low-power mobile DDR chips, and since the chips are on-package, it also eliminates much of the power that memory transactions would burn traveling across the system's PCB.
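The PCB-versus-package point can be put in rough numbers. The picojoule-per-bit figures below are order-of-magnitude values, not measurements of any Apple or PC system:

```python
# Rough I/O energy comparison for memory traffic.
# pJ/bit figures are order-of-magnitude assumptions, not measured values.
PJ_PER_BIT = {
    "on-package LPDDR": 5.0,   # short traces, low-swing signaling
    "DIMMs over PCB": 15.0,    # long traces, termination, higher swing
}

bandwidth_gbs = 100  # example sustained memory traffic in GB/s
bits_per_sec = bandwidth_gbs * 8e9

for kind, pj in PJ_PER_BIT.items():
    watts = bits_per_sec * pj * 1e-12
    print(f"{kind}: ~{watts:.0f} W of I/O power at {bandwidth_gbs} GB/s")
# ~4 W vs ~12 W at the same traffic in this sketch: the on-package link
# spends roughly a third of the interconnect power.
```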


So it's a combination of Apple using a single package where Intel/AMD laptops require multiple packages spread across their PCBs for the same functionality, and of Apple having access to better overall fabrication technology for that single package than AMD/Intel have for theirs.

The trend seems to be that it is becoming more efficient for mobile vendors to scale up their products into laptops, than it is for desktop vendors to scale down their products into laptops.


There is also a key difference in business models: Apple is a systems vendor, meaning they sell the finished product, not just the processors, so they can use some parts of the vertical stack to subsidize others. Apple can afford to make very good SoCs because they don't sell those chips elsewhere, which means they are not as pressured to make them "cheap" in terms of area, for example, since they recoup the profit elsewhere in the product.


In contrast, AMD and Intel sell their processors to OEMs, so they only profit from the processor, not the finished system. They have to prioritize cost by optimizing their designs for area first and power second. This is why both AMD and Intel use smaller cores, which allow for smaller dies but have to be clocked faster to compete on performance; unfortunately, that also increases power.

This is probably the key difference: Apple can afford the larger design that is more power-efficient for the same performance, whereas AMD/Intel have to aim for the smaller design that is less power-efficient for the same performance.
 
Last edited:

Longplays

Suspended
May 30, 2023
1,308
1,158
Most people use laptops, or laptop-class hardware, as their primary machine in all but a few industries. Apple Silicon is great for laptops, so Apple is well positioned there (though competitors will likely start using these "mobile phone" processors in their laptops down the road and eventually hit sufficiently competitive speeds).

But I agree that this is evidence that Apple didn't have anything up their sleeve as far as a typical workstation-class machine goes (with lots of RAM and a top-end, upgradable GPU).

They pieced something together for the 2023 Mac Pro, but the paradigm for Apple hardware will be generally limited to what an SoC can do (with SoC-type RAM modules) all the way up and down the line.
We should hold off on any verdict until 3rd-party benchmarks are out.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
The Mac Pro got all of 2 minutes and 20 seconds in Apple's over-two-hour WWDC event on Monday. That pretty much says it all about what this Mac Pro is to Apple.

Comes across like an "atom" of attention to me.

Optimistically, it's at most 0.1% of all Mac units sold annually.
 
  • Like
Reactions: AlphaCentauri

yurc

macrumors 6502a
Aug 12, 2016
835
1,014
inside your DSDT
Some preliminary benchmarks just dropped.


Based on my guesstimate from my Blender usage, even a fully specced Ultra GPU with the Metal renderer is still no bueno against NVIDIA's OptiX renderer; even consumer-grade RTX 4000-series cards smoke the M2 Ultra. Ballparking the M2 Ultra's GPU power, it probably sits around an AMD RX 7900 XT with HIP, which has a median score of around 3600 (Blender Open Data benchmark).
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
Windows 11 on ARM laptops are coming from Qualcomm and the other Android SoC vendors, who together ship more than a billion smartphone SoCs annually. They will be more attractive to consumers than any x86 PC laptop, much less any desktop.
Not sure I see that - why, as a consumer, do I want an ARM Windows 11 laptop?

If the PC industry actively tries to replace x86 with ARM by essentially removing x86 from Worst Buy and offering only/mostly ARM, I think they are looking at another Vista-type debacle or much worse. A big part of the reason for Vista's poor reception is that after 7 years of NT 5.x and 5 years of XP, people had forgotten just how shaky drivers can be on a new operating system. Doesn't help that you also had the 32-bit to 64-bit transition in the middle, and plenty of vendors did not make 64-bit drivers for things.

It's now been, oh, about 16 years since Microsoft did something that seriously broke driver compatibility. AFAIK, pretty much every device that works on 64-bit Vista will work fine on 64-bit Windows 10/11 (newest feature updates). So no one has had to seriously think about these issues for 15 years.

That won't be true if suddenly the 2024 iteration of Dell's Inspiron 16 is ARM-based where the previous 10 Inspiron 15/16s were x86.

And that's just drivers and that's assuming that you have Apple-quality x86 emulation on the software front. Do you believe that Microsoft will somehow ship 'good enough' x86 emulation, especially when they have 3 decades of 32-bit x86 Windows stuff their emulation needs to be compatible with? Apple had the benefit of Carbon being dead, 32-bit x86 being dead, etc when they designed Rosetta 2... not to mention their experience with PPC to Intel.

Windows on ARM makes sense as a Chromebook-type thing, but if you want a networked device with no local peripherals to run web and Electron apps on, why do you want/need Windows?
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
Some preliminary benchmarks just dropped.


I find it very strange to compare a $7K Mac Pro against Intel and AMD CPUs that alone cost $6k. Of course they are going to be faster. Why not take a 32-core Threadripper or a 28-core Xeon which are much closer to M2 Ultra price-wise?

I think the GB5 scores of the M2 Ultra will end up around 32K once we have enough benchmarks.

Based on my guesstimate from my Blender usage, even a fully specced Ultra GPU with the Metal renderer is still no bueno against NVIDIA's OptiX renderer; even consumer-grade RTX 4000-series cards smoke the M2 Ultra. Ballparking the M2 Ultra's GPU power, it probably sits around an AMD RX 7900 XT with HIP, which has a median score of around 3600 (Blender Open Data benchmark).

I agree with your estimate. It should be roughly comparable to RX7900XT/3090 CUDA, and will obviously be outclassed by Optix renderers. But it has an advantage in the amount of GPU RAM. For complex scenes it will likely outperform any Nvidia consumer GPU.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
Not sure I see that - why, as a consumer, do I want an ARM Windows 11 laptop?

If the PC industry actively tries to replace x86 with ARM by essentially removing x86 from Worst Buy and offering only/mostly ARM, I think they are looking at another Vista-type debacle or much worse. A big part of the reason for Vista's poor reception is that after 7 years of NT 5.x and 5 years of XP, people had forgotten just how shaky drivers can be on a new operating system. Doesn't help that you also had the 32-bit to 64-bit transition in the middle, and plenty of vendors did not make 64-bit drivers for things.

It's now been, oh, about 16 years since Microsoft did something that seriously broke driver compatibility. AFAIK, pretty much every device that works on 64-bit Vista will work fine on 64-bit Windows 10/11 (newest feature updates). So no one has had to seriously think about these issues for 15 years.

That won't be true if suddenly the 2024 iteration of Dell's Inspiron 16 is ARM-based where the previous 10 Inspiron 15/16s were x86.

And that's just drivers and that's assuming that you have Apple-quality x86 emulation on the software front. Do you believe that Microsoft will somehow ship 'good enough' x86 emulation, especially when they have 3 decades of 32-bit x86 Windows stuff their emulation needs to be compatible with? Apple had the benefit of Carbon being dead, 32-bit x86 being dead, etc when they designed Rosetta 2... not to mention their experience with PPC to Intel.

Windows on ARM makes sense as a Chromebook-type thing, but if you want a networked device with no local peripherals to run web and Electron apps on, why do you want/need Windows?

Same reason why Apple moved from Intel to ARM.

x86 has the economies of scale of a quarter of a billion units annually.

Android SoCs have economies of scale of more than a billion units annually.

This will result in cheaper ARM laptops, then eventually "desktops," with better raw performance and performance per watt.

It's counterintuitive unless you understand how levels of integration work.

There is an emotional component to our thinking. ;-)

When Apple released the M1 (5nm) in November 2020, they were a few process nodes ahead (10nm and 7nm in between) of Intel, which had been on 14nm from 2014 to 2020. And the shrink in area is quadratic.
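The quadratic point, made concrete. Node names are marketing labels that only loosely track physical dimensions, so treat this as an idealized illustration rather than a real ratio:

```python
# Idealized area scaling: if features shrink linearly by a factor s, the
# same circuit occupies ~s^2 of the original area. Node names are marketing
# labels, so this is an upper bound on real-world scaling.

def relative_area(old_nm: float, new_nm: float) -> float:
    shrink = new_nm / old_nm      # linear shrink factor
    return shrink ** 2            # area scales with the square

print(f"14nm -> 5nm: {relative_area(14, 5):.1%} of the original area")
# Ideally ~12.8% of the area -- nearly 8x the logic in the same footprint.
```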

What we are witnessing is the same thing that happened when the microprocessor took over the mainframe/supercomputers.

The perception was the system that took a whole room and had lots of blinking lights had to be the more powerful. However, what was happening was that the microprocessor guys were integrating the same functionality that took lots of separate boards on a mainframe down to a few chips.

There were some very specific use cases where the mainframe had the edge, but for the other 99% of applications, we ended up with systems on our desktops that were faster than a computer that took up a whole room. Heck, you can now buy a GPU for $1K that is more powerful than the fastest supercomputer from 2000, which cost millions of dollars, took an entire floor of a datacenter, and used almost a megawatt.

The microprocessor vendors also had access to larger economies of scale, which meant they could spend more money developing their designs and technology, so they were able to leapfrog the old large-system vendors, who had slower development cycles and smaller revenues.

The same thing is now happening with SoCs. They have higher levels of integration, so they can fit a whole PC into a single chip, which means things run faster, with less power, at less cost. And they are leveraging the mobile/embedded markets, which are larger and growing faster than the traditional PC/datacenter business.

The SoC vendors are the ones with access to the larger economies of scale. So they are developing things faster.

How large? SoC vendors who make up 100% of all smartphones shipped

Android (all price points)
- 2021: 1.124 billion units
- 2022: 0.979 billion units

Vs

iPhone ($429-1599)
- 2021: 235.8 million units
- 2022: 226.4 million units

As compared to all x86 vs Apple Silicon Personal Computers shipped

Windows (all price points)
- 2021: 322.2 million units
- 2022: 263.7 million units

Vs

Mac ($999 & up for laptops + $599 & up for desktops)
- 2021: 27.9 million units
- 2022: 28.6 million units

I'll add the SoC vendors who make up 100% of all tablets shipped, including Windows tablets:

Android/Windows (all price points)
- 2021: 110.5 million units
- 2022: 101 million units

vs

iPad ($449-2399)
- 2021: 57.8 million units
- 2022: 61.8 million units

Below are the total units shipped of Macs, iPads & iPhones
- 2021: 321.5 million units
- 2022: 316.8 million units

Apple devices outshipped all Intel/AMD PCs combined in 2022. Apple only caters to the top ~20% of any market it enters. Apple leveraged iPhone and iPad SoC R&D to create >90% of Apple Silicon.

The <10% of R&D for Mac-specific requirements is paid for by Mac revenue.
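A quick cross-check of the arithmetic behind those figures (units in millions, exactly as quoted above):

```python
# Cross-checking the shipment totals quoted above (millions of units).
shipments = {
    2021: {"mac": 27.9, "ipad": 57.8, "iphone": 235.8, "windows_pc": 322.2},
    2022: {"mac": 28.6, "ipad": 61.8, "iphone": 226.4, "windows_pc": 263.7},
}

for year, s in shipments.items():
    apple_total = s["mac"] + s["ipad"] + s["iphone"]
    print(f"{year}: Apple {apple_total:.1f}M vs Windows PCs {s['windows_pc']:.1f}M")
# 2021: 321.5M vs 322.2M (essentially a tie);
# 2022: 316.8M vs 263.7M (Apple clearly ahead).
```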
 
Last edited:

MisterAndrew

macrumors 68030
Sep 15, 2015
2,895
2,390
Portland, Ore.
Without anything compelling over the Studio except for PCIe slots that have limited use, the new Mac Pro is DOA.

They should have kept it Intel Xeon with up to 56 cores, up to 4 TB ECC RAM, the newest AMD GPUs, and a new Afterburner card with M2 Ultra.
 
  • Haha
Reactions: AlphaCentauri

Stevenyo

macrumors 6502
Oct 2, 2020
310
478
Doesn't the internal SATA seem odd? Where's the SATA controller? It seems hard to believe that the M2 Ultra would include a SATA controller when, as far as I know, no Apple Silicon product has ever had one. Are they using an x1 PCIe lane for a discrete SATA controller?

(And yes, the spec sheet does say SATA)
I assume it's handled by the PCIe controller, yes. I'd be shocked if there were a SATA controller on the SoC. However it's implemented, it leaves an option for cheap, high-capacity internal storage at very little cost in time or resources. It's kind of odd that it's there at all, except that it was there on the 2019 Pro and there was nothing else to go in that space on the board/in the case.
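The bandwidth math supports that guess. Assuming a PCIe 3.0 x1 link for a discrete controller (the actual link generation is undocumented, so this is an assumption), one lane comfortably covers a SATA III port:

```python
# Back-of-envelope: can one PCIe lane feed a SATA III controller?
# The PCIe generation is an assumption; Apple doesn't document the topology.

SATA3_LINE_GBPS = 6.0    # SATA III line rate, 8b/10b encoding
sata3_payload = SATA3_LINE_GBPS * (8 / 10) / 8        # ~0.60 GB/s usable

PCIE3_X1_GBPS = 8.0      # PCIe 3.0 per-lane rate, 128b/130b encoding
pcie3_x1_payload = PCIE3_X1_GBPS * (128 / 130) / 8    # ~0.98 GB/s usable

print(f"SATA III payload:    {sata3_payload:.2f} GB/s")
print(f"PCIe 3.0 x1 payload: {pcie3_x1_payload:.2f} GB/s")
# One Gen3 lane has ~60% more usable bandwidth than a SATA III port,
# so hanging a discrete controller off an x1 link is entirely plausible.
```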
 
  • Like
Reactions: Longplays

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
Same reason why Apple moved from Intel to ARM.

x86 has the economies of scale of a quarter of a billion units annually.

Android SoCs have economies of scale of more than a billion units annually.

This will result in cheaper ARM laptops, then eventually "desktops," with better raw performance and performance per watt.

It's counterintuitive unless you understand how levels of integration work.

There is an emotional component to our thinking. ;-)

When Apple released the M1 (5nm) in November 2020, they were a few process nodes ahead (10nm and 7nm in between) of Intel, which had been on 14nm from 2014 to 2020. And the shrink in area is quadratic.

What we are witnessing is the same thing that happened when the microprocessor took over the mainframe/supercomputers.

The perception was the system that took a whole room and had lots of blinking lights had to be the more powerful. However, what was happening was that the microprocessor guys were integrating the same functionality that took lots of separate boards on a mainframe down to a few chips.

There were some very specific use cases where the mainframe had the edge, but for the other 99% of applications, we ended up with systems on our desktops that were faster than a computer that took up a whole room. Heck, you can now buy a GPU for $1K that is more powerful than the fastest supercomputer from 2000, which cost millions of dollars, took an entire floor of a datacenter, and used almost a megawatt.

The microprocessor vendors also had access to larger economies of scale, which meant they could spend more money developing their designs and technology, so they were able to leapfrog the old large-system vendors, who had slower development cycles and smaller revenues.

The same thing is now happening with SoCs. They have higher levels of integration, so they can fit a whole PC into a single chip, which means things run faster, with less power, at less cost. And they are leveraging the mobile/embedded markets, which are larger and growing faster than the traditional PC/datacenter business.

The SoC vendors are the ones with access to the larger economies of scale. So they are developing things faster.

How large? SoC vendors who make up 100% of all smartphones shipped

Android (all price points)
- 2021: 1.124 billion units
- 2022: 0.979 billion units

Vs

iPhone ($429-1599)
- 2021: 235.8 million units
- 2022: 226.4 million units

As compared to all x86 vs Apple Silicon Personal Computers shipped

Windows (all price points)
- 2021: 322.2 million units
- 2022: 263.7 million units

Vs

Mac ($999 & up for laptops + $599 & up for desktops)
- 2021: 27.9 million units
- 2022: 28.6 million units

I'll add the SoC vendors who make up 100% of all tablets shipped, including Windows tablets:

Android/Windows (all price points)
- 2021: 110.5 million units
- 2022: 101 million units

vs

iPad ($449-2399)
- 2021: 57.8 million units
- 2022: 61.8 million units

Below are the total units shipped of Macs, iPads & iPhones
- 2021: 321.5 million units
- 2022: 316.8 million units

Apple devices outshipped all Intel/AMD PCs combined in 2022. Apple only caters to the top ~20% of any market it enters. Apple leveraged iPhone and iPad SoC R&D to create >90% of Apple Silicon.

The <10% of R&D for Mac-specific requirements is paid for by Mac revenue.
I think it is obvious that 'smartphones', or rather things that leverage smartphone economies of scale (most definitely including the TSMC-made Apple Silicon chips), are likely to eat PCs, i.e. x86. Just like x86 PCs running *NIX/Linux ate mini-computers, etc. Won't disagree with that.

But... the fundamental question is why Windows should be around for the ride. No one has been writing any new Windows software for over a decade - Chrome is the new OS for pretty much any new project, with perhaps an Electron version if you want to trick people into thinking you have a native desktop app. What you run on Windows is the latest version of richly featured desktop apps that have a history dating back to the early 1990s. Or, maybe not the latest version, because you have a perpetual license to 2013's version and the vendor hasn't come up with enough innovation to justify you to sign up for a subscription licence for 2023's version.

So... what is ARM promising that would i) cause Windows users to want the ARM hardware, ii) make developers of legacy software, most of whom are probably owned by PE firms who laid off 90% of the developers a decade ago, want to invest the time/money into porting legacy Win32 apps to ARM, iii) make Windows users who have resisted every new OS since at least Vista actually want to upgrade all their third-party software and replace at least some of their peripherals in order to embrace Windows on ARM?

I think it's more likely that Windows will stay x64, x64 will fall behind other architectures, and workloads that are not tied to x64/legacy Windows software will move to other operating systems and platforms leveraging smartphone economies of scale as those platforms get better and x86 stagnates. And perhaps this is why Microsoft seems to have decided to embrace the 'Chrome-is-the-new-OS' web technologies garbage.
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
I assume it's handled by the PCIe controller, yes. I'd be shocked if there were a SATA controller on the SoC. However it's implemented, it leaves an option for cheap, high-capacity internal storage at very little cost in time or resources. It's kind of odd that it's there at all, except that it was there on the 2019 Pro and there was nothing else to go in that space on the board/in the case.
Does the case have a place to mount 2.5"/3.5" SATA drives?
 

bcortens

macrumors 65816
Aug 16, 2007
1,324
1,796
Canada
Some preliminary benchmarks just dropped.


Based on my guesstimate from my Blender usage, even a fully specced Ultra GPU with the Metal renderer is still no bueno against NVIDIA's OptiX renderer; even consumer-grade RTX 4000-series cards smoke the M2 Ultra. Ballparking the M2 Ultra's GPU power, it probably sits around an AMD RX 7900 XT with HIP, which has a median score of around 3600 (Blender Open Data benchmark).
Given the lack of RT hardware I wouldn't expect it to beat them in OptiX; however, I expect it to be closer to 4000 in Open Data if the scaling holds (M2 Pro is around 1000 and M2 Max around 1950).

From a pure GPU-compute perspective the M2 Ultra should be extremely competitive; the problem is Apple still doesn't have dedicated RT cores.
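A sketch of that extrapolation, using the rough per-tier medians quoted above; the 80-90% scaling-efficiency range is an assumption, since Ultra-class GPU configurations historically scale imperfectly:

```python
# Extrapolating an M2 Ultra Blender Open Data score from the rough
# per-tier medians quoted above, assuming GPU score scales with die count.
scores = {"M2 Pro": 1000, "M2 Max": 1950}

# The M2 Ultra is two M2 Max dies, so perfect scaling doubles the Max score.
perfect = 2 * scores["M2 Max"]

# Ultra configurations historically scale imperfectly; 80-90% is a guess.
low, high = 0.8 * perfect, 0.9 * perfect

print(f"Perfect 2x scaling: {perfect}")
print(f"At 80-90% efficiency: {low:.0f}-{high:.0f}")
# Perfect scaling gives ~3900 (the "closer to 4000" above); with losses
# it lands nearer the RX 7900 XT's ~3600 ballpark, or a bit below.
```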
 

Longplays

Suspended
May 30, 2023
1,308
1,158
But... the fundamental question is why Windows should be around for the ride. No one has been writing any new Windows software for over a decade - Chrome is the new OS for pretty much any new project, with perhaps an Electron version if you want to trick people into thinking you have a native desktop app. What you run on Windows is the latest version of richly featured desktop apps that have a history dating back to the early 1990s. Or, maybe not the latest version, because you have a perpetual license to 2013's version and the vendor hasn't come up with enough innovation to justify you to sign up for a subscription licence for 2023's version.

So... what is ARM promising that would i) cause Windows users to want the ARM hardware, ii) make developers of legacy software, most of whom are probably owned by PE firms who laid off 90% of the developers a decade ago, want to invest the time/money into porting legacy Win32 apps to ARM, iii) make Windows users who have resisted every new OS since at least Vista actually want to upgrade all their third-party software and replace at least some of their peripherals in order to embrace Windows on ARM?

I think it's more likely that Windows will stay x64, x64 will fall behind other architectures, and workloads that are not tied to x64/legacy Windows software will move to other operating systems and platforms leveraging smartphone economies of scale as those platforms get better and x86 stagnates. And perhaps this is why Microsoft seems to have decided to embrace the 'Chrome-is-the-new-OS' web technologies garbage.
You may have a point.

My dinosaur brain still thinks Windows has >90% worldwide desktop market share. All this talk of PCIe slots must have gotten to me.

As of May 2023, worldwide desktop market share. Assuming "Unknown" is Chrome OS, they're at par with macOS.

Windows — 61.87%
OS X — 18.87%
Unknown — 13.01%
Chrome OS — 3.54%
Linux — 2.69%
FreeBSD — 0%
 

burgerrecords

macrumors regular
Jun 21, 2020
222
106
We should hold off on any verdict until 3rd-party benchmarks are out.

Verdict, yes, but it's highly probable that Apple doesn't have an Apple Silicon solution for this type of user in the medium or long term (based on CPU core count, GPU extrapolation from the M1 Ultra, and RAM capacity).
 

Longplays

Suspended
May 30, 2023
1,308
1,158
Given the lack of RT hardware I wouldn’t expect it to beat them in Optix, however I expect it to be closer to 4000 in open data if the scaling holds (M2 pro is around 1000 and M2 Max around 1950)

From a pure GPU compute perspective the M2 ultra should be extremely competitive, the problem is Apple still doesn’t have dedicated RT cores.
Nine months ago there were supposed to be RT cores in the iPhone chip. Let's hope they show up three months from now.

Then they could be included in the M3, for eventual deployment in the M3 Ultra in Q1 2024.
 
  • Like
Reactions: bcortens

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
You may have a point.

My dinosaur brain still thinks Windows has >90% worldwide desktop market share. All this talk of PCIe slots must have gotten to me.
And I think it's important to separate "Win32 (or other native Windows API)" from "cross-platform things that happen to run on Windows."

There are plenty of people, especially in businesses, running increasingly web-technologies-centric workloads on Chrome on Windows. Those workloads don't really have that much loyalty to Windows - sure, the IT department may be familiar with Active Directory, Windows management tools, etc, but at some point, if you can run Chrome dramatically better or cheaper on something else, the IT department will go along...

Also, there are a lot of hosted Windows desktops out there - Citrix, Windows 365, Amazon Workspaces, other VDI setups, etc. If you're running your legacy Windows app on some virtualized x86 server in a datacenter somewhere, then for as long as Citrix or whoever has a client for some-other-OS-on-ARM, you don't need Windows on your client machine.

Windows may have 60ish% market share or more, but how much of that is primarily doing things that can move easily enough to something else? 20 years ago, it would have been almost none... now... not so sure...
 

Stevenyo

macrumors 6502
Oct 2, 2020
310
478
Does the case have a place to mount 2.5"/3.5" SATA drives?
Yes and no. There are several cages and brackets available to fill the large empty space behind the SoC cooler near the SATA ports, but a stock 2019 (and now 2023) Mac Pro has no provision for mounting SATA devices.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
Verdict, yes, but it's highly probable that Apple doesn't have an Apple Silicon solution for this type of user in the medium or long term (based on CPU core count, GPU extrapolation from the M1 Ultra, and RAM capacity).
I believe Apple's Ultra chips are more than sufficient for 80% or more of workstation-class use cases.

It is likely that they gave up on the ultra-niche ≤1% of use cases that require more than 192GB of unified memory until Q1 2025, when it could become 384GB.
 
Last edited:
  • Like
Reactions: AlphaCentauri

JouniS

macrumors 6502a
Nov 22, 2020
638
399
As of May 2023 worldwide desktop market share. Assuming "Unknown" is Chrome OS then they're at par with macOS.
"Unknown" should be mostly Windows. If you look at the graph, there are many sudden but temporary drops in Windows market share that match the sudden but temporary increases in "Unknown".
 
  • Like
Reactions: Longplays

zach-coleman

macrumors 65816
Apr 10, 2022
1,282
2,264
Seattle, Washington
This thing seems like it exists only out of obligation. It only has the most niche of uses, to the point where I'm kind of shocked they didn't just say to hell with it and leave the ones who need PCI cards high and dry. Like 99% of the time it seems like a Studio would easily cover whatever this Pro could do.

I was really expecting some kind of booster card that let you attach standard RAM sticks as some kind of hacky "external" and slower RAM pool. It would have been better than nothing.
 
  • Like
Reactions: MisterAndrew