
Longplays

Suspended
May 30, 2023
1,308
1,158
Here's something for Nvidia cheerleaders to mull over.

Nvidia's doing an Apple by creating their own SoC.



It was brought up 1+ year ago in this video.


APUs & SoCs are the direction everyone, even Intel, is heading. Users are having a cow over this because Apple is among the most prominent companies doing it first, but when companies like AMD/Intel/Nvidia/Qualcomm eventually do it for the very same reasons of efficiency and performance per watt, they tend to quiet down and point to PassMark scores of those future APUs & SoCs.
 
  • Like
Reactions: Synchro3

Corefile

macrumors 6502a
Sep 24, 2022
754
1,072
I feel like every time not caring about power efficiency is brought up it’s made to be some crazy extreme. Like “no one wants to wire up three phase outlets!” Or “no one wants a jet engine in their cube!”

There’s a lot of room between M2 Ultra and “jet engine in cube.” The old Mac Pro had a TDP of up to 1000 watts without three phase power outlets or jet engine fans.

If we “just” talked about a 500w system that would still be a significant performance jump up from M2 Ultra.
I've been in data centers in Manhattan where the main issues are power consumption, heat dissipation and controlling system noise. They are all critical factors to hardware buying decisions. Performance per watt is a major thing.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
I've been in data centers in Manhattan where the main issues are power consumption, heat dissipation and controlling system noise. They are all critical factors to hardware buying decisions. Performance per watt is a major thing.
I wouldn't blame anyone for thinking in terms of raw performance within a 1.4kW PSU, because that's how Intel et al. trained us to think and what to expect.

When I upgraded my ~2-decade-old home appliances and switched to LED lamps, I went with performance per watt. I was able to cut my home's kWh usage by two-thirds. Over the decade I'll be paying less even as $/kWh rates increase.
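Rough sketch of that math, for anyone curious - only the two-thirds cut is real; the usage and rate figures below are placeholder assumptions:

```python
# Hypothetical numbers: 900 kWh/month before the upgrade, a two-thirds cut
# after it, and a $0.12/kWh starting rate rising 5% per year for a decade.
before_kwh = 900
after_kwh = before_kwh / 3          # usage cut by 2/3rds
rate = 0.12                         # starting $/kWh (assumption)

old_cost = new_cost = 0.0
for year in range(10):
    yearly_rate = rate * 1.05 ** year   # rates keep climbing
    old_cost += before_kwh * 12 * yearly_rate
    new_cost += after_kwh * 12 * yearly_rate

print(f"10-year cost on old hardware:   ${old_cost:,.0f}")
print(f"10-year cost after the upgrade: ${new_cost:,.0f}")
# The upgraded home pays 1/3 as much no matter how fast rates rise.
```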

My neighbors, who still use hardware that is >2 decades old because it still works, will be whining about their power bills.

My ductless mini-split AC had the lowest operational noise at the time of purchase. When I visit newer rooms with cheap ACs, I actually notice their noise.
 

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
At the end of the day, modern Apple is built around the iPhone. The M-series processors are derivatives of the iPhone’s class-leading A-series, and are consequently excellent laptop chips. This addresses the majority of Mac users, so no one can fault Apple’s business logic here.

Apple’s higher end desktops are left with stitching two laptop SoCs together, however, as the size of that market doesn’t warrant a dedicated design. This is fine for many applications, but problematic for those that require strong GPU performance. For one thing, it results in too many CPU cores relative to GPU cores. For another, it precludes adding dedicated GPUs at all.

Realistically, this is the best Apple can do without pouring money into a loss-leader. But let’s not kid ourselves that it’s some kind of optimum solution.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
Please point to an online review of a $3,999 turnkey system, without display, keyboard or mouse, that has an acoustic performance of 6 dB at idle.
Again - there is a wide range of fan noise between "jet engine" and 6 dB.

Someone saying they are ok with more fan noise does not mean they want "jet engine." Heck - 30 dB would be roughly five times the perceived loudness and is still whisper-quiet on the dB scale.
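For anyone checking that figure, here's the dB arithmetic - a sketch using the usual rules of thumb (+10 dB reads as roughly twice as loud), not a precise psychoacoustic model:

```python
# Compare a 6 dB idle machine with a hypothetical 30 dB one.
idle_db, louder_db = 6, 30
delta = louder_db - idle_db                 # 24 dB difference

perceived = 2 ** (delta / 10)               # ~5.3x perceived loudness
pressure = 10 ** (delta / 20)               # ~15.8x sound pressure
power = 10 ** (delta / 10)                  # ~251x acoustic power

print(f"+{delta} dB is about {perceived:.1f}x as loud "
      f"({pressure:.1f}x pressure, {power:.0f}x power)")
```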

I wouldn't blame anyone for thinking in terms of raw performance within a 1.4kW PSU, because that's how Intel et al. trained us to think and what to expect.

When I upgraded my ~2-decade-old home appliances and switched to LED lamps, I went with performance per watt. I was able to cut my home's kWh usage by two-thirds. Over the decade I'll be paying less even as $/kWh rates increase.

My neighbors, who still use hardware that is >2 decades old because it still works, will be whining about their power bills.

My ductless mini-split AC had the lowest operational noise at the time of purchase. When I visit newer rooms with cheap ACs, I actually notice their noise.

Some people are ok with the increased power draw. That's called choice. Mac Pro owners probably would be ok with higher power draw. People who don't want that power draw can buy a Mac Studio.

Again - choice. People aren't dumb. They weren't put under mind control by Intel. They just want more processing power and are willing to trade more wattage to get it. That's a reasonable thing.

M2 Ultra is not enough processing power for everyone - especially in the Mac Pro forum.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
At the end of the day, modern Apple is built around the iPhone. The M-series processors are derivatives of the iPhone’s class-leading A-series, and are consequently excellent laptop chips. This addresses the majority of Mac users, so no one can fault Apple’s business logic here.

Apple’s higher end desktops are left with stitching two laptop SoCs together, however, as the size of that market doesn’t warrant a dedicated design. This is fine for many applications, but problematic for those that require strong GPU performance. For one thing, it results in too many CPU cores relative to GPU cores. For another, it precludes adding dedicated GPUs at all.

Realistically, this is the best Apple can do without pouring money into a loss-leader. But let’s not kid ourselves that it’s some kind of optimum solution.
Where Apple goes the industry follows.

AMD, Intel, Nvidia, Qualcomm & Mediatek are all heading towards this direction of efficiency and specialization through SoCs.

Since last year, AMD Advantage and Intel Deep Link have been trying to leverage their CPU & dGPU pairings to improve performance. This would leave Nvidia out if they remain CPU-less or SoC-less in the PC space.

Nvidia having their own ARM SoC for the desktop would resolve that, especially with Microsoft's push for ARM laptops and eventually desktops.

The "two SoCs stitched together" approach is already being done with Nvidia's Grace Hopper Superchip:

[Images: Grace Hopper and Grace CPU Superchip overview diagrams]

Source: https://developer.nvidia.com/blog/nvidia-grace-cpu-superchip-architecture-in-depth/
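For a rough sense of scale, here are the vendors' headline die-to-die bandwidth figures (marketing numbers; sustained throughput will differ):

```python
# Published peak bandwidth for the two chip-to-chip interconnects:
# Apple's UltraFusion (M1/M2 Ultra) and Nvidia's NVLink-C2C (Grace Hopper).
ultrafusion_gb_s = 2500    # Apple quotes 2.5 TB/s
nvlink_c2c_gb_s = 900      # Nvidia quotes 900 GB/s

ratio = ultrafusion_gb_s / nvlink_c2c_gb_s
print(f"UltraFusion: {ultrafusion_gb_s} GB/s, NVLink-C2C: {nvlink_c2c_gb_s} GB/s "
      f"(~{ratio:.1f}x)")
```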

I watched WSJ's "Chili's New Efficiency Pivot", and one thing they did that mirrors what Apple did with their Mac chips was standardizing their chicken/fish/shrimp batter from two (tempura + American) to just American batter. This lowered Chili's costs, sped up service, reduced mistakes and created net savings for them.

Naturally, people who want the tempura batter complained, but it made Chili's more efficient and specialized.
 

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
Apple are certainly influential, but it’s not like the rest of the industry blindly follow whatever they do. The 2013 Mac Pro’s concept was ignored in favour of business-as-usual towers, with Apple eventually conceding its flaws and performing a U-turn.

The M2 Ultra is more akin to Grace-Grace. Grace Hopper is the kind of solution Apple might ideally aim for, but that would require an all-new co-processor that’s mostly GPU. Could they ever justify the expense of its development?
 
  • Like
Reactions: prefuse07

Longplays

Suspended
May 30, 2023
1,308
1,158
Apple are certainly influential, but it’s not like the rest of the industry blindly follow whatever they do. The 2013 Mac Pro’s concept was ignored in favour of business-as-usual towers, with Apple eventually conceding its flaws and performing a U-turn.
The 2000 Power Mac G4 Cube failed; only around 150,000 units were sold.

A year later, Shuttle launched their first SFF PC, the Shuttle SV24. This kicked off smaller-than-ATX form factor PCs.

[Image: Shuttle SFF PC]


Apple then released the 2005-2024 Mac mini, 2013 Mac Pro, 2017 iMac Pro and 2022-2025 Mac Studio.
The M2 Ultra is more akin to Grace-Grace. Grace Hopper is the kind of solution Apple might ideally aim for, but that would require an all-new co-processor that’s mostly GPU. Could they ever justify the expense of its development?
The chip's designed for AI and the cloud. Linus mentioned each Grace Hopper Superchip costs about $100k.

Minicomputers ruled the 70s & 80s. When PCs using Intel/AMD chips started showing up, companies like DEC, then the 2nd-largest computer company in the world, decided to go higher-end rather than compete in the same tech space. DEC did this because their clients wanted them to keep offering high-end products, similar to how Intel/AMD/Nvidia push EPYC, Xeon, Core i9 and RTX dGPUs today.

DEC eventually sold itself to Compaq. Compaq didn't know what to do with them and eventually sold itself to HP.

Apple isn't afraid to obsolete their own tech and their own products/services. They did that with 68K, PowerPC and Intel on the way to Apple Silicon. They did it as well with OS 9 to Mac OS X, and again with the iPod to the iPhone.

While high-end users want raw performance and 1.4kW PSUs, Apple evaluates where the future is, and that future is efficiency and specialization.

The 0.01% demanding macOS on x86 & RTX are... going to be forced to change to laptop/desktop SoCs from Nvidia, Intel, AMD, Qualcomm, etc. as the economies of scale of a separate CPU & dGPU suffer.
 
  • Like
Reactions: wegster

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
While high-end users want raw performance and 1.4kW PSUs, Apple evaluates where the future is, and that future is efficiency and specialization.

The 0.01% demanding macOS on x86 & RTX are... going to be forced to change to laptop/desktop SoCs from Nvidia, Intel, AMD, Qualcomm, etc. as the economies of scale of a separate CPU & dGPU suffer.
This argument is super sloppy.

Apple could have shipped a more powerful SoC. In this very thread you’re talking about Grace - a much more powerful SoC.

M2 Extreme wouldn’t have made everyone happy. But a lot of this is about Apple’s failure to ship M2 Extreme. That has nothing to do with SoCs or efficiency or blah blah blah. Apple failed to ship M2 Extreme. That’s not some sort of wise plan. That’s an engineering failure.

Nvidia shipped powerful ARM SoCs. Apple could too. You’re talking a lot about trends and followers. When is Apple going to follow the trend set by Nvidia that you seem so fond of?
 

Longplays

Suspended
May 30, 2023
1,308
1,158
Nvidia shipped powerful ARM SoCs. Apple could too. You’re talking a lot about trends and followers. When is Apple going to follow the trend set by Nvidia that you seem so fond of?
Today is the 1st time I mentioned Nvidia. ;)

I mentioned them to give validity to sticking two Max chips together. If Nvidia does it too, then it provides legitimacy in the eyes of those who like Nvidia.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
Today is the 1st time I mentioned Nvidia. ;)

I mentioned them to give validity to sticking two Max chips together. If Nvidia does it too, then it provides legitimacy in the eyes of those who like Nvidia.

But you're all over the place. You're talking up Grace, which is definitely going into machines that sound like jet engines and suck a lot more power than a Mac Studio. Do you actually think people are putting bare Grace boards into machines that cool themselves? Those servers are going to need big power supplies and coolers.

Power consumption and fan noise have nothing to do with SoCs - they're two different things. Grace proves the legitimacy of what people are asking for. There is a market for higher-end SoCs that pull more power and require more cooling. There are a lot of users in between M2 Ultra and Grace who are looking for something.

You've successfully argued against your own point that people aren't asking for higher end SoCs. Grace is the proof. Or is Grace more Intel brainwashing?
 

Longplays

Suspended
May 30, 2023
1,308
1,158
But you're all over the place. You're talking up Grace, which is definitely going into machines that sound like jet engines and suck a lot more power than a Mac Studio. Do you actually think people are putting bare Grace boards into machines that cool themselves? Those servers are going to need big power supplies and coolers.

Power consumption and fan noise have nothing to do with SoCs - they're two different things. Grace proves the legitimacy of what people are asking for. There is a market for higher-end SoCs that pull more power and require more cooling. There are a lot of users in between M2 Ultra and Grace who are looking for something.

You've successfully argued against your own point that people aren't asking for higher end SoCs. Grace is the proof. Or is Grace more Intel brainwashing?
I point to Grace as an example of other companies doing an Apple, and no one makes a fuss over it.

If it is good for the goose then it is good for the gander.

Specifically, Nvidia doing their own SoC & sticking two chips together via NVLink, like what Apple does with UltraFusion.

The Mac Studio actively cools the Max and Ultra chips inside it.

Grace is an AI/cloud chip.

I never said there is no demand for higher-end SoCs.

I was even pointing to an M3 Extreme as a likely solution to the 192GB RAM limit of the M2 Ultra.
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
People need to realize benchmarks are not everything. There are certain workflows where even my M1 Ultra, with its GPU and Media Engine scaling issues, beats my 13900K and 4090 Windows system. But on paper/benchmarks? It doesn't - and why is that a shock? My Windows system has a 1000-watt PSU, water cooling, overclocked RAM, etc., compared to my very small M1 Ultra Mac Studio. I'll take my real-world results that save me hours of time on my projects.

Things with Apple are not just CPU/GPU cores anymore. Especially in my line of work, with a lot of 3D modeling, having access to more memory for the GPU helps, and the media encoders help my video editing workflow. ML can dip into the Neural Engine, which is separate from the CPU/GPU cores.
 

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
The 2000 Power Mac G4 Cube failed; only around 150,000 units were sold.

A year later, Shuttle launched their first SFF PC, the Shuttle SV24. This kicked off smaller-than-ATX form factor PCs.

I agree the G4 Cube influenced the PC cube craze of the early 2000s; perhaps if the G4 had been a bit more practical, it could have had similar commercial success.

Apple then released the 2005-2024 Mac mini, 2013 Mac Pro, 2017 iMac Pro and 2022-2025 Mac Studio.

Not sure what your point is here. How does this demonstrate Apple's influence on the industry? These are all Apple machines, some more successful than others (two only lasted a single generation). It's also worth noting that despite the iMac's 'success', AIOs have never been a thing in the PC market, because other options exist.

The chip's designed for AI and the cloud. Linus mentioned each Grace Hopper Superchip costs about $100k.

OK, but you're the one that brought up Grace Hopper. I assumed you were referencing its overall architecture, as shown in your diagram, rather than suggesting Apple makes a $100K Mac Studio.

Minicomputers ruled the 70s & 80s. When PCs using Intel/AMD chips started showing up, companies like DEC, then the 2nd-largest computer company in the world, decided to go higher-end rather than compete in the same tech space. DEC did this because their clients wanted them to keep offering high-end products, similar to how Intel/AMD/Nvidia push EPYC, Xeon, Core i9 and RTX dGPUs today.

DEC eventually sold itself to Compaq. Compaq didn't know what to do with them and eventually sold itself to HP.

OK, but in that story x86 was the underdog, offering just enough performance at a much lower price, displacing the big-iron heavy hitters. ARM (or RISC-V) overall may be on a similar trajectory, but Apple Silicon certainly isn't cheaper than x86.

Apple isn't afraid to obsolete their own tech and their own products/services. They did that with 68K, PowerPC and Intel on the way to Apple Silicon. They did it as well with OS 9 to Mac OS X, and again with the iPod to the iPhone.

I think you'll find that x86 obsoleted 68K and PowerPC, not Apple. In both cases, Apple had no option but to transition (the second time to x86 itself), or become obsolete themselves. Same with OS 9, which was absolutely creaking by the late 90s, after Apple had failed several times to come up with a replacement. If they hadn't bought NeXT, Windows 2000 / XP would have buried Apple. OS 9 didn't even support pre-emptive multitasking.

While high-end users want raw performance and 1.4kW PSUs, Apple evaluates where the future is, and that future is efficiency and specialization.

If high-end users want raw performance, why doesn't Apple just give them raw performance? Why are they 'evaluating the future' and providing efficiency and specialisation instead? High end users would happily use less power, but it's not their prime consideration.

Bear in mind that Intel's best desktop / laptop chips outperform ASi whilst being on 10nm, whereas Apple enjoys the advantage of 5nm production. If / when Intel sort out their production, the gap could close pretty rapidly. A 5nm 13900K would be impressive.

The 0.01% demanding macOS on x86 & RTX are... going to be forced to change to laptop/desktop SoCs from Nvidia, Intel, AMD, Qualcomm, etc. as the economies of scale of a separate CPU & dGPU suffer.

No one's demanding macOS on x86 or RTX. Ultimately, it's just a cost-benefit of computing power x OS quality x price. Apple have a great OS, and in general make nice machines. Their laptops in particular are compelling. The issue with their desktops are a) a lack of user upgradeability / serviceability, b) a lack of high end GPUs, c) prices that start reasonably (apart from the Mac Pro), but skyrocket with spec upgrades.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
Not sure what your point is here. How does this demonstrate Apple's influence on the industry? These are all Apple machines, some more successful than others (two only lasted a single generation).
The point I am getting at is that Apple eventually made SFF & PCIe-slot-free desktops commercially successful.
It's also worth noting that despite the iMac's 'success', AIOs have never been a thing in the PC market, because other options exist.
All-in-one (AIO) PCs are one of the major product types of desktops and their shipments will amount to 11.06 million units, accounting for 12.8% of the global desktop volumes in 2022.


By comparison, PC workstations, which typically use Xeon, EPYC & Threadripper chips, shipped <7.7 million units in the same year.
OK, but you're the one that brought up Grace Hopper. I assumed you were referencing its overall architecture, as shown in your diagram, rather than suggesting Apple makes a $100K Mac Studio.
The point I was making is that the concept is sound, and people critical of it are not forward-thinking unless their darling brand does it themselves.
OK, but in that story x86 was the underdog, offering just enough performance at a much lower price, displacing the big-iron heavy hitters. ARM (or RISC-V) overall may be on a similar trajectory, but Apple Silicon certainly isn't cheaper than x86.
Macs weren't cheap using x86 either.

At the same/similar/lower price points as Intel Macs, Apple was able to improve battery life, power consumption, size, weight, thermals, form factor, operational noise and other quality-of-life areas lacking in their x86 counterparts.
I think you'll find that x86 obsoleted 68K and PowerPC, not Apple. In both cases, Apple had no option but to transition (the second time to x86 itself), or become obsolete themselves. Same with OS 9, which was absolutely creaking by the late 90s, after Apple had failed several times to come up with a replacement. If they hadn't bought NeXT, Windows 2000 / XP would have buried Apple. OS 9 didn't even support pre-emptive multitasking.
I am pointing out that Apple is not loyal or emotional about any specific tech. When they see a now-defunct tech ascending toward the top of its S-curve, they start looking for alternatives that are at the bottom of their own S-curves and will outpace the incumbent.

[Images: S-curve and innovator's-dilemma performance-trajectory charts]


Apple's RISC-V job position may be a sign of what is to come 1-2 decades from now.

“There’s an old Wayne Gretzky quote that I love. ‘I skate to where the puck is going to be, not where it has been.’ And we’ve always tried to do that at Apple. Since the very very beginning. And we always will.” - Steve Jobs

If high-end users want raw performance, why doesn't Apple just give them raw performance? Why are they 'evaluating the future' and providing efficiency and specialisation instead? High end users would happily use less power, but it's not their prime consideration.
There are a couple of ways to look at this. One: Apple failed with the M2 Extreme for fab reasons and will make another attempt in Q1 2025. The other: the Mac raw-performance users aren't numerous enough to be worth the effort. No one can satisfy all users all the time. If Apple can address ~99.99% of typical Mac use cases within their sweet spot, then it is good enough. Try again next gen, which has been observed to come every ~19.5 months.

This is unlike x86, where <7.7 million PC workstations shipped last year. This is why it is often suggested that maybe it is time to switch.
Bear in mind that Intel's best desktop / laptop chips outperform ASi whilst being on 10nm, whereas Apple enjoys the advantage of 5nm production. If / when Intel sort out their production, the gap could close pretty rapidly. A 5nm 13900K would be impressive.
That isn't the direction of the PC industry as a whole. Today and into the foreseeable future, it is efficiency and specialization.

Why is Nvidia producing ARM SoCs?
No one's demanding macOS on x86 or RTX. Ultimately, it's just a cost-benefit of computing power x OS quality x price. Apple have a great OS, and in general make nice machines. Their laptops in particular are compelling. The issue with their desktops are a) a lack of user upgradeability / serviceability, b) a lack of high end GPUs, c) prices that start reasonably (apart from the Mac Pro), but skyrocket with spec upgrades.
Look at any thread concerning the M2 Ultra and you will see users demanding i9 and RTX for their ability to run synthetic benchmarks far better than any Ultra chip could hope to.

Workstations are less than 2.62% of the whole PC market. Macs shipped 28.6 million units worldwide in 2022. That translates to ~75,000/year pro desktop Macs: ~60,000/year Mac Studio & ~15,000/year Mac Pro.

The 20% of Mac Pro users who demand swappable parts and a separate CPU & dGPU are equivalent to ~3,000 units annually.
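A quick sanity check of those percentages - the 292M total-PC figure is an approximate industry number for 2022 (my assumption); the rest are the estimates above:

```python
# Numbers from the post, plus an assumed ~292M global PC market for 2022.
total_pcs_2022 = 292_000_000
workstations = 7_700_000            # "<7.7 million" workstations shipped
macs_2022 = 28_600_000              # Macs shipped worldwide in 2022

print(f"Workstation share of PC market: {workstations / total_pcs_2022:.2%}")

mac_pro_per_year = 15_000           # estimated annual Mac Pro units
want_expandability = 0.20           # "20% of Mac Pro users"
units = mac_pro_per_year * want_expandability
print(f"Units demanding swappable parts: ~{units:,.0f} per year")
print(f"Share of all Macs: {units / macs_2022:.3%}")
```

That last line works out to ~0.010%, which is where the ~0.01% figure below comes from.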

This is the sad reality of capitalism at work, which keeps people paid for the goods and services they provide as product refreshes and product support.

How do the economies of scale work out for ~0.01% of Macs sold annually? Swappable parts do not translate into any direct sales revenue for Apple. Via OCLP, Mac Pro users can extend the useful life of their Mac Pro from the industry-observed standard of 4-6 years to 14-16 years or even longer.

Someone pointed out that large install sites, like those at NASA, have scientists still enjoying the services of dozens of 2012 Mac Pros today. That's a decade+ by now.

The market grows smaller every year because cheaper alternatives that are "good enough" keep moving typical tower users to other devices. With the niche approaching that of audiophiles at best and mainframes at worst... what to do?

As early as 2020, Apple abandoned x86 for that reason. This has emboldened Qualcomm, Microsoft and other ARM SoC players to give Windows 11/12 on ARM another go. Why include Microsoft? Because they missed the bus with the iPhone, iPad and MacBooks, which they're trying to make up for with Surface devices. x86 is not very good in those form factors.

When ARM laptops become as wildly successful as MacBooks, what will happen to legacy x86 as a whole? ~80% of all PCs, Mac and x86 alike, are laptops. Where will R&D money for Xeon, EPYC and Threadripper come from? ~90% of Mac chip R&D is funded by the nearly 300 million annual iPhone/iPad sales - the equivalent of all x86 shipments from both AMD & Intel, excluding Macs. Android ARM R&D is funded by ~1 billion smartphones shipped annually.
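Rough scale comparison behind that claim, using approximate 2022 shipment figures (ballpark assumptions, apart from the numbers already quoted):

```python
# Approximate 2022 unit shipments (ballpark industry figures).
iphone_ipad = 290_000_000      # "nearing 300 million" iPhones + iPads
total_pcs = 292_000_000        # global PC shipments (assumption)
macs = 28_600_000              # Macs shipped worldwide (from above)
android_phones = 1_000_000_000 # "~1 billion" Android smartphones

x86_ex_mac = total_pcs - macs  # AMD + Intel x86 shipments, excluding Macs
print(f"iPhone/iPad units funding Apple Silicon R&D: {iphone_ipad / 1e6:.0f}M")
print(f"x86 PC units (excluding Macs):               {x86_ex_mac / 1e6:.0f}M")
print(f"Android units funding ARM SoC R&D:           {android_phones / 1e9:.1f}B")
```

Same ballpark, which is the point: the volume funding ARM SoC development dwarfs what funds workstation-class x86.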

A repeat of this conversation will likely occur before the 2030s, but with Windows, as those Android ARM SoCs are also all about efficiency and specialization.
 

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
I won't be giving up mine any time soon. But it will be interesting to see if next year's macOS will still support Intel machines.

Intel machines were for sale, for tens of thousands of dollars, able to be optioned as the most expensive product Apple has ever sold, only 22 days ago.
 

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
The point I am getting at is that Apple eventually made SFF & PCIe-slot-free desktops commercially successful.

On the Mac platform. Which hasn't offered the equivalent of a regular desktop PC for a couple of decades. It's either a laptop, an iMac, a mini or an expensive workstation. Or just buy a Windows PC, as most desktop owners would. Hence a massive games market for Windows, and very little on macOS.

All-in-one (AIO) PCs are one of the major product types of desktops and their shipments will amount to 11.06 million units, accounting for 12.8% of the global desktop volumes in 2022.


So 87.2% of desktops use a tower case, OK.

By comparison, PC workstations, which typically use Xeon, EPYC & Threadripper chips, shipped <7.7 million units in the same year.

That's hardly surprising. Workstations are expensive, high-end machines; most AIO PCs are likely cheap Point of Sale machines or similar.

The point I was making is that the concept is sound, and people critical of it are not forward-thinking unless their darling brand does it themselves.

Macs weren't cheap using x86 either.

You're mixing up the point. x86 was the cheap upstart in the DEC era; ARM would be the equivalent today. Though so far, Apple Silicon is no cheaper than x86; prices have generally gone up (e.g. by £1200 for the new Mac Pro).

At the same/similar/lower price points as Intel Macs, Apple was able to improve battery life, power consumption, size, weight, thermals, form factor, operational noise and other quality-of-life areas lacking in their x86 counterparts.

x86 had all those advantages when Apple chose it to replace PPC. Will ASi forever hold its lead? Only time will tell.

I am pointing out that Apple is not loyal or emotional about any specific tech. When they see a now-defunct tech ascending toward the top of its S-curve, they start looking for alternatives that are at the bottom of their own S-curves and will outpace the incumbent*.

*For a while, before x86, which essentially is the computer industry, inevitably catches up and overtakes. As with 68K and PPC. If Apple had used x86 from the beginning, they could have avoided years of constantly trying to play catch up with x86. The key benefit of the x86 transition was that Macs finally had hardware parity with the rest of the computer industry, leaving Apple to concentrate on their key advantage - macOS.

Apple's RISC-V job position may be a sign of what is to come 1-2 decades from now.

OK. Ditto everyone else I guess.

“There’s an old Wayne Gretzky quote that I love. ‘I skate to where the puck is going to be, not where it has been.’ And we’ve always tried to do that at Apple. Since the very very beginning. And we always will.” - Steve Jobs

Yeah, we all love Steve. He was the ultimate salesman.

There are a couple of ways to look at this. One: Apple failed with the M2 Extreme for fab reasons and will make another attempt in Q1 2025. The other: the Mac raw-performance users aren't numerous enough to be worth the effort. No one can satisfy all users all the time. If Apple can address ~99.99% of typical Mac use cases within their sweet spot, then it is good enough. Try again next gen, which has been observed to come every ~19.5 months.

Sure. Apple always cherry pick what they'll make and what they won't. The spoils of owning the macOS platform.

This is unlike x86, where <7.7 million PC workstations shipped last year. This is why it is often suggested that maybe it is time to switch.

Already am.

That isn't the direction of the PC industry as a whole. Today and into the foreseeable future, it is efficiency and specialization.

And the past too. Process node shrinks have always brought speed and power efficiency benefits. This is how Moore's Law works. Plus, all modern CPUs include FPUs, dedicated h.264/5 decoding, SIMD etc. Again, nothing new.

Why is Nvidia producing ARM SoCs?

I dunno, you tell me.

Look at any thread concerning the M2 Ultra and you will see users demanding i9 and RTX for their ability to run synthetic benchmarks* far better than any Ultra chip could hope to.

* And the ability to destroy M2 in real life 3D rendering too, of course. You can check out M2 Ultra vs. Nvidia Optix rendering performance if you like, but you might cry.

Workstations are less than 2.62% of the whole PC market. Macs shipped 28.6 million units worldwide in 2022. That translates to ~75,000/year pro desktop Macs: ~60,000/year Mac Studio & ~15,000/year Mac Pro.

The 20% of Mac Pro users who demand swappable parts and a separate CPU & dGPU are equivalent to ~3,000 units annually.

It's a self-fulfilling prophecy. Most people who prioritise the ability to expand their machine either never used Mac OS, or switched to Windows some time ago.

This is the sad reality of capitalism at work, which keeps people paid for the goods and services they provide as product refreshes and product support.

How do the economies of scale work out for ~0.01% of Macs sold annually? Swappable parts do not translate into any direct sales revenue for Apple. Via OCLP, Mac Pro users can extend the useful life of their Mac Pro from the industry-observed standard of 4-6 years to 14-16 years or even longer.

Someone pointed out that large install sites, like those at NASA, have scientists still enjoying the services of dozens of 2012 Mac Pros today. That's a decade+ by now.

The market grows smaller every year because cheaper alternatives that are "good enough" keep moving typical tower users to other devices. With the niche approaching that of audiophiles at best and mainframes at worst... what to do?

Whatever, it ultimately just rules out Macs for customers that demand strong GPU performance at reasonable cost, which includes large institutions such as universities. This is nothing new. Where that leaves the platform long term, we will see.

As early as 2020, Apple abandoned x86 for that reason. This has emboldened Qualcomm, Microsoft and other ARM SoC players to give Windows 11/12 on ARM another go. Why include Microsoft? Because they missed the bus with the iPhone, iPad and MacBooks, which they're trying to make up for with Surface devices. x86 is not very good in those form factors.

When ARM laptops become as wildly successful as MacBooks, what will happen to legacy x86 as a whole? ~80% of all PCs, Mac and x86 alike, are laptops. Where will R&D money for Xeon, EPYC and Threadripper come from? ~90% of Mac chip R&D is funded by the nearly 300 million annual iPhone/iPad sales - the equivalent of all x86 shipments from both AMD & Intel, excluding Macs. Android ARM R&D is funded by ~1 billion smartphones shipped annually.

A repeat of this conversation will likely occur before the 2030s, but with Windows, as those Android ARM SoCs are also all about efficiency and specialization.

ARM doesn't offer any inherent advantage; as it gets higher performing, it runs into the same issues that chips using any other ISA have. Apple Silicon benefits from a) Apple's specific CPU / GPU core designs, b) being made on 5nm, c) using unified memory. The last one has pros and cons.
 
  • Like
Reactions: chrash

orionquest

Suspended
Mar 16, 2022
871
791
The Great White North
Some of the heralded performance comes from the Media Engine components in Apple Silicon, which greatly accelerate many ProRes operations on the platform. This is not pure CPU computing power, but rather specific technology that accelerates tasks well beyond what a normal generic CPU architecture can handle. In light of this, no generic x86 technology can accommodate this level of acceleration.
Yes it does, but this is very similar to how apps offload image processing to the GPU, specifically Nvidia. RED recommends processing 8K workflows with an Nvidia-equipped GPU. FCP does have acceleration via Metal for RED, but we are back to: how can we accelerate this further by adding more GPU muscle? You can't with Apple.
 

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
Doesn't mean they will continue with OS updates, especially now they are all AS.
Apple does what Apple does.

True, but places like Europe are a lot more bolshy about software support & related warranties nowadays. For example, in Europe all software comes with a minimum 2-year warranty - all bugs have to be fixed at no cost for 2 years after purchase.
 
  • Like
Reactions: turbineseaplane

orionquest

Suspended
Mar 16, 2022
871
791
The Great White North
I feel like every time not caring about power efficiency is brought up it’s made to be some crazy extreme. Like “no one wants to wire up three phase outlets!” Or “no one wants a jet engine in their cube!”

There’s a lot of room between M2 Ultra and “jet engine in cube.” The old Mac Pro had a TDP of up to 1000 watts without three phase power outlets or jet engine fans.

If we “just” talked about a 500w system that would still be a significant performance jump up from M2 Ultra.
I don't think Apple will go that high with a system. If fab processes can continue to shrink, that will offset the consumption of the increased transistor count. Plus, who knows what other efficiencies they might come up with.

This is a guess, I have no real knowledge of how this works.
 

orionquest

Suspended
Mar 16, 2022
871
791
The Great White North
True, but places like Europe are a lot more bolshy about software support & related warranties nowadays. For example, in Europe all software comes with a minimum 2-year warranty - all bugs have to be fixed at no cost for 2 years after purchase.
Wonder if Apple can get around that by stating the OS that shipped with the hardware was Catalina 10.15, which would be out of date by now. Not sure if ones sold currently, or 2 weeks ago, would have shipped with Ventura. 🤔
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
I don't think Apple will go that high with a system. If fab processes can continue to shrink, that will offset the consumption of the increased transistor count. Plus, who knows what other efficiencies they might come up with.

This is a guess, I have no real knowledge of how this works.

No, Grace is much higher end than what Apple would ship. But there is plenty of room between Grace and M2 Ultra. My point was simply that M2 Ultra is not the natural end of some SoC performance curve.

The point I am getting at is that Apple eventually made SFF & PCIe-slot-free desktops commercially successful.

Did they? Or did they just leave the market?

The Mac Studio is not commercially successful compared to everyone else. It ships a fraction of the traditional PC market.

Withdrawing from a market doesn't magically make the rest of your products some big success.
 