
RedTheReader

macrumors 6502a
Original poster
Nov 18, 2019
532
1,312
So, there are 2 problems I see here. The first is that although most of AS' advantages in laptops are due to its efficiency, efficiency is much less relevant in desktop machines. Nvidia and AMD proved this with their fall 2020 releases. Their cards drew much more power on the upper-end, with Nvidia going as far as releasing a stock triple-slot card. I was floored; the last time they'd done that was with the Titan Z in 2014. The crucial difference here is that the Titan Z had 2 Titans in it, while the 3090 is only 1 (modern, renamed) Titan. Despite all the efficiency improvement from Kepler to Ampere, Nvidia still realized that they'd be better off cranking the power for their flagship card through the roof. Because people don't care about performance per watt on desktops.

The other problem is the release timing. It doesn't look like the AS Mac Pro will be announced before WWDC or launched before the fall. It's rumored to be coming with M1 graphics cores from late 2020… in the same few months that Nvidia and AMD replace the GPUs that they also launched in late 2020. The M1 lineup's GPU cores should compete well when scaled up to the numbers we'd see in a desktop, but that's if we focus on the 2020 cards… which will be replaced by the time said desktop comes out. This used to be AMD's problem when they'd release competent cards to compete with Nvidia half-way through Nvidia's release cycle. Intel's being criticized for releasing their 1st gen cards in May, only 5 months before Nvidia and AMD release their next-gen ones. I don't want to see Apple one-up them by releasing their 2020-competing machine after the others release their 2022 cards.
 
Last edited:

senttoschool

macrumors 68030
Nov 2, 2017
2,626
5,482
Regarding efficiency, yes, it's not as important on desktops. But higher efficiency means a smaller, quieter machine, and it gives Apple room to crank up the wattage for more performance.

An M1 with 40 CPU cores would win against any 64-core AMD chip and any single Intel chip. You'd have to go to dual AMD/Intel sockets to beat it. A 128-core AS GPU would only be matched by next-gen Nvidia and AMD GPUs. In addition, Apple's GPUs will probably have up to 256GB of unified memory, which beats the hell out of any current and future Nvidia and AMD GPUs.

Regarding release cycles, I think Apple wants to beat Nvidia and AMD in raw GPU performance but I don't think it's the most important thing for them. As mentioned, AS GPUs have an inherent advantage for many workloads because of their massive unified memory. And, Apple wants to accelerate what their users use, not win benchmarks.

Winning with the Mac Pro is not as important as you think. The Mac Pro is more of a "halo" product, not a mass-market product.

The bottom line is that AMD, Nvidia, and Intel can't compete currently with Apple in the biggest market, which is laptops.
 

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
Apple needs to complete their transition from Intel to ASi chips before the end of 2022...

Rumors have the M1 Max only "set up" for dual SoC connections, not quad...

Current M1 Pro/Max SoCs are using LPDDR5 RAM, but LPDDR5X chips (up to 64GB per chip) are on the horizon...

I speculate a Mac Pro Cube, with a single edge connector (PCIe Gen5 x16) to allow mating with an optional expansion chassis for those who need PCIe cards (excepting GPUs, of course)...

Maybe Apple debuts the Mac Pro Cube at WWDC 2022, with dual M1 Max SoCs allowing up to 512GB LPDDR5X RAM...?

Maybe Apple also acknowledges the "short life span" of the dual (quad) M1 Max SoC configuration by extending a trade-in offer for dual/quad M2 Max SoC configured main logic boards in 2023...?

In a quad SoC configuration, with 64GB LPDDR5X chips, Apple could have up to 1TB RAM with up to 2TB/s UMA bandwidth in a Mac Pro Cube...!
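Napkin math on that claim, as a rough sketch; the per-chip capacity and per-SoC bandwidth below are assumptions based on rumored LPDDR5X parts, not confirmed specs...

# Hypothetical quad-SoC Mac Pro Cube memory math; every input is speculation.
NUM_SOCS = 4                # rumored quad M1 Max configuration
PACKAGES_PER_SOC = 4        # M1 Max carries four RAM packages today
GB_PER_PACKAGE = 64         # rumored top LPDDR5X package capacity
BW_PER_SOC_GBS = 546        # assumed 512-bit bus at LPDDR5X-8533; LPDDR5 M1 Max does ~400GB/s

total_ram_gb = NUM_SOCS * PACKAGES_PER_SOC * GB_PER_PACKAGE
total_bw_tbs = NUM_SOCS * BW_PER_SOC_GBS / 1000
print(f"RAM: {total_ram_gb}GB (~{total_ram_gb // 1024}TB)")  # -> 1024GB, i.e. 1TB
print(f"UMA bandwidth: ~{total_bw_tbs:.1f}TB/s")             # -> ~2.2TB/s

So the rumored parts do line up with 1TB and roughly 2TB/s, give or take the actual LPDDR5X speed Apple ships...?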
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,628
1,101
The bottom line is that AMD, Nvidia, and Intel can't compete currently with Apple in the biggest market, which is laptops.
That market is not as big as you think. Apple competes in the premium market, and most companies that depend on Windows-only software don't see Macs as a viable option.

So, there are 2 problems I see here.
My main concern is: does the software that people use on the Mac Pro have native Apple Silicon versions and Metal support?
 

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
Did you mean excepting, or accepting? They mean almost the exact opposite thing in this context, so I just wanted to clarify!

Echo Xray...

The Apple silicon Unified Memory Architecture would seem to make introducing a discrete GPU unfavorable, but we will know quite a bit more come WWDC 2022, I would hope...?

My main concern is: does the software that people use on the Mac Pro have native Apple Silicon versions and Metal support?

I would think Apple might show new versions of Final Cut Pro and Logic Pro alongside the new ASi Mac Pro Cube; those would have to be highly optimized...?

Blender, Resolve, Cinema4D/Redshift, and Octane might also be featured in the keynote...?
 
  • Like
Reactions: iPadified

iPadified

macrumors 68020
Apr 25, 2017
2,014
2,257
So, there are 2 problems I see here. The first is that although most of AS' advantages in laptops are due to its efficiency, efficiency is much less relevant in desktop machines. Nvidia and AMD proved this with their fall 2020 releases. Their cards drew much more power on the upper-end, with Nvidia going as far as releasing a stock triple-slot card. I was floored; the last time they'd done that was with the Titan Z in 2014. The crucial difference here is that the Titan Z had 2 Titans in it, while the 3090 is only 1 (modern, renamed) Titan. Despite all the efficiency improvement from Kepler to Ampere, Nvidia still realized that they'd be better off cranking the power for their flagship card through the roof. Because people don't care about performance per watt on desktops.

The other problem is the release timing. It doesn't look like the AS Mac Pro will be announced before WWDC or launched before the fall. It's rumored to be coming with M1 graphics cores from late 2020… in the same few months that Nvidia and AMD replace the GPUs that they also launched in late 2020. The M1 lineup's GPU cores should compete well when scaled up to the numbers we'd see in a desktop, but that's if we focus on the 2020 cards… which will be replaced by the time said desktop comes out. This used to be AMD's problem when they'd release competent cards to compete with Nvidia half-way through Nvidia's release cycle. Intel's being criticized for releasing their 1st gen cards in May, only 5 months before Nvidia and AMD release their next-gen ones. I don't want to see Apple one-up them by releasing their 2020-competing machine after the others release their 2022 cards.
You definitely have a point. Aside from laptop designers, it's really only supercomputer people who bother with power draw, which leaves the desktop in a strange place. Performance/watt and performance/$ ratios both have an influence. If Apple has the same performance/$ as AMD/Intel/NVIDIA, Apple has a good chance. Otherwise not.

At some point, customers will gravitate toward products that are cost-efficient. The reason might be as trivial as lower noise, environmental impact, or running costs. Look at the development of cars, refrigerators, aircraft, etc. It would be strange if desktops didn't follow this pattern.
 

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
At some point, customers will gravitate toward products that are cost-efficient. The reason might be as trivial as lower noise, environmental impact, or running costs. Look at the development of cars, refrigerators, aircraft, etc. It would be strange if desktops didn't follow this pattern.

I would think PC gaming drives the push for more power usage on the desktop; workstation GPUs follow suit to an extent, since most workstation-class GPUs are undervolted variants of their gaming counterparts with more FP thrown in...?
 
  • Like
Reactions: JMacHack

RedTheReader

macrumors 6502a
Original poster
Nov 18, 2019
532
1,312
If Apple has the same performance/$ as AMD/Intel/NVIDIA, Apple has a good chance. Otherwise not.

I considered adding a 3rd paragraph about price when writing my post but chose not to, because in the past everyone's insisted that the Mac Pro is for customers who don't care about cost. If cost did matter, though, I think it would be a tough sell for Apple if they stuck with their current pricing for the Mac Pro. Consider what senttoschool said below:

A 128-core AS GPU would only be matched by next-gen Nvidia and AMD GPUs.

That's awesome… if Apple prices the system better than they do right now! If it's a $6000 system, I can imagine a 4080 owner just throwing a second 4080 into their system if they need the power or pocketing 35% of the price if they don't. I do think they're going to price it better, though. They won't have to pay the Xeon + ECC tax anymore, after all.
 

sirio76

macrumors 6502a
Mar 28, 2013
578
416
Nvidia still realized that they'd be better off cranking the power for their flagship card through the roof.
If that’s their only way to increase performance, then they will be screwed in the long run, because you cannot just increase TDP indefinitely.*
At some point they will realize that they’ll need a proper power/performance ratio to compete,** and when that happens, Apple and other efficient architectures in general will be years ahead.

*It’s not just power consumption, it’s also heat, noise, a pricier PSU, a pricier cooler, and a larger/pricier case. And BTW, power consumption does matter on desktops and servers too, because if you haven’t noticed, this planet is running out of resources and GPUs are among the most environmentally unfriendly devices.
**In reality they’ve already realized this; that’s why Nvidia is trying to buy ARM, but it seems this is not going to happen (if antitrust works like it should).
 
Last edited:

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
The US$6k entry-level price for a 2019 Mac Pro is too much for the everyday dude; but a lot of that six grand is in the huge custom two-sided main logic board, the 1.6kW PSU, and the hulking custom CNC-machined chassis; not to mention the exorbitantly priced Xeon CPUs...

Apple could reduce their crazy profit margin on RAM & storage to offer a lower entry point for the ASi Mac Pro; they could also release an M1 Pro/Max-powered Mac mini to allow a lower entry-level price for headless high-performance desktop Macs...

A quad M1/M2 Max configuration could probably do fine with a 600W PSU...?
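Rough power budget behind that guess; the per-SoC draw is an estimate (the M1 Max MacBook Pro ships with a 140W adapter, and the SoC itself peaks well below that), not a known figure...

# Hypothetical power budget for a quad M1/M2 Max Mac Pro Cube; all estimates.
SOC_COUNT = 4
WATTS_PER_SOC = 100     # guessed combined CPU+GPU peak per Max SoC
OVERHEAD_WATTS = 100    # assumed SSDs, I/O, fans, conversion losses

peak_draw = SOC_COUNT * WATTS_PER_SOC + OVERHEAD_WATTS
psu_load = peak_draw / 600
print(f"Peak draw ~{peak_draw}W -> a 600W PSU sits at ~{psu_load:.0%} load")
# -> ~500W peak, ~83% load on a 600W PSU; tight but plausible...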
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,628
1,101
When that happens, Apple and other efficient architectures in general will be years ahead... that’s why Nvidia is trying to buy ARM
How can a company that designs CPUs help Nvidia design better GPUs?
 

senttoschool

macrumors 68030
Nov 2, 2017
2,626
5,482
That market is not as big as you think. Apple competes in the premium market, and most companies that depend on Windows-only software don't see Macs as a viable option.
The laptop market is considerably bigger than any other consumer computer market, and far bigger than the workstation market that the OP is concerned about.

Heck, AMD has completely deprioritized Threadripper. We still don't have any Zen3 Threadrippers for workstations.
 

senttoschool

macrumors 68030
Nov 2, 2017
2,626
5,482
That's awesome… if Apple prices the system better than they do right now! If it's a $6000 system, I can imagine a 4080 owner just throwing a second 4080 into their system if they need the power or pocketing 35% of the price if they don't. I do think they're going to price it better, though. They won't have to pay the Xeon + ECC tax anymore, after all.
You're not understanding Apple's target audience with the Mac Pro.

There is no way a 4x M1 Max system is going to sell for $6,000. A 40/128-core SoC is going to replace the top-of-the-line Mac Pros, which cost tens of thousands of dollars.

A 128-core AS GPU + 256GB of unified memory is enterprise-grade. It's not meant to compete with something like a 4080 which will be optimized for gaming.
 

RedTheReader

macrumors 6502a
Original poster
Nov 18, 2019
532
1,312
If that’s their only way to increase performance, then they will be screwed in the long run, because you cannot just increase TDP indefinitely…
That's just it: it's not their only way to increase performance. Nvidia Ampere increased performance dramatically at the same wattage tiers as their previous generation cards. They just realized that they could make an even higher end product by creating a new wattage tier.

A 128-core AS GPU + 256GB of unified memory is enterprise-grade. It's not meant to compete with something like a 4080 which will be optimized for gaming.
I don't think gaming's all that people are going to use a 4080 for! That's probably what I'd use it for :p, but I've already seen at least one person complain that their existing Blender workflow's faster with a 3080 than with the M1 Max.
 
  • Haha
Reactions: appleArticulate

senttoschool

macrumors 68030
Nov 2, 2017
2,626
5,482
You really think that Nvidia is trying to buy ARM just to compete on CPUs?
Actually yes.

Nvidia needs a strong CPU division to compete with AMD, Intel, Qualcomm, and Apple.

Why?

AMD and Intel are winning huge multi-billion dollar government contracts because only they can provide both the CPU and GPU solution together.

And, in the mobile world, it's quite clear that the market is moving towards a SoC approach. This leaves Nvidia vulnerable because most of their revenue is selling discrete GPUs. AMD, Intel, Apple, Qualcomm are creating SoC/APU chips. Nvidia isn't.

Finally, Nvidia doesn't want to rely on AMD and Intel CPUs to couple with their server-grade GPUs. They want to control their own hardware stack.

This all brings it back to ARM. They could license ARM and design their own CPUs, which they already do. But what they really want is a world-class CPU division to combine with their own world-class GPU division. They want to control the direction of ARM products. They want to integrate their own GPUs and IP with ARM, and vice versa. They want to make world-class SoCs and fully integrated enterprise products. That's why they're trying to buy ARM.
 

senttoschool

macrumors 68030
Nov 2, 2017
2,626
5,482
I don't think gaming's all that people are going to use a 4080 for! That's probably what I'd use it for :p, but I've already seen at least one person complain that their existing Blender workflow's faster with a 3080 than with the M1 Max.
Sure, gaming isn't the only thing a 4080 will be able to do. Just like how the M1 Max GPU isn't only for productivity; it can also play games.

But most people buying a 4080 will use it for gaming, maybe with some ML/productivity on the side. And some people who buy the M1 Max will use it for gaming, but most will use it for productivity.
 

sirio76

macrumors 6502a
Mar 28, 2013
578
416
The world will move beyond the traditional CPU/GPU paradigm; right now the fastest supercomputer uses a hybrid ARM SoC, and AS has demonstrated that you can compete on both CPU and GPU with an integrated system. The largest market is laptops: Microsoft is trying to do what Apple did, using Qualcomm, and others including Nvidia will try to do the same and move toward a SoC. That’s what I mean when I say they don’t need to buy ARM just to build a CPU.
 

iDron

macrumors regular
Apr 6, 2010
219
252
Efficiency still matters, as cooling is limited even on desktops, so the more efficient your chip is, the higher the performance you can sustain over a long period of time.

A 40-core M1 CPU would beat any current Intel/AMD chip. The M1 Pro/Max achieve about 12000 on Geekbench 5. The fastest AMD Threadripper is at 25000, the fastest Xeon at 20000.

If you extrapolate these numbers, a 40-core M1 would get above 40000 on Geekbench!

The problem I see with the M1 Duo/Quad is more on the GPU side. An M1 Quad with 128 GPU cores would be at ~40 TFLOPS, which is actually slightly higher than an Nvidia 3090 (~37 TFLOPS, I think).
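Those extrapolations are just naive linear scaling from the shipping parts. Spelled out as a sketch (the inputs are the rough figures above, plus the commonly cited ~10.4 FP32 TFLOPS for the M1 Max GPU; real chips never scale perfectly, so treat the outputs as upper bounds):

# Linear-scaling sketch behind the CPU and GPU extrapolations above.
def scale(baseline, base_cores, target_cores):
    """Naively assume performance scales 1:1 with core count."""
    return baseline * target_cores / base_cores

# CPU: M1 Pro/Max (10 cores, ~12000 Geekbench 5 multi-core) -> 40 cores
print(f"40-core M1: ~{scale(12000, 10, 40):.0f} GB5 multi-core")  # ~48000

# GPU: M1 Max (32 cores, ~10.4 FP32 TFLOPS) -> 128-core 'M1 Quad'
print(f"128-core GPU: ~{scale(10.4, 32, 128):.1f} TFLOPS")        # ~41.6, vs ~37 for a 3090

Interconnect and memory bandwidth will eat into both numbers, which is why I'd treat 40000+ and ~40 TFLOPS as ceilings rather than expectations.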

However, many workstations are equipped with multiple GPUs; Apple even ships the Mac Pro with 2 GPUs on order, and you can fit more in. 2 or 3 Nvidia 3090s beat a 128-core M1 GPU by miles. And you can replace those later, add eGPUs, etc. A Mac Pro with just an SoC cannot compete against that, and that is the only market where SoC computers really fail.

This makes me feel like Apple has to support dGPUs/eGPUs. Either they start making their own, meaning just the GPU cores of their SoC offered as an add-on card for their Mac Pros, which could also be neat as an eGPU for the MBP. I've never heard this as a rumor before, but it actually is not that stupid: Apple already makes custom PCIe cards; just think about the Afterburner. And as mentioned above, 128 Apple GPU cores would be faster than any current AMD/Nvidia offering. So Apple has done this kind of thing in the past, and it would be superior to an AMD/Nvidia solution.
However, I'm not sure if the sales numbers relative to the development effort really make this worthwhile. More likely is that they keep supporting and offering AMD GPU cards in the Mac Pro going forward.
 
Last edited:

crazy dave

macrumors 65816
Sep 9, 2010
1,453
1,229
Actually yes.

Nvidia needs a strong CPU division to compete with AMD, Intel, Qualcomm, and Apple.

Why?

AMD and Intel are winning huge multi-billion dollar government contracts because only they can provide both the CPU and GPU solution together.

And, in the mobile world, it's quite clear that the market is moving towards a SoC approach. This leaves Nvidia vulnerable because most of their revenue is selling discrete GPUs. AMD, Intel, Apple, Qualcomm are creating SoC/APU chips. Nvidia isn't.

Finally, Nvidia doesn't want to rely on AMD and Intel CPUs to couple with their server-grade GPUs. They want to control their own hardware stack.

This all brings it back to ARM. They could license ARM and design their own CPUs, which they already do. But what they really want is a world-class CPU division, and combine it with their own world-class GPU division. They want to control the direction of ARM products. They want to integrate their own GPUs and IPs with ARM, and vice versa. They want to make world-class SoCs, and fully integrated enterprise products. That's why they're trying to buy ARM.

Nvidia wins multi-billion-dollar government contracts for supercomputers as well.

They already have a CPU division that designs everything from workstation SoCs to custom cores - they absolutely do not need to spend $40 billion to make it world-class. The issue extends well beyond desktop, mobile, and server CPUs to AI and automotive.

Yes, ARM would benefit from Nvidia’s deep pockets, but it's questionable that the ecosystem would stay healthy, as it relies on trust and basically no one trusts Nvidia. To be fair, even if the buyer were another ARM customer, trust would be very difficult under these circumstances without a lot more guarantees than Nvidia is apparently providing (beyond pinky swears).
 
Last edited:

crazy dave

macrumors 65816
Sep 9, 2010
1,453
1,229
Efficiency still matters, as cooling is limited even on desktops, so the more efficient your chip is, the higher the performance you can sustain over a long period of time.

A 40-core M1 CPU would beat any current Intel/AMD chip. The M1 Pro/Max achieve about 12000 on Geekbench 5. The fastest AMD Threadripper is at 25000, the fastest Xeon at 20000.

If you extrapolate these numbers, a 40-core M1 would get above 40000 on Geekbench!

The problem I see with the M1 Duo/Quad is more on the GPU side. An M1 Quad with 128 GPU cores would be at ~40 TFLOPS, which is actually slightly higher than an Nvidia 3090 (~37 TFLOPS, I think).

However, many workstations are equipped with multiple GPUs; Apple even ships the Mac Pro with 2 GPUs on order, and you can fit more in. 2 or 3 Nvidia 3090s beat a 128-core M1 GPU by miles. And you can replace those later, add eGPUs, etc. A Mac Pro with just an SoC cannot compete against that, and that is the only market where SoC computers really fail.

This makes me feel like Apple has to support dGPUs/eGPUs. Either they start making their own, meaning just the GPU cores of their SoC offered as an add-on card for their Mac Pros, which could also be neat as an eGPU for the MBP. I've never heard this as a rumor before, but it actually is not that stupid: Apple already makes custom PCIe cards; just think about the Afterburner. And as mentioned above, 128 Apple GPU cores would be faster than any current AMD/Nvidia offering. So Apple has done this kind of thing in the past, and it would be superior to an AMD/Nvidia solution.
However, I'm not sure if the sales numbers relative to the development effort really make this worthwhile. More likely is that they keep supporting and offering AMD GPU cards in the Mac Pro going forward.

There are a lot of different questions surrounding what a big, modular AS Mac Pro might look like. Too difficult to tell at this point. The current rumors point to a smaller AS Mac Pro mini coming out first … what everyone else would call … a desktop. Which is actually quite exciting. It’s even rumored that Apple will release one final Intel Mac Pro, and there’s evidence of such a thing existing in Xcode (typically a sure sign). So a large modular AS Pro machine may be a little further off if they keep the form factor.
 
  • Like
Reactions: Tagbert

crazy dave

macrumors 65816
Sep 9, 2010
1,453
1,229
If that’s their only way to increase performance, then they will be screwed in the long run, because you cannot just increase TDP indefinitely.*
At some point they will realize that they’ll need a proper power/performance ratio to compete,** and when that happens, Apple and other efficient architectures in general will be years ahead.

*It’s not just power consumption, it’s also heat, noise, a pricier PSU, a pricier cooler, and a larger/pricier case. And BTW, power consumption does matter on desktops and servers too, because if you haven’t noticed, this planet is running out of resources and GPUs are among the most environmentally unfriendly devices.
**In reality they’ve already realized this; that’s why Nvidia is trying to buy ARM, but it seems this is not going to happen (if antitrust works like it should).
Tons of efficiency-related IP? You really think that Nvidia is trying to buy ARM just to compete on CPUs? If they wanted to do that, they could simply buy an ARM licence and build their own CPU like many others.

You are correct that Nvidia is interested in a lot more IP than just CPU from this deal, but that doesn’t really translate to their GPU offerings or to making those more efficient.
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,628
1,101
Tons of efficiency-related IP?
What IP or expertise can ARM offer to Nvidia to improve its GPUs? CPUs and GPUs are very different.

Nvidia would benefit more from Imagination IP than ARM IP to create more efficient GPUs.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,678
Tons of efficiency-related IP? You really think that Nvidia is trying to buy ARM just to compete on CPUs? If they wanted to do that, they could simply buy an ARM licence and build their own CPU like many others.

Nvidia is the only big GPU company without its own CPU IP. That puts it at a significant disadvantage in the supercomputing market. Nvidia managed to secure a commanding lead there with their GPUs, but the next step is offering hybrid unified-memory solutions, and for that Nvidia needs to combine CPU and GPU tech.

That's just it: it's not their only way to increase performance. Nvidia Ampere increased performance dramatically at the same wattage tiers as their previous generation cards.

I am not sure it qualifies as dramatic. We mostly see 15-20% at the same power consumption levels, and most of these wins come from using faster RAM. The difference is much smaller on mobile, e.g. the 3050 is less than 10% faster than a 1650 at the same TDP, and it's even using a smaller process!

Nvidia would benefit more from Imagination IP than ARM IP to create more efficient GPUs.

I think there is very little chance that Nvidia wants to go for TBDR. They are heavily investing in APIs that leverage the IMR nature of their GPUs.
 
  • Like
Reactions: JMacHack