
leman

macrumors Core
Oct 14, 2008
19,518
19,669
No one uses AE on the CPU anymore. Now that Adobe has finally figured out the GPU is the future and optimized most functionality to squeeze your GPU, no Apple computer is faster than the latest desktop GPUs. I am not talking about putting titles over a video, doing some one-point tracking and other basic stuff.

I don't know anything about video editing, but according to PugetBench (which is a popular content creation benchmark, from what I understand), M3 Max gets around 1600 points in AE while 4090-equipped desktops with top Intel CPUs get around 1400.

Video processing is a workload where latency matters a lot. You can have all the compute in the world, but if your processing unit has to wait until the data arrives, you can't use it effectively.
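To put some crude numbers on that (every figure below is an illustrative assumption, not a measurement): even just copying a large frame to a discrete GPU over PCIe eats a big chunk of the per-frame budget before any compute happens, whereas on a unified-memory SoC that copy simply doesn't exist.

```python
# Crude illustration only: all numbers are assumptions, not measurements.
# Shows how much of a per-frame budget a host-to-GPU copy over PCIe can eat
# before any compute happens; on a unified-memory SoC that copy doesn't exist.

FRAME_W, FRAME_H = 7680, 4320       # assumed 8K frame
BYTES_PER_PIXEL = 4 * 4             # assumed 4 channels x 32-bit float
PCIE_BYTES_PER_S = 25e9             # assumed practical PCIe 4.0 x16 throughput
FPS = 30                            # assumed timeline frame rate

frame_bytes = FRAME_W * FRAME_H * BYTES_PER_PIXEL
frame_budget_ms = 1000 / FPS
copy_ms = frame_bytes / PCIE_BYTES_PER_S * 1000

print(f"frame size:   {frame_bytes / 1e6:.0f} MB")
print(f"frame budget: {frame_budget_ms:.1f} ms at {FPS} fps")
print(f"PCIe copy:    {copy_ms:.1f} ms "
      f"({copy_ms / frame_budget_ms:.0%} of the budget)")
```

That is one reason raw TFLOPS alone don't decide this kind of workload.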
 

clv101

macrumors newbie
Jan 19, 2024
23
24
Regarding Apple's GPU performance, is there any indication of what they could theoretically achieve with their hardware with, say, a 300W power budget? Sure, Nvidia are king of absolute performance, but surely if Apple wanted to build a 300W GPU they could equal or better Nvidia?

Could the Pro one day host a dedicated, high power Apple GPU?
 

leman

macrumors Core
Oct 14, 2008
19,518
19,669
Regarding Apple's GPU performance, is there any indication of what they could theoretically achieve with their hardware with, say, a 300W power budget? Sure, Nvidia are king of absolute performance, but surely if Apple wanted to build a 300W GPU they could equal or better Nvidia?

It is difficult to speculate about these things since costs can be hard to predict. For example, one way to increase performance would be to boost the clocks, but we don't know what the realistic clock limits would be or how the rest of the system (caches, memory) would keep up.

Hypothetically though? A theoretical M3 Extreme using 4x M3 Max without any changes in clock or tech should be roughly on par with or faster than the 4090 (in all workloads except ML training) while consuming under 200W. A hypothetical higher-clocked M3 Extreme at 300W would be 20-30% faster than the 4090. But that chip would also be very large and extremely expensive.
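The arithmetic behind that is just linear scaling from one Max die; a quick sketch, where the per-die GPU power and the Max-vs-4090 ratio are rough assumptions on my part, not measured figures:

```python
# Linear-scaling sketch only. Per-die GPU power and the Max-vs-4090 ratio
# are rough assumptions for illustration, not measured figures.

M3_MAX_GPU_WATTS = 45      # assumed GPU-only power of one M3 Max under load
M3_MAX_VS_4090 = 0.27      # assumed M3 Max GPU throughput relative to a 4090
DIES = 4                   # hypothetical "M3 Extreme" built from 4x M3 Max

extreme_watts = DIES * M3_MAX_GPU_WATTS
extreme_vs_4090 = DIES * M3_MAX_VS_4090   # assumes near-perfect multi-die scaling

print(f"~{extreme_watts} W total GPU power")               # ~180 W
print(f"~{extreme_vs_4090:.2f}x of a 4090's throughput")   # ~1.08x
```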

A hypothetical Ultra-series Apple GPU using higher clocks, faster RAM, and the architectural improvements I mentioned in my previous post could be 50% faster than the 4090. The big question is whether such a design would be possible without sacrificing the efficiency and performance of mobile GPUs, which are Apple's primary market. So far they have been focusing on lower-clocked products and reusing mobile tech for desktop.

Could the Pro one day host a dedicated, high power Apple GPU?

Not really. Fast memory interconnect is the cornerstone of Apple's GPU tech, and dedicated GPUs traditionally have a problem with that. It would be possible to build a very wide GPU interconnect, but that would be an extremely expensive and power-hungry enterprise. Nvidia uses this kind of tech in their data center GPUs, but we are talking about systems that cost over $100k.

What I could see for Apple is using an additional interconnect board to link together two boards (each containing an SoC) and present them as a single device to the system, but that stuff is tricky to do right, and also very expensive. I doubt that the low volume of the Mac Pro can justify the R&D investment.
 

Lounge vibes 05

macrumors 68040
May 30, 2016
3,862
11,117
"at the same power" is great if worried about laptops

When one wants to get big things done with GPUs, usually it's a desktop and the power efficiency is way down the list.
Exactly, and the overlap between people who want the most powerful desktop GPU and people who only use macOS is very, very slim.

When it comes to the Mac, Apple makes the majority of its money off of laptops. So of course, that’s where they’re going to focus.
 

vladi

macrumors 65816
Jan 30, 2010
1,008
617
I don't know anything about video editing, but according to PugetBench (which is a popular content creation benchmark, from what I understand), M3 Max gets around 1600 points in AE while 4090-equipped desktops with top Intel CPUs get around 1400.

Video processing is a workload where latency matters a lot. You can have all the compute in the world, but if your processing unit has to wait until the data arrives, you can't use it effectively.

I've never run PB, but from what I can read, their benchmark breakdown works like this: the Overall Score is weighted 40% RAM Preview, 40% Render, and 20% Tracking.
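So the composite is just a weighted average of the three category scores; a minimal sketch of that combination, with the weights from the breakdown above and made-up sample scores:

```python
# Minimal sketch of the weighted overall score, per the breakdown above:
# 40% RAM Preview, 40% Render, 20% Tracking. Sample scores are invented.

WEIGHTS = {"ram_preview": 0.40, "render": 0.40, "tracking": 0.20}

def overall_score(scores: dict) -> float:
    """Combine per-category scores into a weighted overall score."""
    return sum(WEIGHTS[cat] * scores[cat] for cat in WEIGHTS)

# Hypothetical category scores for two machines:
m3_max = {"ram_preview": 170, "render": 160, "tracking": 140}
pc_4090 = {"ram_preview": 130, "render": 155, "tracking": 150}

print(overall_score(m3_max))   # 160.0
print(overall_score(pc_4090))  # 144.0
```

Which also means a machine that wins the latency-sensitive RAM Preview category can come out ahead overall even if raw render throughput is closer.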
 

MRMSFC

macrumors 6502
Jul 6, 2023
371
381
Regarding Apple's GPU performance, is there any indication of what they could theoretically achieve with their hardware with, say, a 300W power budget? Sure, Nvidia are king of absolute performance, but surely if Apple wanted to build a 300W GPU they could equal or better Nvidia?

Could the Pro one day host a dedicated, high power Apple GPU?
I’m not educated on the topic, but I think that Apple’s GPU isn’t scaling up well. At least, not to the level that would be a world beater.

The M series SoC’s are probably already hitting the limit of die size already (which discrete GPUs don’t have to worry about the rest of the system), and you can only try and glue so many Ultra chips together.

We already know Apple’s gpus punch above their weight, but that doesn’t mean that it’s not already at the point of diminishing returns.

From my perspective, Apple is trying to address the issue through software optimization, rather than throwing raw power at the problem. Which imo is a better strategy.
 
  • Like
Reactions: turbineseaplane

leman

macrumors Core
Oct 14, 2008
19,518
19,669
I’m not educated on the topic, but I think that Apple’s GPU isn’t scaling up well.

What makes you say this? M2 Ultra is 1.8x faster than M2 Max in Blender, that's just 10% less than perfect scaling. The 4090 is around 2.05x faster than the 4070 in Blender, that's 30% less than perfect scaling. Seems to me Apple and Nvidia scale about the same here.
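For reference, that's just observed speedup divided by the ratio of GPU units (76 vs 38 cores for M2 Ultra/Max, 16384 vs 5888 CUDA cores for the 4090/4070); the speedups are the Blender figures above:

```python
# Scaling efficiency = observed speedup / ratio of GPU units.
# Speedups are the Blender figures above; unit counts are published specs.

def scaling_efficiency(speedup: float, units_big: int, units_small: int) -> float:
    return speedup / (units_big / units_small)

apple  = scaling_efficiency(1.80, 76, 38)       # M2 Ultra (76 cores) vs M2 Max (38)
nvidia = scaling_efficiency(2.05, 16384, 5888)  # RTX 4090 vs RTX 4070 CUDA cores

print(f"M2 Ultra vs M2 Max: {apple:.0%} of perfect scaling")   # 90%
print(f"RTX 4090 vs 4070:   {nvidia:.0%} of perfect scaling")  # ~74%
```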

The M series SoC’s are probably already hitting the limit of die size already (which discrete GPUs don’t have to worry about the rest of the system), and you can only try and glue so many Ultra chips together.

True, but there are always ways around it. Now, I agree with you that it’s not clear that Apple wants to take the performance crown, but I don’t really see a technological reason they could not.


We already know Apple’s gpus punch above their weight, but that doesn’t mean that it’s not already at the point of diminishing returns.

The last generation just delivered a 50% improvement in performance on complex workloads (that's without RT). This doesn't sound to me like an architecture that's anywhere close to diminishing returns. And there are still very obvious things they are not doing yet to get extra performance.

From my perspective, Apple is trying to address the issue through software optimization, rather than throwing raw power at the problem. Which imo is a better strategy.

Could you give an example? I mean, M3 GPU improvements are all hardware.
 

falainber

macrumors 68040
Mar 16, 2016
3,539
4,136
Wild West
Sure "When one wants to get big things done with GPUs, usually it's a desktop and the power efficiency is way down the list." But that is just a small subset of all the computing that gets done. Burning watts for marginal gains is a bad thing on all kinds of levels, which is part of why Apple's M series chips are doing so well.
Where exactly are M series chips doing so well? Mac sales are down whilst iPad sales have cratered.
 

falainber

macrumors 68040
Mar 16, 2016
3,539
4,136
Wild West
I don't know anything about video editing, but according to PugetBench (which is a popular content creation benchmark, from what I understand), M3 Max gets around 1600 points in AE while 4090-equipped desktops with top Intel CPUs get around 1400.

Video processing is a workload where latency matters a lot. You can have all the compute in the world, but if your processing unit has to wait until the data arrives, you can't use it effectively.
According to ChatGPT, AE uses the GPU only for some effects, so it's hardly a proper tool for evaluating GPU performance. Here is info from ChatGPT:

Adobe After Effects can use the GPU to render some effects, but it mostly relies on the CPU for overall performance. To enable GPU acceleration, you need to have a supported GPU and driver, and select the Mercury GPU Acceleration option in the project settings. You can also check the GPU information in the preferences menu to see if your GPU is recognized by After Effects.

Depending on your GPU model and driver version, you may need to update or tweak some settings to make it work with After Effects. You can find some helpful tutorials on how to do that in the video results below:

- After Effects - How to Enable GPU Acceleration
- How to change to GPU rendering in After Effects
- How To Enable GPU Acceleration On The Latest Version Of Adobe After Effects CC 2020

I hope this helps you improve your rendering speed and workflow. 😊
 
  • Sad
Reactions: Chuckeee

falainber

macrumors 68040
Mar 16, 2016
3,539
4,136
Wild West
What makes you say this? M2 Ultra is 1.8x faster than M2 Max in Blender, that's just 10% less than perfect scaling. The 4090 is around 2.05x faster than the 4070 in Blender, that's 30% less than perfect scaling. Seems to me Apple and Nvidia scale about the same here.



True, but there are always ways around it. Now, I agree with you that it’s not clear that Apple wants to take the performance crown, but I don’t really see a technological reason they could not.




The last generation just delivered a 50% improvement in performance on complex workloads (that's without RT). This doesn't sound to me like an architecture that's anywhere close to diminishing returns. And there are still very obvious things they are not doing yet to get extra performance.



Could you give an example? I mean, M3 GPU improvements are all hardware.
"The last generation just delivered 50% improvement in performance" ...
Does not the last generation use the new TSMC process (3nm) which allowed Apple to have more transistors on a die? That's not saying much about architecture scalability. They just added more GPU units. Also, arguing that Apple delivers better performance at lower power does not prove much. I am sure there are chips with better performance per watt at, say, 0.1 watts. It's all about the target use case. Apple architecture was developed with phones in mind. No wonder that it has better performance per watt at corresponding power limits while also failing to deliver a solution for workstations and servers (just look at the embarrassment of a workstation that the last Mac Pro is).
 
  • Like
Reactions: gusmula

leman

macrumors Core
Oct 14, 2008
19,518
19,669
"The last generation just delivered 50% improvement in performance" ...
Does not the last generation use the new TSMC process (3nm) which allowed Apple to have more transistors on a die? That's not saying much about architecture scalability. They just added more GPU units. Also, arguing that Apple delivers better performance at lower power does not prove much. I am sure there are chips with better performance per watt at, say, 0.1 watts. It's all about the target use case. Apple architecture was developed with phones in mind. No wonder that it has better performance per watt at corresponding power limits while also failing to deliver a solution for workstations and servers (just look at the embarrassment of a workstation that the last Mac Pro is).


They added two more units (that's a 5% increase) on the Max and removed a unit on the Pro. The frequency did not change, neither did memory bandwidth. Still, performance on complex compute workloads has improved by 50% for the Max and 40% for the Pro - and that's on Blender 3.6, which did not enable Metal hardware RT for the M3 series.
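To spell the arithmetic out: with only ~5% more cores at the same clock, those gains imply roughly a 40-50% improvement per core (core counts below are the published 40 vs 38 for M3/M2 Max and 18 vs 19 for M3/M2 Pro):

```python
# Per-core gain implied by the generational numbers above (same clocks).
# Core counts: M2 Max 38 -> M3 Max 40, M2 Pro 19 -> M3 Pro 18.

def per_core_gain(overall_gain: float, cores_new: int, cores_old: int) -> float:
    return (1 + overall_gain) / (cores_new / cores_old) - 1

print(f"Max: {per_core_gain(0.50, 40, 38):+.0%} per core")  # ~+42%
print(f"Pro: {per_core_gain(0.40, 18, 19):+.0%} per core")  # ~+48%
```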

The transistor budget of 3nm likely went into RT units and larger caches. Regarding it being developed with phones in mind, phones don’t need simultaneous FP and Int execution or complex register shuffles. The roots come from the mobile designs, certainly, but M3 is a huge departure from early M1 which was indeed not much more than a glorified smartphone GPU.
 
  • Like
  • Wow
Reactions: gusmula and heretiq

leman

macrumors Core
Oct 14, 2008
19,518
19,669
According to ChatGPT, AE uses GPU only for some effects so it's hardly a proper tool for evaluating GPU performance.

The discussion was about Intel+Nvidia machines allegedly outperforming Apple in After Effects. Please keep the context in mind.
 
  • Like
Reactions: heretiq

leman

macrumors Core
Oct 14, 2008
19,518
19,669
Where exactly are M series chips doing so well? Mac sales are down whilst iPad sales have cratered.

The entire PC industry is down. But every single Apple Silicon quarter was better than any Intel Mac quarter before it.
 
  • Like
Reactions: heretiq

falainber

macrumors 68040
Mar 16, 2016
3,539
4,136
Wild West
The entire PC industry is down. But every single Apple Silicon quarter was better than any Intel Mac quarter before it.
Mac sales came in at $7.6 billion in Q4 2023, down 34 percent from $11.5 billion in the year-ago quarter. (Source)

According to the latest report from Gartner, global PC shipments totaled 63.3 million units in the fourth quarter of 2023, increasing for the first time after eight consecutive quarters of decline. (Source)

Initially, M-chips did cause an increase in sales, but there were two contributing factors:
  • Covid (all PC sales increased)
  • Pent-up demand. The switch to AS was telegraphed well in advance, so people stopped buying Intel-based Macs.
In the end, Mac market share is where it was before the switch to AS.
 

falainber

macrumors 68040
Mar 16, 2016
3,539
4,136
Wild West
They added two more units (that's a 5% increase) on the Max and removed a unit on the Pro. The frequency did not change, neither did memory bandwidth. Still, performance on complex compute workloads has improved by 50% for the Max and 40% for the Pro - and that's on Blender 3.6, which did not enable Metal hardware RT for the M3 series.

The transistor budget of 3nm likely went into RT units and larger caches. Regarding it being developed with phones in mind, phones don’t need simultaneous FP and Int execution or complex register shuffles. The roots come from the mobile designs, certainly, but M3 is a huge departure from early M1 which was indeed not much more than a glorified smartphone GPU.
How do you envision Apple can scale their GPU to NVIDIA's level? NVIDIA H200 Tensor Core GPUs pack up to 288 gigabytes of HBM3e memory.
 

Pet3rK

macrumors member
May 7, 2023
57
34
Mac sales came in at $7.6 billion in Q4 2023, down 34 percent from $11.5 billion in the year-ago quarter. (Source)

According to the latest report from Gartner, global PC shipments totaled 63.3 million units in the fourth quarter of 2023, increasing for the first time after eight consecutive quarters of decline. (Source)

Initially, M-chips did cause an increase in sales, but there were two contributing factors:
  • Covid (all PC sales increased)
  • Pent-up demand. The switch to AS was telegraphed well in advance, so people stopped buying Intel-based Macs.
In the end, Mac market share is where it was before the switch to AS.
Your own source says this:

“Apple, Inc. AAPL shipped 6,349 units and held the fourth position. AAPL had a market share of 10% in the fourth quarter, up from 9.4% a year ago.”

Its shipments increased.
 
Last edited:
  • Like
Reactions: heretiq

falainber

macrumors 68040
Mar 16, 2016
3,539
4,136
Wild West
There appears to be a mistake in the source link in the above paragraph
Sorry, it appears I lost the "l" in "html" in the link, and for a broken link Yahoo jumps to some other page. The link is:

 
  • Like
Reactions: Chuckeee

falainber

macrumors 68040
Mar 16, 2016
3,539
4,136
Wild West
Why does Apple need to compete with data center equipment?
Well, they would probably like to be able to if they could, but that's obviously not their priority. Still, we were talking about scaling, and I used an extreme example. But AS GPUs can't compete with workstation-class GPUs either.
 
  • Like
Reactions: gusmula

falainber

macrumors 68040
Mar 16, 2016
3,539
4,136
Wild West
You’re own source say this:

“Apple, Inc. AAPL shipped6,349 units and held the fourth position. AAPL had a market share of 10% in the fourth quarter, up from 9.4% a year ago.”

It has an increased shipments.
I do not want to do the math, but it is likely that a year earlier, when Mac sales were 34% higher, their share was higher too. I am just saying that, given the recent sales data, it is difficult to claim unequivocally that AS chips made much of a difference and serve as a meaningful differentiator.
 

dmccloud

macrumors 68040
Sep 7, 2009
3,138
1,899
Anchorage, AK
I do not want to do the math, but it is likely that a year earlier, when Mac sales were 34% higher, their share was higher too. I am just saying that, given the recent sales data, it is difficult to claim unequivocally that AS chips made much of a difference and serve as a meaningful differentiator.

"It is likely" does not equate to proof or even fact, especially in absence of any data related to other manufacturers in the market.
 
  • Like
Reactions: heretiq

Pet3rK

macrumors member
May 7, 2023
57
34
I do not want to do the math, but it is likely that a year earlier, when Mac sales were 34% higher, their share was higher too. I am just saying that, given the recent sales data, it is difficult to claim unequivocally that AS chips made much of a difference and serve as a meaningful differentiator.
Lower revenue but greater shipments is still a valid point. And desktop OS market share, NOT shipment market share, shows the Mac platform is growing.
 
  • Like
Reactions: heretiq

falainber

macrumors 68040
Mar 16, 2016
3,539
4,136
Wild West
"It is likely" does not equate to proof or even fact, especially in absence of any data related to other manufacturers in the market.
Just use logic: if Mac sales were 34% higher a year ago, the only way their market share was not also higher a year ago is if PC sales were 34+% higher a year ago too, and PC sales never have such fluctuations.
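A toy version of that arithmetic (the unit numbers are made up, and it treats the 34% as a units decline even though the figure quoted above is revenue):

```python
# Toy version of the market-share argument; unit numbers are invented, and the
# 34% drop is treated as units purely for the arithmetic (the cited figure is revenue).

pc_last_year, pc_now = 65.0, 63.3   # total PC shipments, millions (assumed)
mac_now = 6.3                       # assumed Mac shipments now, millions
mac_last_year = mac_now * 1.34      # "34% higher a year ago"

print(f"share a year ago: {mac_last_year / pc_last_year:.1%}")  # ~13.0%
print(f"share now:        {mac_now / pc_now:.1%}")              # ~10.0%
```

Unless total PC shipments also fell by roughly a third, the share a year ago would have to have been higher.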
 