
jdb8167

macrumors 601
Nov 17, 2008
4,859
4,599
Truth be told, I'm really tempted by the 16" MBP, even though I have a M1 14".
The 16" M2 Max MacBook Pro has the highest clock speed (3.7 GHz) of any Apple silicon computer. That makes any computer geek's heart go pitter-patter.
 

DeepIn2U

macrumors G5
May 30, 2002
13,051
6,984
Toronto, Ontario, Canada
Yep, I have noticed that as well. The M1 series did well in games, but one always had the impression that in more general workflows the performance was a bit wobbly. In applications like Blender, M1 GPUs performed significantly lower than expected given their compute capability. But according to these new Blender results, the M2 GPUs perform much closer to expectations: e.g. the ~14 TFLOPS M2 Max is not far off the ~15 TFLOPS mobile 3070. It seems Apple has at least partially addressed whatever was holding their GPUs back. The M2 series looks like the first Apple GPU designed specifically for desktop use, and one that performs like a desktop GPU. Now bring on that hardware RT support and Apple Silicon just might become a force to be reckoned with in production rendering.

How can one state "lower than expected on their compute capability" for the M1 GPUs when there is nothing to compare them with in terms of Apple Silicon? What was the actual expectation, considering nothing was available from Apple to compare against? Even if you think of the A-series chips, Blender isn't made for them, so what was the expectation when nothing was available prior to the M1 series chips to begin with?
 

leman

macrumors Core
Oct 14, 2008
19,521
19,674
How can one state "lower than expected on their compute capability" for the M1 GPUs when there is nothing to compare them with in terms of Apple Silicon? What was the actual expectation, considering nothing was available from Apple to compare against? Even if you think of the A-series chips, Blender isn't made for them, so what was the expectation when nothing was available prior to the M1 series chips to begin with?

You compare it to other GPUs with similar compute capabilities? Not sure what the confusion is?

For example, the M1 Max has a maximum theoretical compute throughput of 10 TFLOPS (which has been experimentally verified). But in Blender its performance is very close to that of 5-6 TFLOPS GPUs from other vendors (GTX 1660 Ti, RTX 2060). This can be explained either by a less mature software implementation, by architectural limitations (whatever the nature of that limitation is), or both.

Now, the M2 Max is two times (2x!!!) faster in Blender, while the hardware itself is only 30-40% faster nominally (M1 Max is 10 TFLOPS, M2 Max should be around 13 TFLOPS). Its Blender score is now very close to 15 TFLOPS Nvidia GPUs (like the 3070 mobile), which is a fantastic improvement and shows that the M2 family now offers similar (or better) Blender performance per FLOP to Nvidia GPUs. This pretty much rules out the software factor (both M1 Max and M2 Max were benchmarked using the same software), so it must be due to a hardware change in the M2 family. No idea what this change was, but it allowed the M2 Max to reach the same performance potential in a complex GPU workload as mature desktop GPUs. This is a huge thing for Apple, especially since Apple can get to these performance levels using 2-3x less power.
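To make the arithmetic behind that argument concrete, here's a quick back-of-envelope sketch. The TFLOPS figures are the rough numbers quoted in this post (not measurements), and the Blender throughput is normalized so M1 Max = 1.0:

```python
# Back-of-envelope check: does the 2x Blender speedup exceed the raw FLOPS gain?
# All figures below are the approximate numbers from the post, not benchmarks.

m1_max_tflops = 10.0   # theoretical peak, per the post
m2_max_tflops = 13.0   # post estimates roughly 30-40% over M1 Max

m1_max_blender = 1.0   # normalized Blender throughput
m2_max_blender = 2.0   # "two times faster in Blender"

# Blender work done per theoretical FLOP, for each chip.
m1_eff = m1_max_blender / m1_max_tflops
m2_eff = m2_max_blender / m2_max_tflops

gain = m2_eff / m1_eff
print(f"M2 Max per-FLOP gain over M1 Max: {gain:.2f}x")
# ~1.54x: the speedup is bigger than the FLOPS increase alone can explain,
# which is why the post points at a hardware change rather than software.
```

Since both chips ran the same Blender build, a per-FLOP gain well above 1.0x is exactly the signal that something changed in the hardware's ability to keep its ALUs fed.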
 

DeepIn2U

macrumors G5
May 30, 2002
13,051
6,984
Toronto, Ontario, Canada
You compare it to other GPUs with similar compute capabilities? Not sure what the confusion is?
The confusion, at least for me, came from how I read the original quote:

M1 series did well in games but one always had the impression that in more general workflows the performance was a bit wobbly. In applications like Blender M1 GPUs performed significantly lower than expected on their compute capability
I read the above, and because of the part of the sentence in bold, I thought the expectation was based purely on the M1 series' own compute capability, with no previous Apple silicon chip to serve as a baseline, not that another vendor's chip was the reference. No worries, you've clarified it for me. Thanks.
 

DeepIn2U

macrumors G5
May 30, 2002
13,051
6,984
Toronto, Ontario, Canada
It shows a lot better when you can put the chip in a properly cooled envelope.
In a MacBook Air, you're pretty much at the mercy of manufacturing process improvements.
In a 16", more GPU cores? More transistors? Sure, keep 'em coming.

A "properly cooled envelope" will take on a very different meaning in a few short years, once solid-state cooling is perfected and applied.


^ This tech can easily be applied to laptops, especially to overclocked and gaming mobile CPUs.
 