
magbarn

macrumors 68040
Oct 25, 2008
3,016
2,380
Exactly, and he didn't do the testing on battery only.

I'm just calling BS on the guy who said a game will kill a Mac and a PC in roughly the same time. At max power draw, even an M2 Max pulls way less wattage than an Intel or AMD processor paired with a high-performance mobile GPU. It doesn't matter what you are doing: there's a maximum wattage a system can and will draw, and there's the battery capacity. Their claim that the Apple silicon power savings come mostly from the efficiency cores is also bunk; the performance cores also use massively less power. On top of that, when I benchmarked my M1 Max and M2 Max MBPs doing real-world intensive editing on battery, they performed almost identically to plugged in.
Someone who won't even bother to try AAA gaming on their macbook is claiming that I'm making stuff up?

https://www.reddit.com/r/macgaming/comments/13bdeh8
https://www.reddit.com/r/BaldursGate3/comments/16pgbge Even with a 96 watt brick his macbook was dead in 2 hours

C'mon man!

High-powered gaming can easily draw more power, as it relies on maxing out the CPU along with pure rasterization on the GPU. Many of the productivity apps use the hard-coded Apple Silicon encoders, so even at "100% GPU" utilization they will likely draw less power.
 

sunny5

macrumors 68000
Jun 11, 2021
1,835
1,706
Someone who won't even bother to try AAA gaming on their macbook is claiming that I'm making stuff up?

https://www.reddit.com/r/macgaming/comments/13bdeh8
https://www.reddit.com/r/BaldursGate3/comments/16pgbge Even with a 96 watt brick his macbook was dead in 2 hours

C'mon man!

High-powered gaming can easily draw more power, as it relies on maxing out the CPU along with pure rasterization on the GPU. Many of the productivity apps use the hard-coded Apple Silicon encoders, so even at "100% GPU" utilization they will likely draw less power.
Battery performance is totally meaningless once people start playing games or running CPU/GPU-intensive workloads on a MacBook Pro. Clearly, some people just praise Apple Silicon Macs for no reason.
 

APCX

Suspended
Sep 19, 2023
262
337
Battery performance is totally meaningless once people start playing games or running CPU/GPU-intensive workloads on a MacBook Pro. Clearly, some people just praise Apple Silicon Macs for no reason.
Can you answer the question of who claimed the M2 Ultra = 4090?
 
  • Haha
  • Like
Reactions: Romain_H and sunny5

Adult80HD

macrumors 6502a
Nov 19, 2019
701
837
Someone who won't even bother to try AAA gaming on their macbook is claiming that I'm making stuff up?

https://www.reddit.com/r/macgaming/comments/13bdeh8
https://www.reddit.com/r/BaldursGate3/comments/16pgbge Even with a 96 watt brick his macbook was dead in 2 hours

C'mon man!

High-powered gaming can easily draw more power, as it relies on maxing out the CPU along with pure rasterization on the GPU. Many of the productivity apps use the hard-coded Apple Silicon encoders, so even at "100% GPU" utilization they will likely draw less power.
For starters I never said anywhere I wasn't going to "try AAA gaming." My point is it's irrelevant whether you are playing a game or doing some other task if both things are fully pegging the CPU and GPU. A game doesn't miraculously somehow use more than 100% capacity on those items. Full use of the GPU and CPU is full use, regardless. Is that concept so hard for you to grasp? Very few productivity apps are using hard-coded encoders; that's mostly limited to video.

So far you've shared anecdotes and two Reddit posts. I'll give you actual data. I ran an export of 1,465 Sony A1 raw files (51 megapixels) to JPEG. This task pushes the CPU and GPU very hard--they both spike up to near-continuous full usage. It took my 16" M2 Max (96GB/38-core GPU) 18 minutes and 26 seconds to do the task plugged in and 18 minutes and 29 seconds on battery--letting macOS manage power/battery use. That used about 10% of the battery capacity, so extrapolating, it would have eaten up the whole battery in three hours or so.

In the meantime, the Lenovo (12th Generation Intel® Core™ i7-12800H vPro® processor with 14 cores; 128GB of RAM; Nvidia RTX 3070 Ti 8GB) I mentioned above also worked the CPU and GPU very hard on the same task; all verified in Task Manager. It took 33 minutes and 41 seconds to finish the task plugged in--handily beaten by the M2 Max--and when I put it on battery the same task took 49 minutes and 22 seconds and chewed up 57% of the battery.

The M2 Max has basically a 100 Wh battery vs. the 90 Wh in the Lenovo, so it has about a 10% edge in total capacity, but by basic math the exact same task took roughly 5x more energy to accomplish on the PC. When I tried to turn off the power-savings feature on the Lenovo so that it could run on battery at full speed like the M2 Max can, it killed the battery before the task finished. In fact, it took less than 25 minutes to drain the battery 100%.
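To make that "roughly 5x" figure explicit, here's the back-of-the-envelope energy math behind it, a quick sketch that assumes the nominal battery capacities above rather than measured wall power:

```python
# Rough energy comparison for the export test above, using the nominal
# battery capacities quoted in this post. Back-of-the-envelope only,
# not measured wall-power figures.

mac_capacity_wh = 100      # 16" M2 Max MBP battery, ~100 Wh
pc_capacity_wh = 90        # Lenovo battery, ~90 Wh

mac_energy_wh = 0.10 * mac_capacity_wh   # ~10% of the battery used
pc_energy_wh = 0.57 * pc_capacity_wh     # ~57% of the battery used

print(f"Mac:    ~{mac_energy_wh:.0f} Wh for the export")   # ~10 Wh
print(f"Lenovo: ~{pc_energy_wh:.0f} Wh for the export")    # ~51 Wh
print(f"Ratio:  ~{pc_energy_wh / mac_energy_wh:.1f}x more energy on the PC")
```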
 
Last edited:

JouniS

macrumors 6502a
Nov 22, 2020
638
399
For starters I never said anywhere I wasn't going to "try AAA gaming." My point is it's irrelevant whether you are playing a game or doing some other task if both things are fully pegging the CPU and GPU. A game doesn't miraculously somehow use more than 100% capacity on those items. Full use of the GPU and CPU is full use, regardless. Is that concept so hard for you to grasp? Very few productivity apps are using hard-coded encoders; that's mostly limited to video.
100% load means that the CPU/GPU is busy 100% of the time. But it could be busy waiting. Waiting for more data from a cache or RAM, or waiting for an instruction to finish, because the next instruction needs the result of the current instruction.

Game developers are notorious for caring about micro-optimizations. They may choose their algorithms poorly, but they will make sure that the implementations are efficient. "Efficient" meaning that they do as much work in a unit of time as possible. Workloads like that are far more power-hungry than ones that are usually busy waiting. If you choose game settings so that neither the CPU nor the GPU is a bottleneck and you don't cap the frame rate, an optimized game will absolutely destroy your battery life.

The peak power draw of an M1/M2 Max MBP is roughly 100 W. That means about an hour of battery life if you are taking full advantage of the performance. If your battery life is substantially longer, you are leaving a lot of potential performance on the table.
 

bcortens

macrumors 65816
Aug 16, 2007
1,324
1,796
Canada
To the OP's question: no, the M3 Max will not be as fast as the 4090 Laptop.
On CUDA applications, Nvidia will probably have an even larger edge.
Yeah, I think they are still at least a generation away on compute workloads; we can see from Blender Open Data that, looking at CUDA performance, the 4080 Laptop GPU is about twice as fast as the M2 Max (~3900 vs ~1900 points). I'm doubtful that we will see that gap fully close with the M3 Max, but I'm open to being surprised.
 

ggCloud

Suspended
Oct 29, 2023
34
40
For starters, I never said anywhere that I wasn't going to "try AAA gaming." My point is it's irrelevant whether you are playing a game or doing some other task if both things are fully pegging the CPU and GPU. A game doesn't miraculously somehow use more than 100% capacity on those items. Full use of the GPU and CPU is full use, regardless. Is that concept so hard for you to grasp? Very few productivity apps use hard-coded encoders; that's mostly limited to video.

So far you've shared anecdotes and two Reddit posts. I'll give you actual data. I ran an export of 1,465 Sony A1 raw files (51 megapixels) to JPEG. This task pushes the CPU and GPU very hard--they both spike up to near-continuous full usage. It took my 16" M2 Max (96GB/38-core GPU) 18 minutes and 26 seconds to do the task plugged in and 18 minutes and 29 seconds on battery--letting macOS manage power/battery use. That used about 10% of the battery capacity, so extrapolating, it would have eaten up the whole battery in three hours or so.

In the meantime, the Lenovo (12th Generation Intel® Core™ i7-12800H vPro® processor with 14 cores; 128GB of RAM; Nvidia RTX 3070 Ti 8GB) I mentioned above also worked the CPU and GPU very hard on the same task; all verified in Task Manager. It took 33 minutes and 41 seconds to finish the task plugged in--handily beaten by the M2 Max--and when I put it on battery the same task took 49 minutes and 22 seconds and chewed up 57% of the battery.

The M2 Max has basically a 100 Wh battery vs. the 90 Wh in the Lenovo, so it has about a 10% edge in total capacity, but by basic math the exact same task took roughly 5x more energy to accomplish on the PC. When I tried to turn off the power-savings feature on the Lenovo so that it could run on battery at full speed like the M2 Max can, it killed the battery before the task finished. In fact, it took less than 25 minutes to drain the battery 100%.
You also have to remember that Intel CPUs are not efficient, both because of their design and because they use older process nodes.

Nvidia's 3000 series used Samsung's 8 nm node, which is not as efficient. The M2 Max is on TSMC's 5 nm, so it will last longer.
 

sunny5

macrumors 68000
Jun 11, 2021
1,835
1,706
Yeah, I think they are still at least a generation away on compute workloads; we can see from Blender Open Data that, looking at CUDA performance, the 4080 Laptop GPU is about twice as fast as the M2 Max (~3900 vs ~1900 points). I'm doubtful that we will see that gap fully close with the M3 Max, but I'm open to being surprised.
Almost all 3D software has been CUDA-based for a long time, which is quite obvious.
 

bcortens

macrumors 65816
Aug 16, 2007
1,324
1,796
Canada
Too bad then, as you failed to face the facts. It only confirms that the M2 Ultra is as poor as an RTX 3060 Ti.
And yet in the Blender Open Data benchmark I posted above the M2 Ultra is substantially faster than the 3080 and quite close to the 4070…

If they can get up to NVIDIA XX80 series levels and stay there I would say everyone should be very happy. The 4090 is a ridiculous benchmark as it is a monster that consumes more power than three M(n) Ultras.
 
  • Like
Reactions: MRMSFC and dmr727

jujoje

macrumors regular
May 17, 2009
247
288
Almost all 3D software has been CUDA-based for a long time, which is quite obvious.

You keep saying this and as far as I can tell it's neither obvious nor, in fact, true.

What 3D software are you talking about? It's certainly not Blender, C4D, Houdini, Lightwave, Maya, Max, Modo, ZBrush, etc. As far as I can tell, 'Almost all 3D software has been CUDA-based for a long time' is patently nonsense.
 

sunny5

macrumors 68000
Jun 11, 2021
1,835
1,706
You keep saying this and as far as I can tell it's neither obvious nor, in fact, true.

What 3D software are you talking about? It's certainly not Blender, C4D, Houdini, Lightwave, Maya, Max, Modo, ZBrush, etc. As far as I can tell, 'Almost all 3D software has been CUDA-based for a long time' is patently nonsense.
Ironically, the software you mentioned is still CUDA-based. If you run the tests, CUDA-based GPUs run way faster than Apple Silicon GPUs. Blender is already a great example. You failed to prove it.
 

vigilant

macrumors 6502a
Aug 7, 2007
715
288
Nashville, TN
I wonder if the 40-core M3 Max GPU will be as fast as the 4090 Laptop.
Like everyone else has said, it will depend on the use case.

But for some food for thought: if it's creative or AI work, the M3 Max is probably within a reasonable range of the 4090 running plugged in. Unplugged, the 4090 throttles hard, so I can see the M3 Max being faster in most of those use cases.

If you are looking to game with titles compatible with CrossOver, I think the M3 Max is probably shockingly good, but probably not as good as the 4090.
 

aeronatis

macrumors regular
Sep 9, 2015
198
152
Still runs smoother than my M1 Max MBP even when unplugged. Apple Silicon's power savings are mostly in idle and mundane tasks like browsing, or when you're using the media accelerators. Once you're pushing it while gaming or running something general-purpose like HandBrake CPU-only, the power-savings advantage is significantly reduced.

I have two laptops, one with an RTX 4070 and the other with an RTX 3070 Ti, and both are pretty much useless when unplugged. Shadow of the Tomb Raider runs better at 1600p Highest settings on the M2 Max than on the RTX 3070 Ti; however, as soon as the device is unplugged, even the game menu loses its smoothness.

The Blender Scanlands scene render takes 04:10 on the M2 Max and 01:06 on the 3070 Ti; however, when unplugged it takes 06:58 on the Windows laptop.

Max CPU power draw is around 35 watts on the M2 Max, which is miles ahead of the 12700H and 13700H, as both easily consume more than 100 watts under load. So the power-saving advantage may shrink, yes, but never down to Intel's level.

I was hoping to see that, but now I am more sceptical.



Not in this generation. There is a distinct possibility that we will get an M3 Extreme Mac Pro though.

I remember Apple implying that they were targeting the same performance for the same chip, laptop or desktop. I do not think they will ever make a Mac Studio with a Max chip more powerful than a MacBook Pro 16" with the same Max chip.

Seriously, since when is Apple Silicon as great as the RTX 40 series? I'm freaking tired of the whining from Mac users claiming that the M2 Ultra = RTX 4090 without any proof, and even then it only holds in a few Mac-favoring examples such as GFXBench, where Apple claimed the M1 Ultra = RTX 3090, which is not even close. Video software is an exception because of the dedicated chips.

The M2 Max is competitive with the RTX 3070 Ti Laptop GPU or RTX 4070 Laptop GPU at a TGP of 100 watts. However, the 4080 Laptop GPU and 4090 Laptop GPU are on a much different level. These are comparisons when the Nvidia cards render the scene in Blender using CUDA only; OptiX is of course a different story. For tasks like photo and video editing the M2 Max is a good deal faster than Nvidia laptop cards. One could say the encoders help here; however, that is no different from Nvidia cards having OptiX. Not to mention the MacBook Pro performs all these tasks at the same speed on battery and with exceptionally low fan noise.
 

vigilant

macrumors 6502a
Aug 7, 2007
715
288
Nashville, TN
The M2 Max is competitive with the RTX 3070 Ti Laptop GPU or RTX 4070 Laptop GPU at a TGP of 100 watts. However, the 4080 Laptop GPU and 4090 Laptop GPU are on a much different level. These are comparisons when the Nvidia cards render the scene in Blender using CUDA only; OptiX is of course a different story. For tasks like photo and video editing the M2 Max is a good deal faster than Nvidia laptop cards. One could say the encoders help here; however, that is no different from Nvidia cards having OptiX. Not to mention the MacBook Pro performs all these tasks at the same speed on battery and with exceptionally low fan noise.
I’ve had my 16" MBP M1 Max for a while now (2 years now, I think?), and I don’t think I’ve ever heard the fan come on.

That's not to say that it never does, but it's still impressive.
 

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
Ironically, those software you mentioned are still CUDA based. If you run the test, CUDA based GPU run way faster than Apple Silicon GPU. Blender is already a great example. You failed to prove it.
CUDA is what programmers use to run non-graphics workloads on an Nvidia GPU. Graphics workloads, like the programs @jujoje mentioned, almost never use CUDA in any way at all - they go through APIs like D3D, Vulkan, or OpenGL instead.
 
Last edited by a moderator:

diamond.g

macrumors G4
Mar 20, 2007
11,435
2,658
OBX
I’ve had my 16" MBP M1 Max for a while now (2 years now, I think?), and I don’t think I’ve ever heard the fan come on.

That's not to say that it never does, but it's still impressive.
Yeah, the larger laptop has a better cooling solution than the smaller one. As long as I am not doing anything, the fans stay off on my 14", but if I run DiffusionBee or play BG3 (or Pathfinder Kingdom) the fans can be heard.
 

Adult80HD

macrumors 6502a
Nov 19, 2019
701
837
BTW, I also ran another task that uses the CPU only and no GPU.
100% load means that the CPU/GPU is busy 100% of the time. But it could be busy waiting. Waiting for more data from a cache or RAM, or waiting for an instruction to finish, because the next instruction needs the result of the current instruction.

Game developers are notorious for caring about micro-optimizations. They may choose their algorithms poorly, but they will make sure that the implementations are efficient. "Efficient" meaning that they do as much work in a unit of time as possible. Workloads like that are far more power-hungry than ones that are usually busy waiting. If you choose game settings so that neither the CPU nor the GPU is a bottleneck and you don't cap the frame rate, an optimized game will absolutely destroy your battery life.

The peak power draw of an M1/M2 Max MBP is roughly 100 W. That means about an hour of battery life if you are taking full advantage of the performance. If your battery life is substantially longer, you are leaving a lot of potential performance on the table.
Where are you getting this number from? Everything I can find puts the max power draw on an M2 Max MBP at about 35W. That actually ties out pretty close to what I was saying--when I can get Lightroom pegging the CPU and GPU hard, the battery life would be about three hours based on my timings, and that's pretty much a match for a 33W draw for three hours. You're also making a pretty big assumption that somehow Adobe cares less about maxing the use of GPU; I think that's a pretty big stretch.
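For what it's worth, the ~33-35 W figure also lines up with the export test I posted earlier; here's a quick sanity check, assuming the roughly 100 Wh battery and the ~10% drain over about 18.5 minutes from that run:

```python
# Sanity check of the ~33-35 W figure against the earlier export test.
# Assumes a ~100 Wh battery and ~10% drain over ~18.5 minutes, as
# reported above; rough numbers only.

battery_wh = 100
fraction_used = 0.10
runtime_hours = 18.5 / 60          # 18 min 29 s, rounded

avg_draw_watts = battery_wh * fraction_used / runtime_hours
print(f"Implied average draw: ~{avg_draw_watts:.0f} W")               # ~32 W

# At that draw, a full battery lasts roughly:
print(f"Implied battery life: ~{battery_wh / avg_draw_watts:.1f} h")  # ~3.1 h
```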
 

diamond.g

macrumors G4
Mar 20, 2007
11,435
2,658
OBX
BTW, I also ran another task that uses the CPU only and no GPU.

Where are you getting this number from? Everything I can find puts the max power draw on an M2 Max MBP at about 35W. That actually ties out pretty close to what I was saying--when I can get Lightroom pegging the CPU and GPU hard, the battery life would be about three hours based on my timings, and that's pretty much a match for a 33W draw for three hours. You're also making a pretty big assumption that somehow Adobe cares less about maxing the use of GPU; I think that's a pretty big stretch.
On my base M1 Max the GPU alone can pull 20-30W. I am not home at the moment but I am pretty sure the whole chip should be pulling more than 33W, unless it is throttling.
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
CUDA is what programmers use to run non-graphics workloads on an Nvidia GPU. Graphics workloads, like the programs @jujoje mentioned, almost never use CUDA in any way at all - they go through APIs like D3D, Vulkan, or OpenGL instead.
Offline rendering engines use compute APIs. For instance, if you want to compare the performance between various GPUs in Blender Cycles, you can filter them based on the compute API they use.

 
  • Like
Reactions: komuh

JouniS

macrumors 6502a
Nov 22, 2020
638
399
Where are you getting this number from? Everything I can find puts the max power draw on an M2 Max MBP at about 35W. That actually ties out pretty close to what I was saying--when I can get Lightroom pegging the CPU and GPU hard, the battery life would be about three hours based on my timings, and that's pretty much a match for a 33W draw for three hours. You're also making a pretty big assumption that somehow Adobe cares less about maxing the use of GPU; I think that's a pretty big stretch.
That 35W is more like the power consumption of a base M2 device under full load, or of the CPU only in an M2 Max. Based on a quick web search, here is an article mentioning 89 W package power for the M2 Max and measuring a bit over 70 W for the entire laptop when running Cinebench: https://www.notebookcheck.net/Apple-M2-Max-Processor-Benchmarks-and-Specs.682771.0.html . And here is a report of the M1 Max exceeding 100 W for the entire laptop when running two benchmarks in parallel: https://www.anandtech.com/show/17024/apple-m1-max-performance-review/3 .

It's also just common sense. The larger MBP has always been designed for about 100 W power consumption. Apple would not waste that capacity by offering only low-powered chips.
 

Kronsteen

macrumors member
Nov 18, 2019
76
66
NB: this is just for fun!

No doubt there will be some M3 Max Metal benchmark numbers on Geekbench along soon. Meanwhile ....

Taking the ratio of the early GB6 OpenCL numbers for the M3 Max (40-core) to the M2 Max's OpenCL score, and applying it to the M2 Max's Metal score, suggests an M3 Max Metal figure of somewhere between 150,000 and 160,000. As previously mentioned, the 4090 Laptop's OpenCL number (sadly, there are no GB6 CUDA numbers) is just under 182,000.

A similar projection for the M3 Ultra (I do realise it doesn't exist in the wild yet ...) gives somewhere in the region of 240,000 for Metal, vs. the 4090's OpenCL score of c.322,000.

I make absolutely no claim about the validity or usefulness of these numbers .... 😬. (Even if I haven't made mistakes with the arithmetic, which is entirely possible.) Although GB OpenCL, Metal and Vulkan scores are supposed to be comparable, I have my doubts about that. Some comparisons using real workloads will be much more interesting than GB scores.

Incidentally, based on the early Metal numbers, it's looking like the M2 to M3 GPU performance increase is c.5%. Adjusting for the number of cores, I wouldn't be too surprised if the M2 Max to M3 Max increase (i.e. for a single GPU core) is closer to 3%.

Following up on my post yesterday ....

I see that there are now a few GB6 Metal figures, for the M3 Max (40 core version, I'm sure), averaging around 157,000. That's about a 9% increase over the 38 core M2 Max (or 3% adjusted for the number of cores).

That's probably consistent with the M3 Ultra hitting somewhere in the region of 240,000 (assuming 80 cores).

(That's based on an M2 Ultra 76 core Metal figure of just over 220,000.)
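For anyone who wants to check the arithmetic, here is the projection spelled out as a quick sketch. It uses only the rounded figures quoted above (the ~9% Max-to-Max uplift and the ~220,000 M2 Ultra Metal score), so treat the output as rough speculation rather than measurement:

```python
# Back-of-the-envelope version of the Geekbench 6 projections above.
# All scores are the rounded figures quoted in this thread.

m3_max_metal = 157_000     # early GB6 Metal average, 40-core M3 Max
m2_ultra_metal = 220_000   # 76-core M2 Ultra Metal figure quoted above

# A ~9% uplift over the 38-core M2 Max implies its Metal score was roughly:
m2_max_metal = m3_max_metal / 1.09                               # ~144,000

# Per-core gain once the 38 -> 40 core-count change is factored out:
per_core_gain = (m3_max_metal / 40) / (m2_max_metal / 38) - 1
print(f"Per-core M2 Max -> M3 Max gain: {per_core_gain:.1%}")    # ~3.5%

# Scaling the M2 Ultra by the same generational uplift gives the
# speculative M3 Ultra figure mentioned above:
m3_ultra_projection = m2_ultra_metal * (m3_max_metal / m2_max_metal)
print(f"Projected M3 Ultra Metal: ~{m3_ultra_projection:,.0f}")  # ~240,000
```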
 
  • Like
Reactions: komuh

komuh

macrumors regular
May 13, 2023
126
113
M2 Ultra is about as fast as the 4070 Ti

M2 Ultra is faster than 3080 and a hair away from 4070

M2 Ultra is faster than 3080 Ti Metal vs Vulkan


We can accept that different benchmarks show different levels of performance - the M2 Ultra is fast, very fast. Software optimizations are going to likely limit the M2 Ultra to 4070 Ti performance at maximum but that is actually quite an achievement considering the power consumption of the M2 Ultra.
Are these Blender results legit? I can't see a world where the M1 Ultra is 2x slower than the M2 Ultra and the 3090 is slower than the 4080 laptop version. (Or is it just the CPU as the limiting factor, since older systems with a 3090 will probably have a much slower CPU and memory, so the GPU won't be fully utilized there?)

Also, this seems like a search with a higher number of samples -> https://opendata.blender.org/benchm...group_by=device_name&group_by=blender_version and here the top M2 Ultra is close to the RTX 4060, so still pretty OK.

Also, Vulkan gives pretty much the worst scores for NV GPUs, and OpenCL is still not perfect (not a native API); it's around 200,000 for the 3080 Ti, so the same as the average M2 Ultra score. https://browser.geekbench.com/metal-benchmarks

I think the GPU is getting pretty good in the M-series, but it's still nowhere near NV, especially in desktop form.
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,516
19,664
Are these Blender results legit? I can't see a world where the M1 Ultra is 2x slower than the M2 Ultra and the 3090 is slower than the 4080 laptop version. (Or is it just the CPU as the limiting factor, since older systems with a 3090 will probably have a much slower CPU and memory, so the GPU won't be fully utilized there?)

I think they are legit. M2 Pro/Max have had some internal changes to address scaling bottlenecks, and improvements are consistent between the different SKUs (interestingly base M2 behaves like the M1 family). The Nvidia 4x series introduced the parallel FP pipeline, effectively doubling peak FP FLOPS, along with significant improvements to ray tracing. So I don’t find the results surprising or weird. They are consistent across the family and GPU clocks.
 
  • Like
Reactions: bcortens