
mactinkerlover

macrumors regular
Original poster
Sep 20, 2020
It seems like the findings over the past several months have been that the M1 Pro/Max have slower raw CPU performance than the latest Intel Alder Lake chips. This is interesting because the standard M1 outperformed basically everything in its class, while these new MacBooks don't. I have a couple of theories on this.

First of all, the M1 Pro/Max CPU only consumes 30 watts, while the Intel chips consume 45+ watts, sometimes much higher. Whereas with the M1, the CPU consumes about 15 watts, which is in line with the rest of the CPUs in that class. So that might play a part in it. The M1 Pro/Max is much, much more efficient than the Intel architecture.

Second of all, we have to remember that the M1 architecture is technically a year and a half old now, and the Alder Lake processors just came out a few months ago. So technically, if you think about it, Apple was still ahead. It's just that Apple has not followed up with the M2 yet; if that's based on A16 cores with big performance jumps, then the performance will probably be ahead of Intel again. This is almost in line with the smartphone industry, where Apple always runs about two years ahead in terms of performance, but the competitors have to consume more power to achieve that performance.

Alright, so those are just my theories. If you all have any thoughts, feel free to respond.
 
You know what? I am absolutely fine with this. The M1 chips are still insanely fast for the power they require. I can do some video editing, light gaming, and all the productivity I want in a machine that barely makes any noise or even gets that hot. I'll take that trade. I have my desktop for the huge loads.
 
It seems like the findings over the past several months have been that the M1 Pro/Max have slower raw CPU performance than the latest Intel Alder Lake chips. This is interesting because the standard M1 outperformed basically everything in its class, while these new MacBooks don't. I have a couple of theories on this.

First of all, the M1 Pro/Max CPU only consumes 30 watts, while the Intel chips consume 45+ watts, sometimes much higher. Whereas with the M1, the CPU consumes about 15 watts, which is in line with the rest of the CPUs in that class. So that might play a part in it. The M1 Pro/Max is much, much more efficient than the Intel architecture.

Second of all, we have to remember that the M1 architecture is technically a year and a half old now, and the Alder Lake processors just came out a few months ago. So technically, if you think about it, Apple was still ahead. It's just that Apple has not followed up with the M2 yet; if that's based on A16 cores with big performance jumps, then the performance will probably be ahead of Intel again. This is almost in line with the smartphone industry, where Apple always runs about two years ahead in terms of performance, but the competitors have to consume more power to achieve that performance.

Alright, so those are just my theories. If you all have any thoughts, feel free to respond.
M1 release date: November 2020
Alder Lake release date: November 2021

“WHY WASTE TIME SAY LOT WORD WHEN FEW WORD DO TRICK”

-Kevin Malone

 
Using Geekbench, the i9-12900H (Intel's fastest non-overclockable mobile Alder Lake chip) seems to be about 5% faster single-threaded, and 15% faster multi-threaded, than the M1 Pro/Max. So call it roughly 10%. And note that's only if the Intel is plugged in; on battery the M1 is probably faster. Assuming the M2 will be about 10% faster than the M1 (https://www.macworld.com/article/614713/m2-chip-cpu-gpu-speed-performance.html), the M2 Pro/Max CPU should approximately equal Alder Lake. Though the M2 Pro/Max might not be out until 2023, by which time there will be a faster Intel chip.

On the other hand, the Intel CPU uses a lot more power. I don't have figures specifically for the CPUs but, according to Notebookcheck, max wall draw (pushing both CPU and GPU, and with the screen at max brightness) on a 16.2" M1 Max MBP (10-core CPU, 32-core GPU, 120 Hz 3456x2234 LCD, 32 GB RAM, 1 TB SSD) is 135 W. By comparison, on a 17.3" MSI Raider GE76 (i9-12900HK, not overclocked, NVIDIA RTX 3080 Ti, 120 Hz 3840x2160 LCD, 32 GB RAM, 1 TB SSD) it's 272 W (twice as much).

[The 135 W and 272 W figures are close to the 140 W and 280 W figures for their respective power adapters.]

So, in summary, at a cost of about double the power consumption, Intel may be able to stay about 10% ahead of Apple in top-end mobile CPU performance.
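Putting that trade-off into numbers (a rough sketch: the ~10% and the wall-draw figures are the ones quoted in this post; the rest is napkin math):

```python
# Rough perf-vs-power comparison using the figures above.
# Illustrative only -- wall draw isn't the same as CPU package power.

m1_max_wall_w = 135        # peak wall draw, 16.2" M1 Max MBP (Notebookcheck)
intel_wall_w = 272         # peak wall draw, 17.3" MSI Raider GE76 (i9-12900HK)
intel_speedup = 1.10       # ~10% faster CPU, rough Geekbench average

power_ratio = intel_wall_w / m1_max_wall_w
perf_per_watt_ratio = intel_speedup / power_ratio

print(f"Intel power draw: {power_ratio:.1f}x the M1 Max's")      # 2.0x
print(f"Intel perf/watt:  {perf_per_watt_ratio:.2f}x the M1's")  # 0.55x
```

These are whole-machine wall figures, so they overstate the CPU gap somewhat, but the ~2x power for ~1.1x performance shape is the point.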
 
To add to excellent @theorist9 post above, there are three aspects to ADL performance. First, it features redesigned performance cores which are capable of higher peak performance, second, it adds up to eight throughput cores (hence much improved performance in parallel benchmarks) and third, it uses a very aggressive dynamic overclocking profile. In single core, ADL can be up to 20-25% faster than M1, but it achieves it at the expense of much increased power consumption (25-30W compared to M1's 5W). Similar can be said about multi-core. There were some claims that ADL offers the same efficiency as M1, that is, that they have similar performance at the same power consumption level, but I am very sceptical about the methodology used to produce these claims.
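To make the single-core efficiency point concrete (midpoints of the ranges above, purely illustrative):

```python
# Single-core perf/watt sketch from the figures above: ADL up to ~20-25%
# faster than M1, at ~25-30 W vs the M1's ~5 W per performance core.
# Midpoints are used; real numbers vary by workload and chip bin.

adl_speedup = 1.225        # midpoint of "20-25% faster"
adl_power_w = 27.5         # midpoint of 25-30 W
m1_power_w = 5.0

efficiency_ratio = adl_speedup / (adl_power_w / m1_power_w)
print(f"ADL single-core perf/watt is about {efficiency_ratio:.2f}x the M1's")
```

So roughly 5x the power for ~1.2x the performance, which is what "trades power usage for more performance" means in practice.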

On the desktop, yes, no question, ADL is better (at the cost of 3-5x difference in power usage), unless of course you compare it to the Ultra. On the mobile, some premium ADL models will be marginally faster in bursty scenarios than M1 Pro/Max, but there won't be much difference in sustained throughput in practice and the M1 series is still much more efficient. And of course, M1 laptops don't throttle on battery. Overall, I feel like the technological prowess of the ADL series tends to be a bit overestimated. It's a CPU family that trades power usage for more performance. That's about it.

P.S. Note that most ADL superiority claims stem from Cinebench, a benchmark that maximally favours Intel. It scales extremely well with Intel's SMT implementation and the throughput cores, while being hand-optimised for x86 (and specifically, Intel) CPUs. Things are rather different in real software.
 
It seems likely to me that with the M2 Apple will take the performance crown back; it's been the combination of a long cycle between the M1 and M2 and a strong showing by Intel that has allowed Intel to take it for now.
 
You're comparing a CPU to an SoC. And a 2020 release to a 2021 release. So many differences, they are hard to compare. You seem to compare CPU core speed, but what about comparing something like the M1's NPU performance to Alder Lake's? I hear the M1's is pretty good and Alder Lake doesn't even have one.

Every processor architecture is stuck in a point in time and future releases from other companies will outperform it somehow. Whether in speed or power or other capability.
 
You're comparing a CPU to an SoC. And a 2020 release to a 2021 release. So many differences, they are hard to compare. You seem to compare CPU core speed, but what about comparing something like the M1's NPU performance to Alder Lake's? I hear Apple's is pretty good and Alder Lake doesn't even have one.

Intel has been making SoCs since 2012, and Alder Lake is not an exception. It contains a CPU, a GPU, cache, and memory controllers, and uses unified memory. So I'm not quite sure what you mean. Of course, Apple chooses to integrate a bigger GPU for some of its products, but that's just a business decision.

Regarding the NPU, while Intel does not have a dedicated ML coprocessor, they do offer SIMD instructions specialised for ML acceleration. Haven't seen any benchmarks though. Apple's solution is likely faster and certainly much, much more energy efficient. It still doesn't help you with general-purpose computation.
 
Also, given that Alder Lake mobile and the M1 are pretty close on a synthetic benchmark like Geekbench, the main performance differentiator will likely be how well-optimized your software is to run on Intel vs. AS (Apple Silicon). I.e., you'd need to take the software you use, and compare it on the two.

To give two opposing examples, Final Cut Pro is vastly faster on AS than Intel, because Apple codesigned the software and hardware. On the other hand, Mathematica remains significantly faster on Intel than AS (WolframMark benchmark scores of 4.5 on a 2019 Core i9 iMac vs. 3.2 on an M1, which is a difference of 34% when using the mean as the denominator). [Indeed, my 2014 Intel MBP manages 3.0, which is within 6% of the M1.] The full reason for this is unclear, but it seems it's in part because the Intel MKL (Math Kernel Library) is so highly optimized that whatever is available to replace it on AS isn't as fast.
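For anyone wondering how I got 34% and 6%, here's the calculation (percent difference with the mean as the denominator; the scores are the ones quoted above):

```python
# Percent difference between two benchmark scores, using their mean as
# the denominator (a symmetric alternative to picking one score as base).

def pct_diff_by_mean(a: float, b: float) -> float:
    return abs(a - b) / ((a + b) / 2) * 100

print(f"{pct_diff_by_mean(4.5, 3.2):.0f}%")  # 2019 i9 iMac vs. M1 -> 34%
print(f"{pct_diff_by_mean(3.2, 3.0):.0f}%")  # M1 vs. 2014 Intel MBP -> 6%
```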
 
Can we please have a forum rule that when you claim something you have to follow that claim up with sources? Not for this thread in particular, but in general.
You answered your own question though, didn’t you?
The Intel chips are faster because, and to the extent that, they draw more power. Easy. Nothing we didn't expect or should be surprised by.
 
Using Geekbench, the i9-12900H (Intel's fastest non-overclockable mobile Alder Lake chip) seems to be about 5% faster single-threaded, and 15% faster multi-threaded, than the M1 Pro/Max. So call it roughly 10%. And note that's only if the Intel is plugged in; on battery the M1 is probably faster. Assuming the M2 will be about 10% faster than the M1 (https://www.macworld.com/article/614713/m2-chip-cpu-gpu-speed-performance.html), the M2 Pro/Max CPU should approximately equal Alder Lake. Though the M2 Pro/Max might not be out until 2023, by which time there will be a faster Intel chip.

On the other hand, the Intel CPU uses a lot more power. I don't have figures specifically for the CPUs but, according to Notebookcheck, max wall draw (pushing both CPU and GPU) on a 16.2" M1 Max MBP (10-core CPU, 32-core GPU, 120 Hz 3456x2234 LCD, 32 GB RAM, 1 TB SSD) is 135 W. By comparison, on a 17.3" MSI Raider GE76 (i9-12900HK, not overclocked, NVIDIA RTX 3080 Ti, 120 Hz 3840x2160 LCD, 32 GB RAM, 1 TB SSD) it's 172 W (twice as much).

So, in summary, at a cost of about double the power consumption, Intel may be able to stay about 10% ahead of Apple in top-end mobile CPU performance.
Follow up question:
You said the Apple consumes 135W and the Intel 172W, how is that double?
I don’t understand energy stuff so, yeah, I don’t understand it.
The rest of your comment I have absolutely no issue with, though. Numbers, reasonable estimates and a link, all one can dream of in this forum :D
 
Follow up question:
You said the Apple consumes 135W and the Intel 172W, how is that double?
I don’t understand energy stuff so, yeah, I don’t understand it.
The rest of your comment I have absolutely no issue with, though. Numbers, reasonable estimates and a link, all one can dream of in this forum :D
Thanks for catching that typo--it's 272 watts for the MSI, not 172! Ugh! I've corrected that in my original post.

And thanks for the kind words—I do try to support my posts, as much as I can, with references.
 
On other hand, the Intel CPU uses a lot more power. I don't have figures specifically for the CPUs but, according to Notebookcheck, max wall draw (pushing both CPU and GPU) on a 16.2" M1 Max MBP, 10-core CPU, 32-core GPU, 120 HZ 3456 X 2234 LCD, 32 GB RAM, 1 TB SSD is 135 W.

There is no way an M1 Max laptop consumes 135 W. They probably measured the power draw of the fast-charging port. The M1 Max SoC itself is around 80 W at its absolute peak, usually much lower. Maybe you can get to 100 W running a stress test with the display in HDR mode, but that's about it. Running demanding games, my 16" uses less than 60 W.
 
There is no way an M1 Max laptop consumes 135 W. They probably measured the power draw of the fast-charging port. The M1 Max SoC itself is around 80 W at its absolute peak, usually much lower. Maybe you can get to 100 W running a stress test with the display in HDR mode, but that's about it. Running demanding games, my 16" uses less than 60 W.
Here's their methodology. It wouldn't make sense for them to measure the fast-charging port, because they were measuring power consumption due to operation, not power consumption due to charging. From the pic, it looks like they plug the laptop into the multimeter's plug adapter, which is then plugged into the wall:
[attached image: Notebookcheck's power-measurement setup]

And here are their findings. Note that the 135 W is peak consumption, not maximum sustained consumption, which was 115 W under their maximum load stress test (fluctuating between 90 W and 135 W):


[attached images: Notebookcheck power-consumption measurements]
 
Here's their methodology. It wouldn't make sense for them to measure the fast-charging port, because they were measuring power consumption due to operation, not power consumption due to charging. From the pic, it looks like they plug the laptop into the multimeter's plug adapter, which is then plugged into the wall:
And here are their findings:

The problem with this is that they cannot distinguish between charging and the actual power draw of the hardware. Even if they do the tests on a full battery, there will be some fluctuation in power consumption due to battery charging, which can result in short power peaks. For example, look at the load maximum for the 16" Pro vs. Max. The additional 16 cores of the GPU only draw 20 watts at most; that cannot realistically account for the whopping 45 W difference in peak power draw. I wonder if they used the same power adapter for both machines.

Anyway, as they themselves point out, the peak Max package power they have measured is around 90W, which is what should be the relevant number here. The averages are probably more representative than peak numbers anyway. Another interesting observation is that M1 Max package only draws 60W in Witcher 3 (which is in line with my own experience). For an Intel laptop, just the CPU alone will be 30-40 watts, GPU another 60 at least.
 
The problem with this is that they cannot distinguish between charging and the actual power draw of the hardware. Even if they do the tests on a full battery, there will be some fluctuation in power consumption due to battery charging, which can result in short power peaks. For example, look at the load maximum for the 16" Pro vs. Max. The additional 16 cores of the GPU only draw 20 watts at most; that cannot realistically account for the whopping 45 W difference in peak power draw. I wonder if they used the same power adapter for both machines.

Anyway, as they themselves point out, the peak Max package power they have measured is around 90W, which is what should be the relevant number here. The averages are probably more representative than peak numbers anyway. Another interesting observation is that M1 Max package only draws 60W in Witcher 3 (which is in line with my own experience). For an Intel laptop, just the CPU alone will be 30-40 watts, GPU another 60 at least.
If the charging were a significant contributor, then they wouldn't measure only 5.4 W at idle minimum--there would be bigger peaks. Also, what's the effect of High Power Mode, which is only available on the M1 Max? https://support.apple.com/en-us/HT212852
 
If the charging were a significant contributor, then they wouldn't measure only 5.4 W at idle minimum--there would be bigger peaks.

Not necessarily. Suppose you are testing this at 100% battery, and suppose that the laptop uses a hybrid power delivery method, where power from both the battery and the power adapter is combined. Under load, much more power will be needed and thus more power will be taken from the battery. If that's the method Apple uses, I can imagine periodic peaks of activity in the charger to replenish the battery a few percent at a time. In contrast, at idle the battery discharge will be much, much lower and won't require this kind of charging.

Of course, all of this is merely speculation. I have no idea how Apple's batteries and power management work. I just want to point out that these things can potentially be fairly complicated. And we do know that Apple's battery stuff can be quite sophisticated. It was much easier to test these things back in the day when one could remove the battery altogether :) Of course, most laptops would heavily throttle their performance to prevent spikes...

Also, what's the effect of High Power Mode, which is only available on the M1 Max? https://support.apple.com/en-us/HT212852

Good question :) Could make a difference, don't know.
 
Not necessarily. Suppose you are testing this at 100% battery, and suppose that the laptop uses a hybrid power delivery method, where power from both the battery and the power adapter is combined. Under load, much more power will be needed and thus more power will be taken from the battery. If that's the method Apple uses, I can imagine periodic peaks of activity in the charger to replenish the battery a few percent at a time. In contrast, at idle the battery discharge will be much, much lower and won't require this kind of charging.

Of course, all of this is merely speculation. I have no idea how Apple's batteries and power management work. I just want to point out that these things can potentially be fairly complicated. And we do know that Apple's battery stuff can be quite sophisticated. It was much easier to test these things back in the day when one could remove the battery altogether :) Of course, most laptops would heavily throttle their performance to prevent spikes...
I think it would be the opposite. If your peak power draw is so large that the 140W PS can't handle it, and you need to take from the battery, that would seem to argue that peak power usage is actually higher than the peaks you're reading from the wall outlet. You'd be limited to 140 W from the wall outlet by the PS, so when you went over that you'd draw from the battery, and then replenish the battery when you came off that peak. But you'd still not be able to exceed the peak power of the PS at the outlet. I.e., I would think the effect of hybrid power delivery would be that the actual peak power usage could be *larger* than the peak power draw measured at the wall outlet....because what you're offering to the computer is both the power from the wall, and the power from the battery.
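Here's the idea as a toy model (the 140 W adapter rating is Apple's spec for the 16" MBP; the demand numbers are invented for illustration):

```python
# Toy model of hybrid power delivery: wall draw is capped at the adapter
# rating, and any demand beyond that is supplied by the battery (which is
# replenished later). Demand figures below are made up for illustration.

ADAPTER_W = 140  # 16" MBP power adapter rating

def split_power(demand_w):
    """Return (wall_w, battery_w) for a given instantaneous demand."""
    wall = min(demand_w, ADAPTER_W)
    battery = max(0, demand_w - ADAPTER_W)
    return wall, battery

for demand in (60, 135, 180):
    wall, battery = split_power(demand)
    print(f"demand {demand:3d} W -> wall {wall:3d} W + battery {battery:2d} W")
```

A wall meter would read 140 W during the 180 W burst, so the meter underreports the true peak usage, which is exactly the argument above.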
 
I think it would be the opposite. If your peak power draw is so large that the 140W PS can't handle it, and you need to take from the battery, that would seem to argue that peak power usage is actually higher than the peaks you're reading from the wall outlet. You'd be limited to 140 W from the wall outlet by the PS, so when you went over that you'd draw from the battery, and then replenish the battery when you came off that peak. But you'd still not be able to exceed the peak power of the PS at the outlet. I.e., I would think the effect of hybrid power delivery would be that the actual peak power usage could be *larger* than the peak power draw measured at the wall outlet....because what you're offering to the computer is both the power from the wall, and the power from the battery.
Those numbers look dodgy, and it just seems they measured the 135 W from the 140 W power adaptor. For the M1 Pro, they measured 90 W from the 96 W power adaptor.
 
I think it would be the opposite. If your peak power draw is so large that the 140W PS can't handle it, and you need to take from the battery, that would seem to argue that peak power usage is actually higher than the peaks you're reading from the wall outlet. You'd be limited to 140 W from the wall outlet by the PS, so when you went over that you'd draw from the battery, and then replenish the battery when you came off that peak. But you'd still not be able to exceed the peak power of the PS at the outlet. I.e., I would think the effect of hybrid power delivery would be that the actual peak power usage could be *larger* than the peak power draw measured at the wall outlet....because what you're offering to the computer is both the power from the wall, and the power from the battery.

Ah, sorry, I should have been more precise. What I mean is that even if you are plugged in, the system will still use the battery. There is some sort of complicated dance happening, and I don't know the details, but it's not as straightforward as "power adapter in, battery out". The battery is discharging even while you are plugged in. Again, I have no idea how these things work, but I can imagine that under heavy load, the power system would do quick bursts of charging to fill up the battery again. E.g., imagine that the laptop is combining the battery and the AC power to feed the system, and every time the battery loses 2%, there is a power spike to replenish it. Something like that.

The published numbers simply look weird to me. First, as @thunng8 also mentions, they are suspiciously close to the charger limits. Second, M1 chips are fairly well behaved in terms of power consumption; we know what to expect there. If the package uses at most 80 watts... I have a hard time believing that the display and the SSD would use another 60 watts. That's a bit excessive.
 
Those numbers look dodgy, and it just seems they measured the 135 W from the 140 W power adaptor. For the M1 Pro, they measured 90 W from the 96 W power adaptor.
The published numbers simply look weird to me. First, as @thunng8 also mentions, they are suspiciously close to the charger limits.
If it were simply a case of them effectively measuring the maximum draw of the adapter, then we wouldn't be seeing numbers climb from a 5.4 W – 16.7 W range (16.4 W avg) at idle to a 90 W – 135 W (115 W avg) range under maximum load.

Their data is poorly presented -- what's listed in the table is confusing; their "load maximum" is the peak for their maximum load, while their "load average" seems to be the average for their medium load. At the same time, measuring laptops is notebookcheck's bread and butter, so you'd think they'd know the difference between measuring the computer vs. measuring the capacity of the adapter.
Second, M1 chips are fairly well behaved in terms of power consumption, we know what to expect there. If the package uses maximally 80 watts... I have hard time believing that the display and the SSD would use another 60 watts. That's a bit excessive.
Where are you getting your 80W figure? Anandtech reported package power at 96 W (and wall at 120 W), and that's without maximally stressing the GPU ("On the GPU side, we lack notable workloads, but GFXBench Aztec High Offscreen ends up with...") plus they didn't specify display brightness. See: https://www.anandtech.com/show/17024/apple-m1-max-performance-review/3

There is a simple way to eliminate the battery complication: look at measurements for the M1 Max in the Mac Studio, then add the power consumption of the screen. For this we can look at Apple's own data for max power consumption: 115 W. Could a display at max brightness add 20 W?

[attached image: Apple's power-consumption spec for the Mac Studio]
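As a rough consistency check (Apple's 115 W Studio figure is from the spec above; the display and adapter-efficiency numbers are my guesses, not measurements):

```python
# Sanity check on the 135 W wall reading: Apple's 115 W max for the M1 Max
# Mac Studio (no display), plus a guessed display draw and guessed AC
# adapter conversion losses. The last two numbers are assumptions.

studio_max_w = 115          # Apple's spec for the M1 Max Mac Studio
display_w = 15              # guess: laptop panel at high brightness
adapter_efficiency = 0.92   # guess: typical AC adapter efficiency

wall_estimate = (studio_max_w + display_w) / adapter_efficiency
print(f"estimated wall draw: {wall_estimate:.0f} W")  # ~141 W, near the 135 W reading
```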
 
It seems like the findings over the past several months have been that the M1 Pro/Max have slower raw CPU performance than the latest Intel Alder Lake chips. This is interesting because the standard M1 outperformed basically everything in its class, while these new MacBooks don't. I have a couple of theories on this.

First of all, the M1 Pro/Max CPU only consumes 30 watts, while the Intel chips consume 45+ watts, sometimes much higher. Whereas with the M1, the CPU consumes about 15 watts, which is in line with the rest of the CPUs in that class. So that might play a part in it. The M1 Pro/Max is much, much more efficient than the Intel architecture.

Second of all, we have to remember that the M1 architecture is technically a year and a half old now, and the Alder Lake processors just came out a few months ago. So technically, if you think about it, Apple was still ahead. It's just that Apple has not followed up with the M2 yet; if that's based on A16 cores with big performance jumps, then the performance will probably be ahead of Intel again. This is almost in line with the smartphone industry, where Apple always runs about two years ahead in terms of performance, but the competitors have to consume more power to achieve that performance.

Alright, so those are just my theories. If you all have any thoughts, feel free to respond.

That is not true. AMD was already faster in multicore than the M1 at release, with a TDP of 35 W. The Razer 14" with the RTX 3060 beat the 13" M1 MBP for the same price back then.
 
I don't really understand why anyone cares that Alder Lake is 10% faster than the M1 Max. I am no expert, but I am willing to guess that these differences could be a mute point when considering software optimization, which is probably over 10% better on MacOS. Apple software is usually much better optimized than Windows software, and the OS is better optimized as well.

It is like some kind of pissing contest or appendage-length contest, when the CPU is just one component of the user experience. Go ahead and enjoy your Alder Lake system on Windows, and I will enjoy my MBP with one of the best screens I have ever seen on a laptop, with equally good speakers. A great chassis design that most Windows laptops can only dream of. One of the largest batteries available in a laptop, with an equally good performance-to-battery-life ratio. Then there is MacOS, which I prefer to Windows. I like Windows too. In fact, I run almost every OS available. I have a Chromebook with a Ryzen 7 and 16 GB RAM, a Surface Laptop 4 with AMD that I absolutely love and got super cheap, I run Linux on an older HP laptop, and I have a 16" MBP with the M1 Max.

It is fun to use multiple hardware and software configs, and if you are always chasing the fastest and best hardware, you are always going to be playing a losing game. Hardware changes fast, and in three years what was fastest today will be outclassed by middle-of-the-road newer hardware. So stop with the "M1 beaten by XYZ" stuff. These are the fastest and best chips ever in a MacBook, and they embarrass the Intel hardware they replace. They are not always going to be the fastest in the world, but they are the fastest you can get in a Mac, and they are seriously great machines.

I will not be able to afford to keep buying the latest and greatest, and I have a feeling my MBP will last a long time; I will be happy with its performance for a long time to come, not just because of the CPU performance but because of all the factors I mentioned. Can we just enjoy what we got, and enjoy comparing other hardware and platforms without it always being a competition in ONE metric??
 
I am no expert but I am willing to guess that these differences could be a mute [sic] point when considering software optimization which is probably over 10% better on MacOS.
Agreed. I made that very point a few posts above
Apple software is usually much better optimized than Windows software and the OS is better optimized as well.
Doesn't matter how well-optimized Apple apps are unless you're actually using them, and most people spend most of their time using third-party apps. And, as I mentioned in my post above, when it comes to those, sometimes they're better optimized for MacOS, and sometimes for Windows.
I don't really understand why anyone cares that Alder Lake is 10% faster than the M1 Max?... It is like some kind of pissing contest or appendage-length contest, when the CPU is just one component of the user experience. Go ahead and enjoy your Alder Lake system on Windows, and I will enjoy my MBP with one of the best screens I have ever seen on a laptop, with equally good speakers... Can we just enjoy what we got and enjoy comparing other hardware and platforms without it always being a competition in ONE metric??
Dude, cool your jets. First off, no one is complaining about the difference, or arguing in favor of using Alder Lake because of the difference. Second, we've been looking at more than one metric, including CPU processing, GPU processing, power consumption, and software optimization.

Most importantly, and what you don't get, is simply that it's fun for us to consider and discuss the performance differences of the competing top-end mobile devices (the way car people might have fun comparing 0-60 times). If you don't find that interesting, fine. But why piss in someone else's cereal because of that? I'm sure there are things you find interesting that would be of absolutely no interest to me.

FWIW, I don't think ultimate speed is the most important metric. Usability, including an intuitively-designed UI that enables me to get my work done with minimum friction, is much more important to me, which is why I'm going to stick with MacOS regardless. But I still think speed is fun to talk about.
 