
senttoschool

macrumors 68030
Nov 2, 2017
2,626
5,482
Everyone's needs are different, to be sure, but I have felt that my 2012 rMBP was most certainly a capable and fast laptop, to the point that it replaced my desktop. Countless threads were created about the MBP being a sufficient desktop replacement, and many people chimed in to say it had been for them.

Is the M1 even more capable? Yes, but that doesn't mean the prior generations were not desktop replacements. I get that some people need the power of a Mac Pro or iMac Pro, and the MBP didn't have the raw horsepower to go toe to toe with those classes of machines, whereas the M1 variants can. But for most desktop users, the MBPs of yesteryear were very capable, IMO.
I'm pretty sure he meant that the M1 Max performs virtually the same in a MacBook Pro and a desktop Mac in terms of CPU, GPU, and RAM. Hence, it's a proper desktop replacement.

Mobile Intel and AMD/Nvidia chips always throttled or were a shell of their desktop counterparts.

So I agree with @vladi
 

falainber

macrumors 68040
Mar 16, 2016
3,539
4,136
Wild West
I'm pretty sure he meant that the M1 Max performs virtually the same in a MacBook Pro and a desktop Mac in terms of CPU, GPU, and RAM. Hence, it's a proper desktop replacement.

Mobile Intel and AMD/Nvidia chips always throttled or were a shell of their desktop counterparts.

So I agree with @vladi
Intel, AMD and Nvidia produce different chips for mobile and desktop computers which makes perfect sense. Apple using the same chips on both platforms makes very little sense from a technical perspective. They are just being scroogy.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,674
Intel, AMD and Nvidia produce different chips for mobile and desktop computers which makes perfect sense. Apple using the same chips on both platforms makes very little sense from a technical perspective. They are just being scroogy.

AMD's desktop and laptop CPUs are physically different, yes. Intel's mobile and desktop CPUs use the same chips, just packaged differently.

Apple is not just using the same chip; they use the same configuration across desktop and mobile. That's why they are being criticized. But to say that they are scroogy is a bit too much IMO. They definitely don't save on the chips themselves, unlike other companies. Pretty much everything about Apple Silicon is more expensive.
 

falainber

macrumors 68040
Mar 16, 2016
3,539
4,136
Wild West
AMD's desktop and laptop CPUs are physically different, yes. Intel's mobile and desktop CPUs use the same chips, just packaged differently.

Apple is not just using the same chip; they use the same configuration across desktop and mobile. That's why they are being criticized. But to say that they are scroogy is a bit too much IMO. They definitely don't save on the chips themselves, unlike other companies. Pretty much everything about Apple Silicon is more expensive.
I am not an expert, but I don't think that Intel mobile chips (especially those in the 5-15W range) are identical to desktop chips with 100+W power consumption. They might share some IP, but they should obviously have plenty of differences (perhaps including a different process node, too). Apple just has a propensity to use mobile parts in desktop computers (iMacs being one example, and this predates Apple Silicon). That's why most Apple desktops have been underpowered for many years.
 

senttoschool

macrumors 68030
Nov 2, 2017
2,626
5,482
Intel, AMD and Nvidia produce different chips for mobile and desktop computers which makes perfect sense. Apple using the same chips on both platforms makes very little sense from a technical perspective. They are just being scroogy.
Actually, Intel, AMD, and Nvidia all use the same cores for desktops and mobile. It's just that they cut down or add more cores/clocks depending on the platform.

I'd argue that Apple actually puts more effort into separating mobile and desktop because the M1 Ultra is genuinely a ground-breaking piece of tech that is not available on mobile.
 

falainber

macrumors 68040
Mar 16, 2016
3,539
4,136
Wild West
Actually, Intel, AMD, and Nvidia all use the same cores for desktops and mobile. It's just that they cut down or add more cores/clocks depending on the platform.

I'd argue that Apple actually puts more effort into separating mobile and desktop because the M1 Ultra is genuinely a ground-breaking piece of tech that is not available on mobile.
I am pretty sure that Intel has (or at least had) mobile cores. Also, while the cores may have the same architecture, their silicon might still be different. In the discussion "Intel May Cancel Meteor Lake Desktop CPUs in Favor of Raptor Lake Refresh", one poster wrote: "Also Intel 4 is a half-node and missing the ultra-high performance cells needed for a desktop product." This implies that mobile and desktop designs use different cell libraries.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,674
I am not an expert, but I don't think that Intel mobile chips (especially those in the 5-15W range) are identical to desktop chips with 100+W power consumption. They might share some IP, but they should obviously have plenty of differences (perhaps including a different process node, too).

You are correct to a certain degree. Intel Alder Lake (12th gen), for example, comes in four die variants. The dies used for desktop feature a smaller GPU, while the mobile dies have a bigger GPU. The mobile die also has two fewer physical P-cores (again, to save costs, since the CPU won't be able to utilize them properly anyway). And of course, the mobile die has some additional features to better support the fact that the platform is more integrated (like on-die voltage regulators). There is an excellent breakdown of these chips here: https://locuza.substack.com/p/die-walkthrough-alder-lake-sp-and

But most of these differences boil down to cost-cutting measures. The basic components are identical. It's not that they share some IP; they share all IP: identical CPU cores, identical node, interconnects, hierarchies, and even layouts, just tweaked to achieve economies of scale given a chip's intended use. Even then, the differences are blurry at best. For example, Intel also uses the "desktop" ADL-S die for its enthusiast laptop SKU line (the HX series). The rest is just configuration.

My point is that Intel could just as well take the "mobile" ADL-P die and use it on desktop, or they could make a "full" 8+8+96 die and use it on both laptop and desktop. The reason they don't is that such a die would be larger and therefore significantly more expensive. It's cheaper for them to specialize the chip selection a bit. But it's not a huge difference.

Actually, Intel, AMD, and Nvidia all use the same cores for desktops and mobile. It's just that they cut down or add more cores/clocks depending on the platform.

I'd argue that Apple actually puts more effort into separating mobile and desktop because the M1 Ultra is genuinely a ground-breaking piece of tech that is not available on mobile.

Same cores, yes, but AMD for example uses very different arrangements on desktop (separate core and I/O dies) and mobile (traditional SoC).

Apple currently has no separation between mobile and desktop whatsoever. Their chips have no vertical scaling and they have three different dies that are always configured in the same way, regardless of the product. Ultra is an impressive piece of technology, but it’s technology to connect two Max dies, which are dies optimized and configured for mobile. This is not an area- or cost-efficient way to achieve scalable desktop performance. If they want to compete in the enthusiast desktop space (and it’s unclear that they do!) they need chips capable of vertical scaling, with clock speed differentiation between mobile and desktop. This is especially important in the GPU department.
 

senttoschool

macrumors 68030
Nov 2, 2017
2,626
5,482
Same cores, yes, but AMD for example uses very different arrangements on desktop (separate core and I/O dies) and mobile (traditional SoC).
Yes, that's why I said same cores, not same layouts.

Obviously AMD's chiplet approach does not work on laptops due to power issues.

But I just wanted to make sure that @falainber is aware that AMD, Intel, and Nvidia all use a similar strategy to Apple's for mobile and desktop.
 

senttoschool

macrumors 68030
Nov 2, 2017
2,626
5,482
I am pretty sure that Intel has (or at least had) mobile cores. Also, while the cores may have the same architecture, their silicon might still be different. In the discussion "Intel May Cancel Meteor Lake Desktop CPUs in Favor of Raptor Lake Refresh", one poster wrote: "Also Intel 4 is a half-node and missing the ultra-high performance cells needed for a desktop product." This implies that mobile and desktop designs use different cell libraries.
Intel shares cores between mobile and desktop. They might use different code names, but they're the same core designs. There have been a few notable exceptions, such as Tiger Lake never making it to the desktop, and Yonah, which was laptop-only.
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
AMD Ryzen 7040 looks like a great contender for M2. Unfortunately, we'll have to wait a while to see third-party reviews.
[Attachment: AMD Ryzen 7040 series slide]


By the way, AMD's nomenclature is ridiculously confusing.
 

Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
AMD Ryzen 7040 looks like a great contender for M2. Unfortunately, we'll have to wait a while to see third-party reviews.

By the way, AMD's nomenclature is ridiculously confusing.

Apple rested on their laurels with the M1, and it took even less time than I had imagined.
It's such a waste. If they were a bit more collaborative (e.g., hadn't dropped Nvidia/AMD drivers and had allowed eGPUs), they could have stolen the spot. But noooo. They thought themselves better than anyone and that nobody could catch up to them.

Considering they have been neglecting the software, they were competing on the hardware. And if AMD's hardware gives 30 hours of battery life, it becomes harder and harder to justify buying an Apple device.
 

jdb8167

macrumors 601
Nov 17, 2008
4,859
4,599
And if AMD's hardware gives 30 hours battery life, it becomes harder and harder to justify buying an Apple device.
I can get 30 hours from my M2 MacBook Air too. Just let it idle with the brightness set to one tick above minimum. It uses 1.4 W in this mode. Not very useful, though.

AMD only supplies the CPU. The actual battery life will depend on the notebook manufacturer. Good luck finding one that actually ships a 30-hour battery.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,674
AMD Ryzen 7040 looks like a great contender for M2. Unfortunately, we'll have to wait a while to see third-party reviews.

Do I understand correctly that AMD bases its advertising on 30% higher Cinebench scores while consuming 2x more power, compared to a two-year-old Apple architecture?

P.S. I totally believe the 30% higher CB23 claim. It's a 4.0 GHz CPU vs. a 2.9 GHz CPU with comparable SIMD width and the same number of cores.
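A quick back-of-the-envelope check of that clock-ratio argument, sketched in Python. The 4.0 GHz and 2.9 GHz figures are the ones quoted above; the assumption that per-clock throughput is comparable is leman's premise, not a measured fact:

```python
# If SIMD width, IPC, and core count are comparable, multi-core
# throughput should scale roughly with sustained clock speed.
amd_clock_ghz = 4.0    # quoted clock for the Ryzen part
apple_clock_ghz = 2.9  # quoted M1 Pro/Max all-core clock

advantage = amd_clock_ghz / apple_clock_ghz - 1
print(f"clock advantage: {advantage:.0%}")  # ~38%, enough headroom for a 30% CB23 gap
```

So the claimed 30% Cinebench lead is slightly less than the raw clock advantage alone would predict.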
 

Kazgarth

macrumors 6502
Oct 18, 2020
318
834
I can get 30 hours from my M2 MacBook Air too. Just let it idle with the brightness set to 1 tick above minimum. Uses 1.4 W in this mode. Not very useful though.

AMD only supplies the CPU. The actual battery life is going to depend on the notebook manufacturer. Good luck finding one that actually has a 30 hour battery.
It's not idle battery time; they specifically mentioned over 30 hours of video playback.
Last time I checked, the M2 MacBook Air was rated at 18 hours of video playback on Apple's website.
 

Kazgarth

macrumors 6502
Oct 18, 2020
318
834
Do I understand it correctly that AMD bases their advertising around 30% higher Cinebench scores while consuming 2x more power compared to a two years old Apple architecture?

P.S. I totally believe the 30% higher CB23 claim. It's 4.0 ghz vs. a 2.9 ghz CPU with comparable SIMD width and the same number of cores.

I don't know where you got the "2x more power" from.

The AMD 7040 (not 7045) series is rated at 15-45W, and that includes a powerful RDNA3-based iGPU.

It has similar, if not lower, power characteristics than the M1 Pro they compared it to.
[Chart: power-consumption scenarios, Apple MacBook Pro 14-inch (2021)]
 

leman

macrumors Core
Oct 14, 2008
19,521
19,674
It's not idle battery time; they specifically mentioned over 30 hours of video playback.
Last time I checked, the M2 MacBook Air was rated at 18 hours of video playback on Apple's website.

To get 30 hours of battery life from a 60 Wh battery, you need to keep the average power consumption at 2 watts. I don't doubt AMD's claims once one reads the fine print (which likely specifies a low-res panel, an undervolted CPU, and an oversized battery). AMD was already making similar claims for Zen3+, but no real product comes close.
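That arithmetic is just battery capacity divided by average draw. A minimal sketch (the 60 Wh and 97 Wh capacities are the figures discussed in this thread, used here as illustrations, not specs of a particular machine):

```python
def battery_hours(capacity_wh: float, avg_power_w: float) -> float:
    """Runtime in hours = battery capacity (Wh) / average draw (W)."""
    return capacity_wh / avg_power_w

# A 30-hour claim from a 60 Wh battery implies a 2 W average draw:
print(battery_hours(60.0, 2.0))  # 30.0
# Flip it around: an oversized 97 Wh battery relaxes the budget to ~3.2 W:
print(97.0 / 30)                 # ~3.23
```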


I don't know where you got the "2x more power" from.

The AMD 7040 (not 7045) series is rated at 15-45W, and that includes a powerful RDNA3-based iGPU.

It has similar, if not lower, power characteristics than the M1 Pro they compared it to.

It really doesn't. When AMD says 45W TDP, they mean 60 watts of power consumption. Look around; there are many articles on the internet covering their terminology.

Also, empirical power consumption measurements have been carried out by notebookcheck: https://www.notebookcheck.net/AMD-R...iew-Zen3-beats-Intel-Alder-Lake.623763.0.html
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
Do I understand it correctly that AMD bases their advertising around 30% higher Cinebench scores while consuming 2x more power compared to a two years old Apple architecture?
AMD uses the Apple SoC best suited for each comparison: the M1 Pro for multiprocessor workloads and the M2 Neural Engine for deep-learning workloads.

Which SoC should AMD use for the multi-core comparison? The M2? Is it AMD's fault that the M2 Pro doesn't exist?

AMD benchmarks its AI engine against the M2 neural engine for deep learning workloads.
[Attachment: AMD Ryzen AI slide]


AMD benchmarks (for future reference when third-party benchmarks become available)
[Attachment: AMD Ryzen 7040 benchmark slide]

Ryzen 9 7940HS: 8 cores / 16 threads; 4.0 GHz base clock / 5.2 GHz turbo; 40 MB cache; 35-45W TDP
 

leman

macrumors Core
Oct 14, 2008
19,521
19,674
AMD uses the Apple SoC best suited for each comparison: the M1 Pro for multiprocessor workloads and the M2 Neural Engine for deep-learning workloads.

Which SoC should AMD use for the multi-core comparison? The M2? Is it AMD's fault that the M2 Pro doesn't exist?

AMD benchmarks its AI engine against the M2 neural engine for deep learning workloads.

They can compare whatever they want with whatever they want. I mean, it's the job of their marketing department to make their product look good compared to the competition, and that's exactly what they are doing. I'm less interested in marketing and more interested in comparing performance and efficiency changes across architectures (as a measure of actual technological progress). AMD's marketing slides don't give me any useful information here.

Of course, none of this changes the fact that Zen4 is a great mobile CPU in the x86 space. AMD did a great job here.
 

jdb8167

macrumors 601
Nov 17, 2008
4,859
4,599
It's not idle battery time, they specifically mentioned over 30 hours of video playback.
Last time I checked M2 MB Air is rated at 18 hours of video playback on Apple's website.
@leman already answered this, but I want to add that my point wasn't that I can get 30 hours on my M2 MacBook Air; it's that unless you have an actual computer to test, the 30-hour figure is meaningless. Is that with a 150 Wh battery? Is it with a display that tops out at 200 nits?

At idle right now (typing comments in a browser is very close to idle), my M2 MacBook Air with the display backlight set to two ticks above the middle is using 4.15 watts. That means the 52.6 Wh battery at this level is good for about 12 hours. Do you really think the AMD 7040 is going to use less than 4 watts when playing back video?

I don't doubt that an underclocked 7040 in a notebook with a 97 Wh battery will get very good battery life. Will anyone ship one? If they do, will anyone buy it? We'll see.

Edit: Testing video playback with the Big Buck Bunny movie running full-screen in 4K at 60 fps on my M2 MacBook Air, playback adds about 0.6 W of power over just running idle. With the video looping, I go from about 4.15 W to 4.75 W.

So, if AMD has a similar video engine, the fact that you are playing video has little impact on battery life.
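Plugging jdb8167's own measurements into the same capacity-over-draw arithmetic (all figures are from his posts above: 52.6 Wh battery, 4.15 W near idle, +0.6 W with video looping):

```python
battery_wh = 52.6       # M2 MacBook Air battery capacity
idle_w = 4.15           # measured near-idle draw at the stated brightness
video_w = idle_w + 0.6  # measured draw with 4K60 video looping

print(f"idle:  {battery_wh / idle_w:.1f} h")   # ~12.7 h
print(f"video: {battery_wh / video_w:.1f} h")  # ~11.1 h
```

So the ~0.6 W playback overhead costs roughly an hour and a half at this brightness, consistent with his conclusion that the video engine itself barely matters.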
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
I haven't found how AMD calculated the 30+ hours for the new CPUs, but AMD's website claims that the Ryzen 7 6800U (previous generation) can achieve "up to 24 hours of video playback on notebooks".
Based on testing by AMD as of December 14, 2021. Battery life evaluated with hours of continuous 1080p local video playback using the h.264 video codec. Video codec acceleration (including at least the HEVC [H.265], H.264, VP9, and AV1 codecs) is subject to and not operable without inclusion/installation of compatible media players. System configuration: AMD reference motherboards, AMD Ryzen 7 5800U @ 15W and 2x 8GB LPDDR4, AMD Ryzen 7 6800U @ 28W and 2x 8GB LPDDR5, 1080p eDP PSR display with Varibright at 150 nits, Samsung 980 Pro 1TB SSD, WLAN enabled and disconnected, Windows 11 22000.282, BIOS 103BRC1 (5800U) and 090RC6INT (6800U). Video file: 1920x1080, 23.976 FPS, h.264, with 50W/h battery. Actual battery life will vary. RMB-15

By contrast, Apple's website claims that the MacBook Air M2 can achieve "up to 18 hours of battery life".
Testing conducted by Apple in May 2022 using preproduction MacBook Air systems with Apple M2, 8-core CPU, 8-core GPU, 8GB of RAM, and 256GB SSD. The Apple TV app movie playback test measures battery life by playing back HD 1080p content with display brightness set to 8 clicks from bottom. Battery life varies by use and configuration. See apple.com/batteries for more information.
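Working backwards from each claim gives the implied average power draw. A sketch (AMD's 50 Wh capacity is from the fine print above; the 52.6 Wh Apple figure is the one quoted earlier in the thread, not Apple's own disclosure):

```python
claims = {
    "AMD Ryzen 7 6800U, 24 h on 50 Wh":      (50.0, 24.0),
    "Apple MacBook Air M2, 18 h on 52.6 Wh": (52.6, 18.0),
}
for name, (capacity_wh, hours) in claims.items():
    print(f"{name}: {capacity_wh / hours:.2f} W average")
```

The two claims imply roughly 2.1 W and 2.9 W averages, so the power budgets are in the same ballpark; the differences lie in the panel, codec, and configuration details of each test.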
 

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
Edit: Testing video playback with the Big Buck Bunny movie running full-screen in 4K at 60 fps on my M2 MacBook Air, playback adds about 0.6 W of power over just running idle. With the video looping, I go from about 4.15 W to 4.75 W.

So, if AMD has a similar video engine, the fact that you are playing video has little impact on battery life.
Yeah, with proper use of hardware video codecs, fullscreen video playback is one of the easiest battery tests there is. The CPU and GPU cores (which are the big power hungry things on the die) should be nearly idle.
 

name99

macrumors 68020
Jun 21, 2004
2,410
2,317
Yeah, with proper use of hardware video codecs, fullscreen video playback is one of the easiest battery tests there is. The CPU and GPU cores (which are the big power hungry things on the die) should be nearly idle.
The more interesting case is playing a large (too large to be cached locally) internet video. This requires not just the media engine (with a small amount of work by the file system and flash I/O) but also the radio and network systems to be efficient. Over the years, Apple has delegated ever more of this work away from the CPU and onto dedicated hardware.
A case like this should show more interesting differences than local playback.

So there are three levels of testing (and which one you see people use depends on whether they are trying to score tribal points or not...):
- use a short/small movie that's repeated. This stresses nothing except media playback; the movie is loaded into RAM and stays there.

- use a large enough local movie to at least keep hitting flash. This tells us something additional about how efficient the flash engine is, and the OS+CPU path that has to keep waking up to interact with the flash HW.

- use a large non-local movie. This tells us how efficient the whole "system" is; not just media but now also the OS+network system, and the radios.

Apple generally tells us about the third case. Don't assume this is comparable to what AMD or Intel are telling us; check the fine print...
 

alshdavid

macrumors newbie
Jan 10, 2023
4
0
Worth mentioning that the operating system and hardware drivers play a huge role in battery life. In the end, it really comes down to the specific manufacturer and their choice of hardware.

Apple is in the position where they largely control everything that goes into their laptops, from the hardware and drivers to the operating system. They also have a lot of experience writing power-efficient OS code. iOS is based on the same kernel as macOS, so it's likely they share a lot of that code between them.

Obviously, that level of control over their devices means Apple decides what you can and can't do with their laptops. A good example is that you largely can't play video games on macOS, because Apple deliberately says so. In a lot of ways, the Apple Silicon platform and the locked-down aspects of the Mac range draw parallels to the game console industry.

PCs, on the other hand, are cobbled together from hardware created by hundreds of thousands of engineers across the globe, working independently of each other. The fact that these Frankenstein machines even work is honestly kind of remarkable in itself.

With such immense variety, Windows can only do so much to help with power efficiency. All it takes is a rogue NIC whose lazy driver developers shipped a driver that doesn't respect the OS's request for a low-power state to knock an hour off your battery life.

Linux is in an even worse state when it comes to power efficiency.

I am looking forward to seeing laptops rocking the mobile Zen 4 chips go up against the Apple Silicon range. I expect PC laptops will still have worse battery life - but that's okay, at least you can do things on them.

In reality, when I work on my MBP (M1 Pro), I get about 3-4 hours of battery life out of it doing compilation/recompilation workloads (I'm a software engineer). I sold my personal MBP because I couldn't do anything with it, and now I carry two computers with me, one for work and one for play (where I used to carry just my MBP, which was Boot Camped).

Ideally, we will be able to use Linux on Apple Silicon someday, but Apple refuses to support those efforts, so it's unlikely that experience will be competitive within my lifetime.
 