
bill-p

macrumors 68030
Jul 23, 2011
2,929
1,589
Well, sure. I do agree Apple has much to gain if they include efficiency cores alongside high-performance cores in the Apple Silicon chips for the Mac.

The thing is that at low load, CPU power consumption won't be much of a problem, and other things like the screen, the Wi-Fi module, etc. may end up drawing more power than the CPU. So I'd say those are the limiting factors, not Apple's CPU implementation.

Say, at low load, Apple's CPU holds 3 W, the screen draws 2 W, and Wi-Fi draws 1 W; that's a total of 6 W. A 60 Wh battery (about on par with the current 13" MacBook Pro) can only reach 10 hours under that kind of low load. Dropping the CPU's low-load draw from 3 W to 1 W (a whopping 67% reduction) brings total consumption down to about 4 W and allows 15 hours of battery life, but not much more. That's why I'm betting on 12 hours as the upper bound for constant screen-on time. A high-resolution screen draws a lot more power than some may think. Even at 10 hours of constant screen-on time, an Apple Silicon MacBook would already beat the current Intel 13" MacBook Pro, which can barely manage 6-8 hours in regular use.
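The arithmetic above can be sketched in a few lines (the wattages and the 60 Wh capacity are the illustrative figures from this post, not measurements):

```python
def battery_hours(capacity_wh, component_watts):
    """Idealized constant screen-on time: capacity divided by total draw."""
    return capacity_wh / sum(component_watts.values())

# Hypothetical low-load draw: CPU 3 W, screen 2 W, Wi-Fi 1 W = 6 W total
load = {"cpu": 3.0, "screen": 2.0, "wifi": 1.0}
print(battery_hours(60, load))  # 60 Wh / 6 W -> 10.0 hours

# Cutting CPU low-load draw to 1 W (a 67% CPU drop) only adds 5 hours,
# because the screen and Wi-Fi floor doesn't move.
load["cpu"] = 1.0
print(battery_hours(60, load))  # 60 Wh / 4 W -> 15.0 hours
```

The point the numbers make: once the CPU stops dominating idle power, further CPU efficiency gains hit diminishing returns against the fixed screen and radio draw.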

So I'm guessing we'll get a faster CPU (maybe by 20-30%?) and at least 10 hours of battery life under normal use, and maybe about 5-6 under heavy load? That sounds ideal already.

Now... a 16" MacBook Pro with Apple Silicon, with that kind of low load... (say 6W total system power consumption), and the same battery capacity as the current 16"? That will last more than 15 hours easily. That's also what I'm looking forward to, more so than the 13".

Edit: also, as a point of comparison, under light usage (web browsing, no video, nothing stressing the GPU or CPU), the current 16" MacBook Pro draws about 8-10 W total. My 2018 13" Pro also draws about that much. That's consistent with the roughly 8 hours of use I get out of the 16", and about 7 hours out of the 13".
 

UltimateSyn

macrumors 601
Mar 3, 2008
4,967
9,205
Massachusetts
There are credible rumors that Apple is designing its own discrete graphics chip(s). This makes sense out of their claims that Apple Silicon will only use Apple GPUs - in some of their products, they're going to need a lot more GPU than is practical to fit on the same die as the CPU. (Most notably, iMac Pro and Mac Pro.)

Supporting evidence: Imagination Technologies recently announced a new generation of GPU core, B-Series, which supports linking multiple GPU cores (including ones possibly located on different chips) together to improve performance. Apple probably has access to this and other new technologies in B-Series due to an architectural license agreement.
Wrong. There are credible rumors that Apple is developing a custom GPU for the Mac, which of course is true. But that GPU will be part of an SoC, and considered by Apple to be its own separate category: not “discrete” or “integrated” but an Apple GPU. I’ve been over this ad nauseam and found a lot of direct, specific quotes from Apple’s developer documentation which clearly assert that Apple Silicon Macs will not use discrete GPUs. Here is an example:
Simply not true.

"And to know if a GPU needs to be treated as integrated or discrete, use the isLowPower API. Note that for Apple GPUs isLowPower returns False, which means that you should treat these GPUs in a similar way as discrete GPUs. This is because the performance characteristics of Apple GPUs are in line with discrete ones, not the integrated ones. Despite the property name though, Apple GPUs are also way, way more power-efficient than both integrated and discrete GPUs."

"Intel-based Macs contain a multi-core CPU and many have a discrete GPU ... Machines with a discrete GPU have separate memory for the CPU and GPU. Now, the new Apple Silicon Macs combine all these components into a single system on a chip, or SoC. Building everything into one chip gives the system a unified memory architecture. This means that the CPU and GPU are working over the same memory."
 

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
Wrong. There are credible rumors that Apple is developing a custom GPU for the Mac, which of course is true. But that GPU will be part of an SoC, and considered by Apple to be its own separate category: not “discrete” or “integrated” but an Apple GPU. I’ve been over this ad nauseam and found a lot of direct, specific quotes from Apple’s developer documentation which clearly assert that Apple Silicon Macs will not use discrete GPUs. Here is an example:

Sigh. I am familiar with all that material, thank you. It is part of why I wrote what I wrote.

(The first quote you gave doesn't even say what you think it does. It's Apple's way of telling devs why they made the isLowPower feature detection API return False on Apple GPUs without explicitly throwing shade at Intel. You see, on Intel Macs, only Intel GPUs return isLowPower=True, and it sounds like they found many Mac apps use this API as a shortcut for deciding when to activate their Intel GPU workarounds. Why do Intel GPUs need workarounds? Because their performance is too low, they often have reduced feature support, and they often suffer from image quality issues. Reading between the lines, Apple wants to make sure that even quick ports with little attention paid to detail get the most out of Apple GPUs.)
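A minimal stand-in for the app-side shortcut described here (the real check is the `isLowPower` property on Metal's `MTLDevice`; this Python sketch only models the heuristic, and the function name is made up):

```python
def use_integrated_gpu_workarounds(is_low_power: bool) -> bool:
    # On Intel Macs, only the integrated Intel GPU reports isLowPower == True,
    # so many apps treat that flag as "switch to the reduced-quality fallback".
    # Apple GPUs deliberately report False, steering even quick ports onto the
    # full-quality code path meant for discrete-class GPUs.
    return is_low_power

print(use_integrated_gpu_workarounds(True))   # Intel iGPU: workarounds on
print(use_integrated_gpu_workarounds(False))  # Apple GPU: treated like discrete
```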

You need to read up on how Imagination B-Series does multi-GPU. I believe that Apple could be planning on scaling the performance of an integrated GPU by connecting it to more tile rasterization engines implemented in separate chips.

This would not be the same thing as a traditional discrete GPU, but it would still involve entire chips (or chiplets) dedicated to nothing but GPU functional blocks. It would also retain one of the key properties Apple has promised for their GPUs, fully unified memory. If you want to crucify me for shorthanding that to "discrete", whatever.

Some kind of dedicated GPU silicon is necessary. Without it, Apple cannot build a proper replacement for today's 27" iMac, iMac Pro, or Mac Pro. Especially not the Mac Pro. There is no practical way to pack all that CPU and GPU power into a single die.
 

UltimateSyn

macrumors 601
Mar 3, 2008
4,967
9,205
Massachusetts
Sigh. I am familiar with all that material, thank you. It is part of why I wrote what I wrote.

(The first quote you gave doesn't even say what you think it does. It's Apple's way of telling devs why they made the isLowPower feature detection API return False on Apple GPUs without explicitly throwing shade at Intel. You see, on Intel Macs, only Intel GPUs return isLowPower=True, and it sounds like they found many Mac apps use this API as a shortcut for deciding when to activate their Intel GPU workarounds. Why do Intel GPUs need workarounds? Because their performance is too low, they often have reduced feature support, and they often suffer from image quality issues. Reading between the lines, Apple wants to make sure that even quick ports with little attention paid to detail get the most out of Apple GPUs.)

You need to read up on how Imagination B-Series does multi-GPU. I believe that Apple could be planning on scaling the performance of an integrated GPU by connecting it to more tile rasterization engines implemented in separate chips.

This would not be the same thing as a traditional discrete GPU, but it would still involve entire chips (or chiplets) dedicated to nothing but GPU functional blocks. It would also retain one of the key properties Apple has promised for their GPUs, fully unified memory. If you want to crucify me for shorthanding that to "discrete", whatever.

Some kind of dedicated GPU silicon is necessary. Without it, Apple cannot build a proper replacement for today's 27" iMac, iMac Pro, or Mac Pro. Especially not the Mac Pro. There is no practical way to pack all that CPU and GPU power into a single die.
I know what the first quote is saying, and I guess I bolded too much of it. The real kicker comes in the last part which implies that these 'Apple GPUs' are not going to be considered discrete GPUs - that was what I intended to be the main takeaway with everything else being fairly irrelevant.

Hm, interesting. I guess maybe that's why they're not referring to it as integrated nor discrete? Because it shares memory the way an integrated GPU would, yet has its own separate physical chip the way a discrete chip would? Although, how do we get around the second quote? The one that very specifically, boldly states that everything will be packed onto a single SoC?

Are you absolutely positive they will need separate dedicated GPU silicon to match the performance? I can definitely see the argument for the Mac Pro needing it. What's the possibility that they use one chip with CPU and GPU as an SoC, but create some sort of modular system that allows for the combination of many of those modules? Beyond my Computer Architecture course I never learned advanced specifics on how CPUs and GPUs and their memory systems function, exactly.
 

leman

macrumors Core
Oct 14, 2008
19,517
19,664
Hm, interesting. I guess maybe that's why they're not referring to it as integrated nor discrete? Because it shares memory the way an integrated GPU would, yet has its own separate physical chip the way a discrete chip would? Although, how do we get around the second quote?
We are in a zone where “integrated“ and “discrete” start losing their meaning. It’s a heterogeneous system where various computing clusters are connected by fast data networks, have access to the same memory, and share caches. Is the GPU in the PS5 and new Xbox integrated or discrete? It shares the RAM with the CPU, right?

I think it would make more sense to look at the implementation details instead of using unhelpful labels.

The one that very specifically, boldly states that everything will be packed onto a single SoC?

I wouldn’t read too much into it. The initial batch of Macs will certainly be SoCs, but for higher-end stuff they might go the chiplet route or maybe use some kind of die stacking. The basic implementation characteristics of Apple Silicon would still be the same.

Are you absolutely positive they will need separate dedicated GPU silicon to match the performance? I can definitely see the argument for the Mac Pro needing it. What's the possibility that they use one chip with CPU and GPU as an SoC, but create some sort of modular system that allows for the combination of many of those modules? Beyond my Computer Architecture course I never learned advanced specifics on how CPUs and GPUs and their memory systems function, exactly.
The problem is the chip size. Apple's performance per watt is second to none, but if you want to reach Mac Pro levels, you will need a lot of GPU cores, and that might be uneconomical to produce as one die. Splitting the system into multiple chips (still sharing memory) could be a good way to make it work, though.

One important point is that Apple wants to kill the dGPU as a concept. They want to build a system where data copies between CPU, GPU and any other processor are completely eliminated.
 

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
Hm, interesting. I guess maybe that's why they're not referring to it as integrated nor discrete? Because it shares memory the way an integrated GPU would, yet has its own separate physical chip the way a discrete chip would? Although, how do we get around the second quote? The one that very specifically, boldly states that everything will be packed onto a single SoC?

Are you absolutely positive they will need separate dedicated GPU silicon to match the performance? I can definitely see the argument for the Mac Pro needing it. What's the possibility that they use one chip with CPU and GPU as an SoC, but create some sort of modular system that allows for the combination of many of those modules? Beyond my Computer Architecture course I never learned advanced specifics on how CPUs and GPUs and their memory systems function, exactly.
I think it's a mistake to read what Apple's released so far as if it's a perpetually binding agreement. It could be as simple as the initial wave of products being single SoCs and later ones being something else.

Yes, I am positive they will need dedicated GPU silicon on the Mac Pro. Look to today's Mac Pro as a guide. They offer up to 28-core Intel CPUs, and some pro app users actually do use that many cores, with appetite for more. (I watched a series of videos from a one-man soundtrack studio in which he demoed a full virtual orchestra, made up of hundreds of virtual instruments, running on a new Mac Pro. Even audio can chew up a lot of CPU.)

Once you get into 16+ core territory, you're not going to want to put GPU on the same die. Too much heat, and the die gets extremely large, which affects yield. (About that, if you're not familiar with silicon manufacturing: Chips get cut out of a wafer. The manufacturing processes have some characteristic defect density, meaning the chance there's a defect in each mm^2 of wafer area. The larger the piece of the wafer you cut out and call one chip, the higher the chance it has a defect in it. Larger chips suffer a double penalty: you get fewer of them out of each wafer just because they're bigger, and more of them are scrap because they're bigger.)
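The yield penalty can be illustrated with the classic Poisson die-yield model; the defect density and die areas below are made-up numbers for illustration:

```python
import math

def die_yield(area_mm2, defects_per_mm2):
    """Poisson model: probability a die of the given area has zero defects."""
    return math.exp(-defects_per_mm2 * area_mm2)

d0 = 0.001               # hypothetical defect density, defects per mm^2
small, large = 120, 600  # hypothetical die areas in mm^2

print(f"{small} mm^2 die: {die_yield(small, d0):.1%} yield")  # ~88.7%
print(f"{large} mm^2 die: {die_yield(large, d0):.1%} yield")  # ~54.9%
```

On top of the lower per-die yield, the 600 mm^2 die also gives you five times fewer candidates per wafer than the 120 mm^2 die, which is the double penalty described above.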
 

chad.petree

macrumors 6502a
Original poster
Feb 2, 2013
568
259
Germany
My best guesses for redesigned MBP features over the MBA;

- Faster processor and graphics
- More USB 4 ports (4 vs. 1)
- Face ID
- P3 display
- Micro LED
- Potentially; Magic Keyboard rather than a redesigned butterfly keyboard for MBA.

Obviously as far as the current chassis goes, it’ll all be about the CPU and graphics.
So, the laptops have been announced and it's looking like this:

MBP features over the MBA;

- Maybe a 100-nit brighter screen, but are we even sure of that? The MacBook Air now has a P3 screen too, so they might just be using the MacBook Air's screen
- Maybe better sustained performance because of the cooling fan
- A bit better battery life

and that is it?
 