
iPadified

macrumors 68020
Apr 25, 2017
2,014
2,257
While we debate endlessly about future chips that may or may not bring three-digit IPC increases, I took the liberty of interpolating the actual data from actual chips Geekerwan measured. Mind you: the interpolated curve goes through all the experimental points Geekerwan measured; it's not a loose fit. So while the values in between are obviously not accurate to the mW, the qualitative shape of the actual performance curves can't look too different from this, because this is how all performance curves look.

So for A14 Bionic, A15 Bionic, A16 Bionic and A17 Pro:

[Attachment 2276940: interpolated power/performance curves for the A14–A17]

And if we add the Snapdragon 8 Gen 2, the only other chip in Geekerwan's measurements:
Impressive range of the A17 Pro. Very useful in mobile devices with limited battery size. I wonder what will sit in the next iPad Pro: an A17 Pro, an A17 Max with more GPU cores, or an M3?
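(On the interpolation itself: below is a minimal sketch of how a pass-through curve like the quoted one can be reproduced, using monotone cubic Hermite interpolation so the curve hits every measured point without oscillating between them. The (watts, score) pairs are placeholders, not Geekerwan's numbers.)

```swift
import Foundation

/// Monotone piecewise-cubic Hermite interpolation (Fritsch–Carlson),
/// the kind of "goes through every measured point" curve quoted above.
struct MonotoneCubic {
    let xs: [Double], ys: [Double], ms: [Double]   // knots and tangents

    init(x: [Double], y: [Double]) {
        precondition(x.count == y.count && x.count >= 2)
        let n = x.count
        var d = [Double](repeating: 0, count: n - 1)    // secant slopes
        for i in 0..<(n - 1) { d[i] = (y[i + 1] - y[i]) / (x[i + 1] - x[i]) }
        var m = [Double](repeating: 0, count: n)
        m[0] = d[0]
        m[n - 1] = d[n - 2]
        for i in 1..<(n - 1) { m[i] = d[i - 1] * d[i] <= 0 ? 0 : (d[i - 1] + d[i]) / 2 }
        // Clamp tangents so the interpolant never over/undershoots the data.
        for i in 0..<(n - 1) where d[i] != 0 {
            let a = m[i] / d[i], b = m[i + 1] / d[i]
            let s = a * a + b * b
            if s > 9 {
                let t = 3 / s.squareRoot()
                m[i] = t * a * d[i]
                m[i + 1] = t * b * d[i]
            }
        }
        xs = x; ys = y; ms = m
    }

    func callAsFunction(_ x: Double) -> Double {
        // Locate the bracketing interval, then evaluate the Hermite basis.
        let j = xs.lastIndex(where: { $0 <= x }) ?? 0
        let i = min(max(j, 0), xs.count - 2)
        let h = xs[i + 1] - xs[i], t = (x - xs[i]) / h
        let h00 = (1 + 2 * t) * (1 - t) * (1 - t), h10 = t * (1 - t) * (1 - t)
        let h01 = t * t * (3 - 2 * t), h11 = t * t * (t - 1)
        return h00 * ys[i] + h10 * h * ms[i] + h01 * ys[i + 1] + h11 * h * ms[i + 1]
    }
}

// Placeholder (watts, score) pairs — NOT Geekerwan's actual measurements:
let watts: [Double] = [1.0, 2.0, 3.5, 5.5]
let score: [Double] = [1200, 1900, 2500, 2900]
let curve = MonotoneCubic(x: watts, y: score)
for w in stride(from: 1.0, through: 5.5, by: 0.5) {
    print(String(format: "%.1f W -> %.0f", w, curve(w)))
}
```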
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
How do you come to this conclusion? For me it just means that the A17 consumes more energy to achieve the yearly performance delta (not ratio) that Apple has been targeting since the A11 or so.

Well, there are two interpretations I think. First one is that Apple was simply unable to improve the efficiency further and was forced to increase the power consumption to meet the performance goals. The other one is that Apple decided to abandon the 5W per core limit in order to push the performance further, but this meant giving up some of the efficiency potential. Both interpretations have their merit and I don't think the data prefers one over the other.

The big question is: what segment of the power curve are we observing on the A17, and what are this curve's overall characteristics? For example, if you look at the graph I have posted, there is a big jump in power consumption between the A14 and M1 (which use the same CPU core) for a modest frequency increase. This suggests that the A14 is clocked just below the segment where the curve's acceleration increases dramatically. Is this also the case for the A17? We simply don't know.

The A17's power curve slope is steeper, which kind of suggests that we are already in that "exponential" range (the A17 has the same dynamic power range but half the frequency range of the A14). But at the same time it still looks rather linear all the way through (observe that with the A14 the acceleration starts at around 2.5 GHz, while with the A17 it's pretty much constant). And this observation leaves room for speculation that the A17 curve will extend some more to the right without hitting severe diminishing returns (speculation that I am eagerly jumping on). But of course, we don't know where the curve ends or where the acceleration really kicks in. I think it's not unreasonable to expect around 4.1 GHz at ~8 watts or 4.3 GHz at ~10-12 watts, but that depends on whether the A17 design has a wider power range than the power-constricted A14.
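To make those guesses concrete, here is a rough back-of-the-envelope sketch. It assumes the classic dynamic-power relation P ≈ C·V²·f with voltage roughly linear in frequency (so power grows roughly as f³ near the top of the curve), and a reference point of ~3.78 GHz at ~5.5 W for the A17 P-core, which is my approximation rather than an official figure:

```swift
import Foundation

// Sanity check of the 4.1–4.3 GHz guesses above under pure f³ scaling.
// Reference point (~3.78 GHz at ~5.5 W) is approximate, not official.
let refGHz = 3.78, refWatts = 5.5
func estimatedWatts(atGHz f: Double) -> Double { refWatts * pow(f / refGHz, 3) }

for f in [4.1, 4.3] {
    print(String(format: "%.1f GHz -> ~%.1f W per core", f, estimatedWatts(atGHz: f)))
}
// ~7.0 W at 4.1 GHz and ~8.1 W at 4.3 GHz. Real silicon usually needs
// extra voltage headroom up there, so the 8 W and 10-12 W guesses above
// already price in some of that.
```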


The curves shown by @Andropov make it very clear.

I am reluctant to interpret multicore performance curves, because there are a lot of confounding factors (e.g. P/E core interaction). I think single-core makes for a simpler discussion.

This doesn't bode well for the Mac. There is less headroom to increase clock speed.

Macs have more thermal headroom, which is the entire basis for my argument. I'd rather have a 4.2 GHz chip at 10 watts per core in my Mac than a 3.7 GHz chip at 5 watts.
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,628
1,101
A17/M3-series SoCs suck
Apple's CPU is in another league, but its GPU is worse than the competition.

[Attachments: three benchmark charts (CPU and GPU comparisons)]


What impresses me most is the floating point performance advantage that Apple's CPU has over the competition. I wonder if the compiler can have a big influence on the results.
SPEC CPU 2017 focuses on compute intensive performance, which means these benchmarks emphasize the performance of:

  • Processor - The CPU chip(s).
  • Memory - The memory hierarchy, including caches and main memory.
  • Compilers - C, C++, and Fortran compilers, including optimizers.
SPEC CPU 2017 intentionally depends on all three of the above - not just the processor.
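On the compiler question above, a tiny illustration of how much the compiler alone can matter (Swift rather than SPEC's C/C++/Fortran, but the effect is the same in kind; timings will vary by machine):

```swift
import Foundation

// Compile twice and compare:  swiftc -Onone sum.swift  vs  swiftc -O sum.swift
// The optimized build removes bounds checks and vectorizes the hot loop,
// often changing the result by an order of magnitude on the same CPU.
let data = (0..<10_000_000).map { Double($0 % 97) }

let start = Date()
var total = 0.0
for x in data { total += x }        // the hot loop the optimizer attacks
let ms = Date().timeIntervalSince(start) * 1000
print(String(format: "sum = %.0f in %.1f ms", total, ms))
```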

By the way, has TSMC overstated the improvements of N3B?
 

MayaUser

macrumors 68040
Nov 22, 2021
3,178
7,199
Apple's CPU is in another league, but its GPU is worse than the competition.
How can we test this in a non-benchmark app to see if the GPU is worse? I know people like to do comparisons in demanding apps, and a lot of users use games as a reference. I wonder if we can test this A17 against a top-GPU Android phone on the same game to see the difference in fps or something.
Can we do that, or ask for that?
 

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
exponential

X704...? ;^p

Macs have more thermal headroom, which is the entire basis for my argument. I'd rather have a 4.2 GHz chip at 10 watts per core in my Mac than a 3.7 GHz chip at 5 watts.

4.20GHz M3 Ultra with hardware ray-tracing; yes, please...!

Hopefully we will also see a move to LPDDR5X RAM with inline ECC; so 480GB maximum capacity with an M3 Ultra SoC, and 1TB/s of UMA bandwidth...?
 

komuh

macrumors regular
May 13, 2023
126
113
I don't know why people are discussing CPU performance (which is superb) and crying about potentially higher clocks, instead of the GPU, which is still lagging at least 2 years behind Qualcomm and 3-4 years behind Nvidia, even with a new architecture and the best node available on the market.
 

jeanlain

macrumors 68020
Mar 14, 2009
2,461
955
Macs have more thermal headroom, which is the entire basis for my argument.

But this has also been true for the M1 and M2. Why should an A17 P-core scale to higher clock speeds this time? The data show that it consumes more power in an iPhone than the A16 at its nominal clock speed. If anything, the M2 should have been clocked much higher than the A16 (or is it based on the A15?). If Apple didn't do this for the M2, I don't see why they could do this for the M3.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
But this has also been true for the M1 and M2. Why should an A17 P-core scale to higher clock speeds this time? The data show that it consumes more power in an iPhone than the A16 at its nominal clock speed. If anything, the M2 should have been clocked much higher than the A16 (or is it based on the A15?). If Apple didn't do this for the M2, I don't see why they could do this for the M3.

Because M1 and M2 are physically unable to go above 5-6 watts. As you say yourself, if they could Apple would have clocked them higher on the desktop.
 

jeanlain

macrumors 68020
Mar 14, 2009
2,461
955
Because M1 and M2 are physically unable to go above 5-6 watts.
Is it confirmed?

As you say yourself, if they could Apple would have clocked them higher on the desktop.
But how do you know that the delta between the A17 and M3 can be higher than between the A16 and M2? Or are you simply saying that the same delta would make the M3 faster than other desktop CPUs?
 

Retskrad

macrumors regular
Apr 1, 2022
200
672
It doesn’t make sense why Apple would sacrifice the iPhone, which is carrying the entire company on its back (we don’t see people waiting in lines at Apple Stores for new Macs), just to prop up the M3 lineup of Macs. I have watched a lot of reviewers, and the consensus is that the iPhone 15 Pro lineup has average to poor battery life. I still hold the opinion that Apple has lost their Avengers-level chip designers. The fact that the A17 chip peaks like the M1 chip speaks loudly. Johny Srouji’s team hasn’t been able to progress since 2020. You know why? Because the iPhone has gotten worse battery life year over year.

I no longer view Apple as a leader in the CPU space. Hell… forget Nvidia. Apple can’t even compete with Qualcomm on the GPU front. Is it time to give the Apple silicon group new leadership?
 

MayaUser

macrumors 68040
Nov 22, 2021
3,178
7,199
It doesn’t make sense why Apple would sacrifice the iPhone, which is carrying the entire company on its back
I don't think iPhones are sold based on performance gains... the market share has been settled for a couple of years... now you steal Android users with USB-C and iOS features taken from the Android space.
But for the Macs, to market the new M3 as greater than ever, since no redesign is coming for at least 2 more years... you have nothing but the toys and tools of the SoC to brag about to increase sales.
Just my opinion
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
Is it confirmed?

What would serve as a confirmation for you? I think the fact that the M2 Ultra is clocked only 120 MHz higher than an iPad chip speaks volumes about the operational parameters of the microarchitecture.

But how do you know that the delta between the A17 and M3 can be higher than between the A16 and M2? Or are you simply saying that the same delta would make the M3 faster than other desktop CPUs?

I think it’s safe to assume that the delta will be at least comparable. Already 300 MHz more would put the M3 in the 14900K category (single core). If it can do more, even better.
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,628
1,101
Adreno GPU totally sucks at GPGPU. Like 3X slower than the A17 or something. It may be faster in GFXBench, but not in general
Why does a phone GPU have to be good for GPGPU? It seems that Apple trying to make an SoC that will work for both a phone and a laptop may prevent them from getting the best performance in both situations. I expect Apple will end up splitting the development of the cores, one for phones and one for laptops.
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
Why does a phone GPU have to be good for GPGPU? It seems that Apple trying to make an SoC that will work for both a phone and a laptop may prevent them from getting the best performance in both situations. I expect Apple will end up splitting the development of the cores, one for phones and one for laptops.
Apple absolutely does not want to create multiple SoC SKUs. They are not in that business. Expecting Apple to beat everybody is not realistic.
 
  • Like
Reactions: huge_apple_fangirl

Andropov

macrumors 6502a
May 3, 2012
746
990
Spain
Impressive range of the A17 Pro. Very useful in mobile devices with limited battery size. I wonder what will sit in the next iPad Pro: A17 Pro or a A17 Max with more GPU or an M3?
I suspect the A16 Bionic and A15 Bionic curves can extend further to the left too, but we don't have those data points.

Why does a phone GPU have to be good for GPGPU? It seems that Apple trying to make an SoC that will work for both a phone and a laptop may prevent them from getting the best performance in both situations. I expect Apple will end up splitting the development of the cores, one for phones and one for laptops.
Rendering a frame can involve compute pipelines too. Having abysmal compute performance could result in some graphics techniques being unfeasible on the GPU, even if the GPU is able to rasterize many triangles very fast.

And that goes for many other GPU features too. Painting triangles is not all there is. Many interesting GPU features are hidden behind optional Vulkan extensions, so I have no idea what the actual feature support of the non-Apple GPUs mentioned is, but if you don't support certain things the game is going to be limited in what kind of VFX it can provide.
Let's say, for example, that the code used to generate dynamic shadows relies on sparse tiled texture support to provide high-quality shadows. If the GPU doesn't support sparse tiled textures, you might have to fall back on a simpler shadowing technique, resulting in lower-quality shadows.

I have no idea how feature support is accounted for in multiplatform benchmarks like GFXBench. Are the tests just using the lowest common denominator of supported features?
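To make that fallback concrete, a minimal Metal sketch; note that mapping sparse (tiled) textures to the Apple6 GPU family is my assumption here, so check Apple's feature-set tables before relying on it:

```swift
import Metal

// Pick a shadow technique based on what the GPU actually supports,
// mirroring the fallback described above. Assumption: sparse (tiled)
// textures are available from MTLGPUFamily.apple6 onwards.
enum ShadowTechnique { case sparseTiledShadowMaps, basicShadowMaps }

func pickShadowTechnique(for device: MTLDevice) -> ShadowTechnique {
    device.supportsFamily(.apple6) ? .sparseTiledShadowMaps : .basicShadowMaps
}

if let device = MTLCreateSystemDefaultDevice() {
    print("Using:", pickShadowTechnique(for: device))
}
```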
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
It doesn’t make sense why Apple would sacrifice the iPhone

They are not sacrificing anything. The iPhone sales won’t tank just because the CPU draws more power at peak load.


I no longer view Apple as a leader in the CPU space. Hell… forget Nvidia. Apple can’t even compete with Qualcomm on the GPU front. Is it time to give the Apple silicon group new leadership?

You have some strange expectations… if Apple isn’t a leader in the CPU space, who is? ARM doesn’t even get close in performance. Neither Intel nor AMD get close in efficiency.

As to the GPU, this has already been explained multiple times. You’d prefer Apple to make GPUs that cut corners in order to win benchmarks? In the real world, when running real software, the A17 appears to be 40% more efficient than Qualcomm's. And let’s not forget that iPhone games usually have more visual fidelity than Android ones.


[Attachment 2277814: game power-efficiency comparison chart]
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
Why does a phone GPU have to be good for GPGPU? It seems that Apple trying to make an SoC that will work for both a phone and a laptop may prevent them from getting the best performance in both situations. I expect Apple will end up splitting the development of the cores, one for phones and one for laptops.

I think it’s more important to Apple to have consistent hardware behavior and features across platforms rather than spend money building a compromised version of their chip to win benchmarks. They care about applications and there they seem to have a good lead.
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,628
1,101
graphic @Xiao_Xi cited clearly says “Vulcan API” on the “CPU Workload” benchmark
They made a mistake with the title. The graphic in Chinese clearly says GPU.

[Attachment 2277845: the original chart in Chinese, labeled GPU]


I hope they can get help next time and make a better benchmark for the NPU.

[Attachment 2277848: NPU benchmark chart (Stable Diffusion test)]
 

Andropov

macrumors 6502a
May 3, 2012
746
990
Spain
Why does a phone GPU have to be good for GPGPU? It seems that Apple trying to make an SoC that will work for both a phone and a laptop may prevent them from getting the best performance in both situations. I expect Apple will end up splitting the development of the cores, one for phones and one for laptops.
Another example I just remembered. In Shadow of the Tomb Raider, there's a fluid dynamics simulation running whenever water is visible in the game, to model the character's interaction with floating substances (algae, oil...) on top of the water (the developers have a full video on this topic):

[Image: Shadow of the Tomb Raider water interaction screenshot]


This is run as a series of compute shaders, and the developers said that the effect adds ~0.5 ms to the frame time on an Xbox One (for reference, to reach 60 fps, the total frame time must be under 16.6 ms).

As far as I know, the way it usually works when developing a game (at least for consoles) is that certain visual effects are given a frame time budget. In order to cram that effect into the game, it must add less than X milliseconds to the total frame time. For PC games, which must support a wide range of GPUs, I believe non-essential visual effects can be toggled off based on settings.

On mobile phones, the game is likely to be tested in several different phones of different price ranges. If the GPU's compute sucks in all of them, it's not unreasonable to think that non-essential VFXs that are heavily dependent on compute are likely to be dropped altogether.

For AAA games, where there are R&D departments trying to maximize every bit of power of the GPU, you're way more likely to find exotic algorithms and implementations that depend on the GPU supporting specific characteristics. So it could be that, in order to support AAA games (and Apple just announced a bunch at the iPhone 15 keynote), the GPU needs to have extensive feature support *on top of* being fast.
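A toy sketch of the frame-budget accounting described above; every number here is invented for illustration:

```swift
// Optional effects declare an estimated cost and are enabled greedily
// while the 16.6 ms frame budget (60 fps) still has room left.
struct Effect { let name: String; let estimatedMs: Double }

let frameBudgetMs = 16.6                 // 60 fps target
let essentialWorkMs = 14.0               // geometry, lighting, post, ...
let optionalEffects = [
    Effect(name: "water fluid sim", estimatedMs: 0.5),
    Effect(name: "volumetric fog", estimatedMs: 1.5),
    Effect(name: "screen-space reflections", estimatedMs: 2.0),
]

var remainingMs = frameBudgetMs - essentialWorkMs
let enabled = optionalEffects.filter { effect in
    guard effect.estimatedMs <= remainingMs else { return false }
    remainingMs -= effect.estimatedMs
    return true
}
print(enabled.map(\.name))               // ["water fluid sim", "volumetric fog"]
```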
 
  • Like
Reactions: souko and Xiao_Xi

komuh

macrumors regular
May 13, 2023
126
113
They are not sacrificing anything. The iPhone sales won’t tank just because the CPU draws more power at peak load.




You have some strange expectations… if Apple isn’t a leader in the CPU space, who is? ARM doesn’t even get close in performance. Neither Intel nor AMD get close in efficiency.

As to the GPU, this has already been explained multiple times. You’d prefer Apple to make GPUs that cut corners in order to win benchmarks? In the real world, when running real software, the A17 appears to be 40% more efficient than Qualcomm's. And let’s not forget that iPhone games usually have more visual fidelity than Android ones.


[Attachment 2277814: game power-efficiency comparison chart]
But this is whole-package usage; Apple's CPUs are like 3 generations ahead, and games are almost always better optimised for iOS, so it doesn't say as much as we would like about GPU performance.

As for a GPU comparison, we can compare the M-series GPUs to the desktop competition, and Apple is indeed lagging hard behind Nvidia (of course, they are also just evolving out of the mobile GPU space, which is for sure a lot less demanding) in both features and software [we can argue about hardware, as it's a totally different node and use case].
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
As for a GPU comparison, we can compare the M-series GPUs to the desktop competition, and Apple is indeed lagging hard behind Nvidia (of course, they are also just evolving out of the mobile GPU space, which is for sure a lot less demanding) in both features and software [we can argue about hardware, as it's a totally different node and use case].

They are far behind in performance (especially on the desktop, where Nvidia has the clock advantage). Feature-wise, Apple is behind but catching up fast. We still don't know what new stuff A17 brings to the table.
 

Retskrad

macrumors regular
Apr 1, 2022
200
672
Here’s what I think is happening with Apple’s marketing and silicon divisions: the marketing people say they need more power to keep up on laptops and desktops, in terms of GPU and CPU, and because marketing is probably the most powerful department right there with design, they forced the chip division to dramatically increase peak performance so the M-line of CPUs can keep up with Intel and AMD desktop chips.

At this point it’s apparent that the silicon division has lost most of their best chip designers and hasn’t been able to build a new microarchitecture after the A14. That’s why the balance of higher performance and efficiency is so dramatically off with the A17. Apple’s chip division is literally not able to make efficient silicon anymore. New iPads, Macs, iPhones, Watches: not a single one of them has increased in battery life since 2020, which coincides with the time when their best chip designers left.

Apple’s silicon division pre-A14/M1 was so praised because what they did was extremely difficult. Now that these talented chip designers are gone, Apple’s chip division is not special anymore. This is evidenced by the fact that the A17 is still based on the same microarchitecture as the A14, just on a smaller node.
 

komuh

macrumors regular
May 13, 2023
126
113
They made a mistake with the title. The graphic in Chinese clearly says GPU.

[Attachment 2277845: the original chart in Chinese, labeled GPU]

I hope they can get help next time and make a better benchmark for the NPU.

[Attachment 2277848: NPU benchmark chart (Stable Diffusion test)]
The NPU is as useful as Apple allows it to be. Working with the ANE for the past 2 years, I found that it's only useful if you limit yourself to small inputs and smallish model sizes (in which case weight compression can lose a ton of accuracy), or to heavily compressed weights on bigger models (which also lose quality).

I hope they do a bigger update this generation, because the TFLOPS even on the M1 series are extremely hard to utilise (the limiting factor is memory transfer speed and "cache"), so if they just double the FLOPS without a proper memory speed increase, it shouldn't change much for 99% of use cases.

Even in Apple's blog post about the NPU, they are barely able to utilise the hardware, and only with fixed, smaller-size inputs, heavy optimisation and weight compression (with, of course, a loss of quality), and possibly some low-level internal framework optimisation.

Also, I'm not sure how much NPU utilisation there even is in this Stable Diffusion test, as the wattage for the A15 seems to be higher than any NPU wattage I have ever hit on my Ultra Mac (it tops out at about 3.0-3.2 W on Apple's provided Stable Diffusion). It's most likely just higher GPU/CPU clocks or on-chip memory that changes the time slightly [maybe allowing for faster data fetching and using an extra few W from the NPU? Would love to get my hands on this test after the M3 comes out].
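To put a rough number on the "hard to utilise" point, a roofline-style sketch; both figures below are assumptions for illustration, not measured specs:

```swift
// A model is only compute-bound once its arithmetic intensity (FLOPs per
// byte moved) exceeds peak FLOPS divided by memory bandwidth.
let peakTeraFlops = 11.0          // ballpark ANE-class peak throughput
let bandwidthGBps = 100.0         // assumed effective memory bandwidth
let breakEven = peakTeraFlops * 1e12 / (bandwidthGBps * 1e9)
print("Need ≈ \(Int(breakEven)) FLOPs per byte moved to be compute-bound")
// ≈ 110 FLOPs/byte: big matmul/conv tiles can get there; small inputs and
// memory-bound layers cannot, so doubling peak FLOPS alone changes little.
```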
 
  • Like
Reactions: tenthousandthings