A17 cores have been designed for desktop use first and foremost, as the gains over N5 products improve the higher up the clock range we go.

Yeah, it really seems like it. As they try to bring all their SoCs in line, the iPhone has to suffer from this focus shift, since a phone's focus should be on battery, the E-cores, and lower clocks. Errr.
 
Thank you for this interesting thread and your work, @leman . As the OP of the “A17 Pro” thread, I’ll link to this one for more technical in-depth information.

No efficiency gains from A16 to A17... but I think it's much more interesting to compare A17 to M2. A16 was a one-off design aimed at optimising performance and efficiency in a mobile phone, and Apple tweaked the structure sizes and used an optimised N4 node to get there. I am getting more and more convinced that the A17 P-cores, by contrast, are developed for the desktop and essentially continue where A16 stopped, but with a wider frequency range in mind.

Interesting that the M2's power consumption is 1.5 W higher than the A17 Pro's (as an M2 iPad Pro owner, I can confirm it is more power hungry than the M1).

However, if the A16 cores and A17 cores behave so similarly (at least the P-cores), do you think Apple might be reusing the same “4nm” process for the A17 Pro CPU, with only the GPU (and maybe the Neural Engine) made on the new 3nm process?

I know this sounds crazy, but given the reported catastrophically low yields of the N3B process, maybe Apple is using it only for some parts of the SoC.
 
Yeah, it really seems like it. As they try to bring all their SoCs in line, the iPhone has to suffer from this focus shift, since a phone's focus should be on battery, the E-cores, and lower clocks. Errr.

I don’t really see the phones suffering from it. At most, you’d get a bit more performance in short bursts, or a slightly warmer phone under load. But averaged CPU power consumption is about the same. Battery life didn’t change either; the few minutes in one direction or the other reported by various reviewers don’t amount to a significant difference.
 
Interesting that the M2's power consumption is 1.5 W higher than the A17 Pro's (as an M2 iPad Pro owner, I can confirm it is more power hungry than the M1).

However, if the A16 cores and A17 cores behave so similarly (at least the P-cores), do you think Apple might be reusing the same “4nm” process for the A17 Pro CPU, with only the GPU (and maybe the Neural Engine) made on the new 3nm process?

I know this sounds crazy, but given the reported catastrophically low yields of the N3B process, maybe Apple is using it only for some parts of the SoC.

I don’t think it works that way. Also, there are significant performance differences on some SPEC subtests showing that A17 indeed has more integer execution units. You won’t see this in my test since I run a dumb loop with a divide instruction. It would be interesting to use a more complex workload, but I don’t know what would be suitable without blowing up the app. Maybe some random number generation or crypto.
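For illustration, here is a minimal sketch of what such an RNG-style integer workload could look like in Swift. This is not the actual test app; the function name, seed, and iteration count are just placeholders, but an xorshift64 loop like this keeps the integer units busy without any division:

```swift
import Foundation

// Hypothetical integer-heavy workload: an xorshift64 PRNG loop.
// The running sum is returned so the optimizer can't remove the work.
func xorshiftWorkload(iterations: Int, seed: UInt64 = 0x9E3779B97F4A7C15) -> UInt64 {
    var state = seed
    var accumulator: UInt64 = 0
    for _ in 0..<iterations {
        // xorshift64 step: three shifts and XORs, pure integer ALU work
        state ^= state << 13
        state ^= state >> 7
        state ^= state << 17
        accumulator &+= state
    }
    return accumulator
}

// Time a fixed iteration count; iterations per second is a rough proxy
// for sustained integer throughput at the current clock.
let iterations = 100_000_000
let start = Date()
let checksum = xorshiftWorkload(iterations: iterations)
let elapsed = Date().timeIntervalSince(start)
print("checksum \(checksum), \(Double(iterations) / elapsed / 1e6) M iters/s")
```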
 
Impressive work. So we are expecting 5 GHz at around 15 W? Wouldn’t there be some diminishing returns in perf-per-watt after a certain point, though? The end of that curve feels a tad optimistic to my non-expert eyes.
 
Who reported that and what’s the source?
MacRumors, during the months prior to the iPhone 15 Pro launch. Low yield rates have been reported every now and then; that’s why TSMC reportedly reached an agreement with Apple to use this novel process for a year at discounted prices.

N3B was kind of a “failed experiment,” not suitable for large-scale manufacturing, while N3E will be the next mainstream node, which scales pretty well.
 
@leman thank you for the very insightful post.

It is much appreciated.

I look forward to it being translated to M3 & M3 Pro chips next year.
 
MacRumors, during the months prior to the iPhone 15 Pro launch. Low yield rates have been reported every now and then; that’s why TSMC reportedly reached an agreement with Apple to use this novel process for a year at discounted prices.

N3B was kind of a “failed experiment,” not suitable for large-scale manufacturing, while N3E will be the next mainstream node, which scales pretty well.
Can you or MR cite a source?
 
I don’t really see the phones suffering from it. At most, you’d get a bit more performance in short bursts, or a slightly warmer phone under load. But averaged CPU power consumption is about the same. Battery life didn’t change either; the few minutes in one direction or the other reported by various reviewers don’t amount to a significant difference.
Hmm, that's interesting. I can attest my 15 Pro has roughly 70% of the S23's battery life.

Granted, the 14 Pro wasn't that good. The 13 Pro (A15) was apparently very good.

Wait, Geekerwan's review and tests clearly point to worse battery life with mixed usage.
 
Hmm, that's interesting. I can attest my 15 Pro has roughly 70% of the S23's battery life.

Well, the S23's battery is also considerably larger and its display brighter, so I am not surprised.

Granted, the 14 Pro wasn't that good. The 13 Pro (A15) was apparently very good.

And yet many reviews say the 14 Pro was as good as or better than the 13 Pro.

Wait, Geekerwan's review and tests clearly point to worse battery life with mixed usage.

And other reviews have other results. There are so many factors to all this... Will the iPhone 15 Pro last less if you run CPU-demanding work on it all the time? You bet. Will it last less while gaming? Doesn't really seem so, as its average power consumption was lower than other phones'. Other applications? Who knows. I don't think it makes sense to cherry-pick 15 minutes in one direction or the other. The differences in battery life might as well come down to the new modem/Wi-Fi module.
 
Well, the S23's battery is also considerably larger and its display brighter, so I am not surprised.

And yet many reviews say the 14 Pro was as good as or better than the 13 Pro.

And other reviews have other results. There are so many factors to all this... Will the iPhone 15 Pro last less if you run CPU-demanding work on it all the time? You bet. Will it last less while gaming? Doesn't really seem so, as its average power consumption was lower than other phones'. Other applications? Who knows. I don't think it makes sense to cherry-pick 15 minutes in one direction or the other. The differences in battery life might as well come down to the new modem/Wi-Fi module.
The S23's display is 1750 nits, so nearly as bright, but yeah... it's sad Apple couldn't fit a bigger battery.

What reviews point to the 14 Pro being better? Genuine question; most of what I've seen pointed to worse battery performance.

I haven't seen any reliable reviewer/site say the 15 Pro has better battery than the 14 either. Yeah, gaming battery life is better from what I'm gathering.
 
Impressive work. So we are expecting 5 GHz at around 15 W? Wouldn’t there be some diminishing returns in perf-per-watt after a certain point, though? The end of that curve feels a tad optimistic to my non-expert eyes.

I don't really expect it to reach 5 GHz. At some point the circuit just stops working. Or maybe one would need to raise the voltage much higher to get it to work, which would change the curve. Again, treat all of this with a big grain of salt. It might all just be a fluke.
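As a bit of context on why the curve has to bend eventually (a textbook back-of-the-envelope model, not something derived from these measurements): the first-order expression for dynamic CMOS power is

$$P_{\mathrm{dyn}} \approx \alpha \, C_{\mathrm{eff}} \, V_{\mathrm{dd}}^{2} \, f$$

so once reaching a higher frequency $f$ requires raising the supply voltage $V_{\mathrm{dd}}$, power grows much faster than linearly with clock (roughly cubically in the region where voltage scales with frequency), which is exactly the kind of steepening a fitted polynomial can only approximate.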
 
What reviews point to the 14 Pro being better? Genuine question; most of what I've seen pointed to worse battery performance.

Well, for example




I haven't seen any reliable reviewer/site say the 15 Pro has better battery than the 14 either.

I haven't seen any reliable reviewer say that it is worse. Tom's Hardware says the 15 Pro is better, but their methodology appears to be crap. And so on. You can find the result you want if you look long enough.

BTW, don't blindly believe Geekerwan either. They seem to have run most of the tests with active cooling on the phone, which will obviously make the A17 look very inefficient since it has a higher frequency ceiling. In practice, however, it doesn't reach the power consumption figures they claim, not for more than a few seconds at least. I never got the 15 W multi-core figure either.
 
Impressive work. So we are expecting 5 GHz at around 15 W? Wouldn’t there be some diminishing returns in perf-per-watt after a certain point, though? The end of that curve feels a tad optimistic to my non-expert eyes.

By the way, to put this into perspective, here is the predicted curve for A15 using the same method as I used for the A17, with M2 data overlaid on top of it. Observe how the A15 data offers a reasonable prediction for M2, and how much faster this curve climbs than the A17 one (shown in grey for comparison) once we get beyond 3.5 GHz.

Of course, none of this is exact science, and the curves I have computed are not guaranteed to be the best or even a good fit. Maybe the actual curve for A15 is less steep, and the one for A17 steeper. These are just impressionistic comparisons, and we will only know for sure how accurate this is when the M3 series is out.
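For anyone who wants to play with the numbers themselves, here is a rough sketch of this kind of fitting in Swift. It is a reconstruction of the general approach, not the exact script behind these plots, and the sample points below are made-up placeholders rather than real A17 measurements: a least-squares fit of a fourth-degree polynomial to (frequency, watts) pairs, then extrapolation to higher clocks.

```swift
import Foundation

// Least-squares polynomial fit via the normal equations A^T A c = A^T y,
// solved with Gaussian elimination (fine for a degree-4 fit on a handful of points).
func polyfit(_ x: [Double], _ y: [Double], degree: Int) -> [Double] {
    let n = degree + 1
    var ata = [[Double]](repeating: [Double](repeating: 0, count: n), count: n)
    var aty = [Double](repeating: 0, count: n)
    for (xi, yi) in zip(x, y) {
        var powers = [Double](repeating: 1, count: 2 * n - 1)
        for k in 1..<(2 * n - 1) { powers[k] = powers[k - 1] * xi }
        for i in 0..<n {
            aty[i] += powers[i] * yi
            for j in 0..<n { ata[i][j] += powers[i + j] }
        }
    }
    var m = ata
    var b = aty
    for col in 0..<n {
        // Partial pivoting keeps the elimination numerically sane.
        let pivot = (col..<n).max { abs(m[$0][col]) < abs(m[$1][col]) }!
        m.swapAt(col, pivot); b.swapAt(col, pivot)
        for row in (col + 1)..<n {
            let factor = m[row][col] / m[col][col]
            for k in col..<n { m[row][k] -= factor * m[col][k] }
            b[row] -= factor * b[col]
        }
    }
    var coeffs = [Double](repeating: 0, count: n)
    for row in stride(from: n - 1, through: 0, by: -1) {
        var sum = b[row]
        for k in (row + 1)..<n { sum -= m[row][k] * coeffs[k] }
        coeffs[row] = sum / m[row][row]
    }
    return coeffs // coeffs[k] multiplies f^k
}

// Placeholder samples (GHz, watts): NOT real measurements, just a plausible shape to fit.
let freq: [Double]  = [1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 3.78]
let watts: [Double] = [0.6, 1.0, 1.6, 2.4, 3.5, 5.2, 6.5]
let c = polyfit(freq, watts, degree: 4)

// Evaluate the fitted polynomial beyond the measured range.
func predictedWatts(at ghz: Double) -> Double {
    c.enumerated().reduce(0.0) { $0 + $1.element * pow(ghz, Double($1.offset)) }
}
print("Predicted power at 4.5 GHz: \(predictedWatts(at: 4.5)) W")
```

Extrapolating a polynomial beyond its data is exactly the fragile part, of course, which is why the curves above should only be read as rough trends.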


1695848283746.png
 

Awesome work!

So is it accurate to say that Apple’s redesign work on the CPU cores was to enable them to reach higher clock speeds?
 
That's my current hypothesis, yes. Whether it's accurate, no idea. I wouldn't bet my favourite pair of jeans on it, for example :)
I probably won’t go repeating it as fact, but it makes the most sense to me. Do you think it could also be related to their stated GPU improvements?

I.e., the old core design was bottlenecking the GPU cores.
 
Do you think it could also be related to their stated GPU improvements?

I.e., the old core design was bottlenecking the GPU cores.

No idea, I haven't done any investigation into the GPU yet (I kind of hope that more capable people than me will do it, because I am already spending too much time on this stuff as is). The weak points of previous Apple GPUs are pretty much known (small L2 cache, smaller register file than some competitors, somewhat lacking atomics support, lower frequencies and core counts). We don't know yet what exactly the new GPU does to address these. I was hoping that Apple would release the Metal tech notes for A17 by now, but it doesn't seem like they are going to. Nothing new in the Metal Shading Language either. Maybe we'll have to wait until M3...
 
So, Apple is quietly switching silicon design focus from mobile to desktop now? That’s a very interesting trend indeed, and as long as they can maintain the nominal battery life on the iPhone, better performance is always welcome.
I don’t think it’s switching focus from one to the other. It’s Apple expanding to the desktop instead of just focusing on mobile. Apple has more room for GPU modifications.
 
And here is a fitted prediction curve for A17 (using a fourth-degree polynomial; going higher doesn't improve the fit):

View attachment 2282345

Take this with a grain of salt obviously, but given how good the fit is, I'd say we have even more evidence that A17 is designed with desktop use in mind. I think 4.5 GHz at 10 watts should at least be achievable, and that would give a desktop M3 a GB6 single-core score of > 3600.

P.S. Looking at this graph, I can't help wondering whether there is some intent behind some crucial points. Is it a coincidence that the A17 Pro curve tops out exactly at 5 GHz, or that the curve hits 4.5 GHz exactly at 10 watts and 5 GHz at approximately 15 watts? These are all important psychological points.
Maybe push the Ultra to 5 GHz and cap the Pro/Max at 4.5 GHz.
 
Can you or MR cite a source?
When we talk about rumors or leaks, it’s kind of difficult to point to an official source, because obviously neither Apple nor TSMC is going to make those internal details official. You’d have to trust the leakers who periodically report the details that blogs like MacRumors or 9to5Mac then use to write their articles.
 
I don’t think it’s switching focus from one to the other. It’s Apple expanding to the desktop instead of just focusing on mobile. Apple has more room for GPU modifications.

Yep, it makes more sense to scale your desktop tech down to mobile needs (assuming you can deliver the needed power efficiency, of course) than to scale your mobile tech up to desktop needs (where you quickly run out of steam, as Apple did with A14...).
 
By the way, to put this into perspective, here is the predicted curve for A15 using the same method as I used for the A17, with M2 data overlaid on top of it. Observe how the A15 data offers a reasonable prediction for M2, and how much faster this curve climbs than the A17 one (shown in grey for comparison) once we get beyond 3.5 GHz.

Of course, none of this is exact science, and the curves I have computed are not guaranteed to be the best or even a good fit. Maybe the actual curve for A15 is less steep, and the one for A17 steeper. These are just impressionistic comparisons, and we will only know for sure how accurate this is when the M3 series is out.


View attachment 2282500

This is super cool. But what I mean is that these projections could theoretically go on forever, while in reality we know it does not scale indefinitely.
If we control for TDP by keeping it constant in our comparisons, what do you reckon an M3 or M3 Max CPU would top out at in terms of frequency? If I read your graph properly, at 10 W per thread we should expect at least 4.5 GHz?
 