
Adult80HD

macrumors 6502a
Nov 19, 2019
701
837
Dear Apple,

Please release something larger than the 24" iMac, with an M3, by Q1 2024.

I just want to retire this dozen year old iMac 27".
They seem to have been pretty clear that's not going to happen. The new 27" "iMac" is the Studio or Mini paired with a Studio Display. That's really the smarter setup anyway: you can probably get a solid 10+ years out of a Studio Display, in which time you'll want to upgrade your Mac at least once. Well, most people will; you seem to be an exception.
 

name99

macrumors 68020
Jun 21, 2004
2,410
2,317
Precisely. The worries appear to be based on arguments like "but they used to boast about 20% better performance in the past!", and while I sympathise with those posters, these are not good arguments. Times change, as do design goals.

For example, Apple has very little incentive to make the new phone CPU 20% faster (they are already years ahead of the competition), but they do have an incentive to make Mac chips faster. Maybe this new CPU is designed with higher power draw and higher clocks in mind (unlike the A14/A15, which pretty much peaked at about 5W per core). Then Apple would need to clock it more conservatively in the iPhone so that they can meet their power targets.
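For what it's worth, here's a rough back-of-envelope sketch of why higher clocks are so much harder to afford in a phone, assuming the textbook dynamic-power relation P ∝ C·V²·f with voltage scaling roughly linearly with frequency. The ~5W/core figure is the one mentioned above; everything else is illustrative, not Apple data.

```python
# Illustrative only: dynamic power under the common approximation P ~ C * V^2 * f,
# with the crude assumption that voltage scales roughly linearly with frequency,
# so power grows roughly with the cube of the clock.

def relative_power(freq_ratio: float) -> float:
    """Power relative to baseline when the clock (and, roughly, voltage) scales by freq_ratio."""
    voltage_ratio = freq_ratio               # assumption: V ~ f
    return voltage_ratio ** 2 * freq_ratio   # P ~ V^2 * f  ->  ~f^3

base_power_w = 5.0  # ~5 W per core peak, the figure quoted above for A14/A15-class cores

for ratio in (1.0, 1.1, 1.2):
    watts = base_power_w * relative_power(ratio)
    print(f"clock at {ratio:.0%} of baseline -> ~{watts:.1f} W per core")
```

Even a 10-20% clock bump pushes per-core power well past what a phone can comfortably sustain, while a Mac with active cooling can absorb it.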

Of course, what I wrote above is just speculation, and may well be baseless, but I hope it illustrates that there can be plenty of explanations for only modest performance improvements in the A17, not all of them equally gloomy. We will know more when we have more data about the chip (performance, clocks, features, etc.). Then we will at least have something to go on when speculating about Apple's inevitable impending doom.

Another way to look at it is that scoring the annual performance increase by percentage is misleading.
If you look at the iPhone annual increase, for example here:
[attached chart of iPhone annual performance increases]


Or here:
[second attached chart]


The striking fact (IMHO) is that single core performance is on a linear, rather than exponential, curve.
This COULD reflect something about engineering, but it could also reflect that Apple have deliberately chosen this pace according to some internal benchmark set ("we will strive for an increase of 100 Applemarks per year") and that's what they implement, scaling various structures as needed to best achieve it.

Obviously an increase of ~250 per year seems more impressive by percentage if you are at the 1700 level than if you are at the 2700 level; but if we assume this model, then a 10% increase (i.e. about 250 points) is pretty much exactly what we should have expected. (Along with maybe an 80–100 point increase in MP.)
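A quick sketch of that arithmetic, using the ~1700/2700 levels and the +250-points/year step from this post (the loop is purely illustrative, not measured data):

```python
# A fixed ~250-point annual single-core increase looks smaller and smaller in
# percentage terms as the absolute score grows; none of these are real benchmark results.

score = 1700.0       # roughly where the single-core curve sits a few generations back
annual_step = 250.0  # the ~250-points-per-year increment suggested above

for year in range(5):
    pct_gain = annual_step / score * 100
    print(f"year {year}: score ~{score:.0f}, next step is ~+{pct_gain:.0f}%")
    score += annual_step
```

By the time the score reaches ~2700, the same fixed step is only about a 9-10% generational gain.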

Now are they scaling M cores in the same way; or are they more willing to add support hardware+energy (eg larger branch predictors) for higher performance? I don't think we yet have enough data to say.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,674
The striking fact (IMHO) is that single core performance is on a linear, rather than exponential, curve.

Very nice catch! Quick note on the first graph: the leaked A17 numbers there are likely fake (wrong chip name, and the frequency looks too high IMO). But 10% over the A16 is again precisely 250 points, continuing a striking trend since at least the A13.
 

name99

macrumors 68020
Jun 21, 2004
2,410
2,317
Not true. Apple shows the improved CPU and battery life on the website.

[attached screenshots of Apple's website]


You should be more respectful of elected politicians. Just because they don't represent you doesn't mean they don't represent others.
Look at what I SAID about battery life!
"
Battery life is now "problematically" correlated with better CPU/GPU efficiency. Look at what Apple tells you on the Tech Specs web page:
Video playback (in both versions) depends on screen, media block, and network – NOT on CPU or GPU. Same for Audio playback.
They matter (insofar as they are useful info to people, eg going on a flight) but they are NOT for example
- idle time (how long does your phone last with minimal use?)
- photo time (on vacation, taking photos every few minutes)
- game time (heavy CPU and GPU usage)
- web browsing time (mostly heavy CPU usage)
- substantial dictation time (lots of NPU [and maybe CPU and GPU?] usage)
"

You don't disprove my point by giving EXACTLY the data that I said was not relevant!!!

And, come on, you want me to respect the same politicians who gave us stupid GDPR notices on every web page and insist on USB-C? What exactly do you imagine the legal update process will be when USB-C starts showing its age (eg to be replaced by an optical connector for data)?...
Maybe the EU should concentrate on getting their story straight for a single power plug connector across the whole continent before bothering with less important issues???
 

name99

macrumors 68020
Jun 21, 2004
2,410
2,317
Well, Apple lost a bunch of top chip designers a few years ago. That could just be starting to show up now. Also, WFH probably lowered the chip team's productivity, which also wouldn't show up right away. And process improvements on the manufacturing side have massively slowed down; 3nm isn't a huge improvement.

In retrospect, Apple really chose the absolute best time to ditch Intel. The gap between them and Intel in late 2020 in both design and process (Firestorm vs Tiger Lake, TSMC 5nm vs Intel 10nm (now Intel 7)) is probably the largest it's ever going to be. IPC improvements on Apple's side have massively slowed down since then, as have TSMC's process advancements.

Apple has also gained a bunch of top chip designers over the same period.
Those that went to Nuvia (the most publicized group) don't seem to be doing anything spectacular, to judge from the multiple reports of problems at Qualcomm.

I suspect that the engineering has (as always) become more difficult. You can make wild leaps into the unknown (as Nuvia apparently tried to do; and as Intel seems to occasionally do) but what's your plan B if the leap fails?
Apple seem to be mentally more aligned with TSMC – slow steady cautious progress every year, and if that doesn't get the excited press of wild leaps, well, WHO CARES? Press is not actually reality...

Apple could, for example, probably have added a lot more functionality to the M1 Ultra to make it scale better. But the result might have been an M1 Ultra that just didn't work... Instead they shipped an M1 Ultra that was safe; and learned from all the parts that did work so that they could correct the parts that didn't work, with the M2.
There's always next year to try the ideas that are too risky this year --- EXCEPT if you crash and burn the company.
 

MRMSFC

macrumors 6502
Jul 6, 2023
371
381
They specifically called out on a slide that there are changes to branch prediction (and we have a pretty good idea what those are) and increases to decode and execution width.
Point is not that these are the only changes, but that these are changes that you don't get from "just a die shrink".


Battery life is now "problematically" correlated with better CPU/GPU efficiency. Look at what Apple tells you on the Tech Specs web page:
Video playback (in both versions) depends on screen, media block, and network – NOT on CPU or GPU. Same for Audio playback.
They matter (insofar as they are useful info to people, eg going on a flight) but they are NOT for example
- idle time (how long does your phone last with minimal use?)
- photo time (on vacation, taking photos every few minutes)
- game time (heavy CPU and GPU usage)
- web browsing time (mostly heavy CPU usage)
- substantial dictation time (lots of NPU [and maybe CPU and GPU?] usage)

etc etc
I suspect Apple has stopped talking about battery lifetime not because it's not important but because there's no way to do so that's a win for them. They can talk up how one use case is much improved (say dictation) and there's a whole crowd on the internet that will immediately say "Apple lies about battery life" because some very different use case (playing video?) is unchanged...
I'm not sure I'm on board with this. Apple's (honestly, every company's) marketing has pretty much always used "best case scenario" numbers.

Sure, they might be "underpromising and overdelivering", but I think it's far more likely that the 10% improvement is just that.

Besides, if looking around the forums is any indication, single-thread performance isn't the biggest gripe; GPU performance is. And if the keynote is anything to go by, they've made big strides there.

I think, besides a handful of people who expect the best or nothing, that the performance of the M3 will be good.
 

sack_peak

Suspended
Sep 3, 2023
1,020
959
They seem to have been pretty clear that's not going to happen. The new 27" "iMac" is the Studio or Mini paired with a Studio Display. That's really the smarter setup anyway: you can probably get a solid 10+ years out of a Studio Display, in which time you'll want to upgrade your Mac at least once. Well, most people will; you seem to be an exception.
Zero interest in replacing the Mac any sooner than 10 years in, when its final macOS security update is released.

If the iPhone 15 ran macOS, there would be more incentive to replace it every 2 years.
 

Jamie I

macrumors newbie
Jan 16, 2023
19
67
Apparently this is an A17 Pro leak on Geekbench 6 but I can’t confirm if it is real

 

Attachments

  • IMG_1764.jpeg (A17 Pro Geekbench 6 screenshot)

iPadified

macrumors 68020
Apr 25, 2017
2,014
2,257
There is a collective agreement that Apple SoCs are leading in CPU and lagging in GPU. Since the iPhone and other Ax devices do not appear to be hampered by the CPU, it is only logical that Apple increases GPU/NPU performance relatively more than the CPU. Any SoC is a compromise, and Apple is simply balancing the A17 Pro towards photography and gaming. On the whole, a wise decision given it sits in a phone (and later in iPads).

Maybe wait to complain until the M3 hits the streets, and before you complain, ask yourself whether current M2 devices are CPU or GPU constrained.
 

TigeRick

macrumors regular
Oct 20, 2012
144
153
Malaysia
First leaks of GB 6.2 on the A17 Pro; even the MT gain is within the clock-speed improvement... my hopes for a bigger L2 cache are pretty much gone :mad:

[attachment: A17Pro.png (Geekbench 6.2 screenshot)]
 

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
I'm not a hardware designer. I presume that hardware designers need to work together more and have better coordination and have to use specialized testing tools that are harder to get WFH.

I'm a software engineer. The software quality and communication definitely suffered during the WFH era for me and my company.
I'm a logic designer who has a reasonable (for a HW designer) understanding of software. I would say the opposite: logic designers tend to need less day-to-day coordination. "Waterfall" methodology is the norm, not an archaic exception, so there's much less refactoring. (If there is a lot of it, someone in the architecture team for that chip screwed up.)

Re: tools, we tend not to run them on personal machines. Before COVID-19, I was 100% in the office, but I was already VNC'ing from a company-provided laptop to a big rackmount Linux server where all my design files and tools live. When WFH started, the only change was needing to make a VPN connection before starting the VNC client.

I'm not saying that everything's perfect, but you can get a lot done. If your home environment doesn't have distractions it can even be a productivity bonus. Hardware designers are often trying to concentrate on something tricky, so a quiet no-interruptions environment is great.

(I've heard that Apple's chip design team worked out a deal to mostly not be located in the big spaceship building, despite what the recent keynote films imply. The spaceship is a trendy open office plan hellhole. I think that's a bad idea for everyone, but it's even worse for hardware engineers, who skew towards the personality type that's easily bothered by constant background noise and visual distractions.)
 

bobmans

macrumors 6502a
Feb 7, 2020
598
1,751
So the improvements are just from clock speed?
Yes, exactly what I was thinking.

Also check out this, A17 Pro cores are commented as H15 Coll with H15 being the A16 cores. They’d be commented as H16 if they were any different.
[attachment: IMG_5259.jpeg (screenshot of the core identifiers)]



The A17 Pro CPU really seems to be nothing more than a clock speed boost that eats away the efficiency gains of the die shrink to 3nm. Explains why battery life stayed the same as well.
 

AgentMcGeek

macrumors 6502
Jan 18, 2016
374
305
London, UK
Yes, exactly what I was thinking.

Also check out this, A17 Pro cores are commented as H15 Coll with H15 being the A16 cores. They’d be commented as H16 if they were any different.

The A17 Pro CPU really seems to be nothing more than a clock speed boost that eats away the efficiency gains of the die shrink to 3nm. Explains why battery life stayed the same as well.

Let's brainstorm alternative explanations. It might also be that the power savings from the chip are eaten away by the brighter screen or a potentially smaller battery. What else?

Hardware
- The additional GPU core might be detrimental to battery life. That is the case on the M1 Max vs the M1 and M1 Pro.
- I don't expect I/O, the AV1 decoder, or higher RAM to be significant to battery life, but you never know.

Software
- More intensive use of the NPU for ML tasks, for instance through on-device Siri request processing.

Other
- Non-hardware or software explanation: Apple is conservative on battery estimations.

I guess we'll see with independent testing.
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,521
19,674
Yes, exactly what I was thinking.

Also check out this, A17 Pro cores are commented as H15 Coll with H15 being the A16 cores. They’d be commented as H16 if they were any different.
[attached screenshot]


The A17 Pro CPU really seems to be nothing more than a clock speed boost that eats away the efficiency gains of the die shrink to 3nm. Explains why battery life stayed the same as well.

Yeah, looks weird. I’m surprised, to say the least.

Where are those identifiers defined?
 

Natrium

macrumors regular
Aug 7, 2021
125
246
Honestly it's a bit disappointing. The 20% faster GPU coincides with the extra GPU core (6 vs 5). The 10% faster CPU coincides with the 10% higher clock speed. The upside is that battery life hasn't decreased, but we don't know yet whether that is achieved (partly) thanks to a larger battery.
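A trivial sanity check of those two coincidences (the core counts and the ~10% clock figure are the ones discussed in this thread; the perfect-scaling assumptions are mine):

```python
# Does the claimed uplift match what you'd expect from core count / clock alone?
# Illustrative arithmetic only.

gpu_cores_old, gpu_cores_new = 5, 6
cpu_clock_uplift = 1.10  # ~10% higher clock, per the leaks

gpu_uplift_from_cores = gpu_cores_new / gpu_cores_old  # 1.20, assuming perfect per-core scaling
cpu_uplift_from_clock = cpu_clock_uplift               # 1.10, assuming IPC is unchanged

print(f"GPU: +{(gpu_uplift_from_cores - 1) * 100:.0f}% expected from the extra core alone")
print(f"CPU: +{(cpu_uplift_from_clock - 1) * 100:.0f}% expected from the clock bump alone")
```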
 

leman

macrumors Core
Oct 14, 2008
19,521
19,674
The initial GB scores show that some tests only scale with the frequency, while others show much more substantial improvements. In a few tests we see an improvement of almost 30%! So if these are legit tests, we are definitely observing a new microarchitecture, but it's not clear to me what exactly has been improved.


P.S. One thing, though: all the branchy tests (including clang) show improvements larger than the clock increase. The new branch predictor probably accounts for at least 2-3% of improved performance, which is a fairly good result for the industry, especially given how good Apple's branch prediction already was. But I am not sure whether I see new execution units. Maybe there is an additional floating-point unit?
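For anyone who wants to poke at the per-subtest numbers themselves, a minimal sketch of how you might divide out the clock uplift to estimate the residual IPC gain. The ~10% clock ratio and both example score pairs below are placeholders, not real A16/A17 results.

```python
# Separate clock scaling from IPC gains per subtest: divide the total score ratio
# by the clock ratio; whatever is left is (roughly) the microarchitectural gain.

clock_ratio = 1.10  # assumed ~10% frequency uplift (placeholder)

# hypothetical (old_score, new_score) pairs -- NOT real Geekbench results
example_subtests = {
    "subtest A (frequency-bound?)": (1000, 1100),
    "subtest B (branchy?)": (1000, 1290),
}

for name, (old, new) in example_subtests.items():
    total_gain = new / old
    ipc_gain = total_gain / clock_ratio
    print(f"{name}: total x{total_gain:.2f}, implied IPC x{ipc_gain:.2f}")
```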
 

Mac_fan75

macrumors member
Jun 1, 2023
66
95
Looks a bit disappointing on the CPU side, but perhaps this thing can be clocked way higher in a Mac. They probably focused on the GPU this year, so next year brings the big CPU bump. And the rumour was that this GPU was supposed to be in the M2 but was withdrawn because of high power usage, so it's now clocked very conservatively and will hopefully go way higher (at the cost of power) in the Mac.
 

T'hain Esh Kelch

macrumors 603
Aug 5, 2001
6,474
7,408
Denmark
So the A17 Pro SC and MC scores are up 10% due to the increase in frequency, the GPU is up 20% due to one more GPU core (6 over 5), and the battery size is increased yet runtime is the same. What exactly is the jump to N3 doing here? It is likely way more expensive for Apple, and thus a step back.
 

sack_peak

Suspended
Sep 3, 2023
1,020
959
I cannot wait for Q1 2024 to come so we can get M3 refresh of

- MBA 13"/15"
- MBP 13"
- iMac 24"/32"
- iPad Pro
- Vision Pro
- Mac mini

I'd be jumping from a 22nm to 3nm chip.
 