
TigeRick

macrumors regular
Original poster
Oct 20, 2012
144
153
Malaysia
Question: Have you wondered why Apple is rumored to launch the M4 series by the end of the year? This round, Apple will launch M4 from top to bottom, including the Mac mini, within a single quarter. Why?

Mark Gurman can't tell you, but I can. The key ingredient missing is LPDDR6. :cool:

Apple will reset the whole Mac lineup with LPDDR6 to stay competitive. Next year, we should have at least four more players entering the ARM PC market with Cortex-X5, which has higher IPC than M3.


Before I explain why the upcoming M4 series will support the LPDDR6 standard and raise the maximum amount of RAM to 512GB, let me show you the current LPDDR5 RAM configuration so that you can see what is coming later this year:


| Memory Density 16Gb (2GB) | M3 | M3 Pro | M3 Max | M3 Max |
| LPDDR5 Memory Bus | 128-bit | 192-bit | 384-bit | 512-bit |
| Memory BW | 100 GB/s | 150 GB/s | 308 GB/s | 410 GB/s |
| Memory Chips | 2 pcs | 3 pcs | 3 pcs | 4 pcs |
| x2 = 4GB (S: 32-Gb) | 8 GB | | | |
| x3 = 6GB (M: 48-Gb) | | 18 GB | | |
| x4 = 8GB (S: 64-Gb) | 16 GB | | | |
| x6 = 12GB (S: 96-Gb) | 24 GB | 36 GB | 36 GB | 48 GB |
| x8 = 16GB (S: 128-Gb) | | | | 64 GB |
| x16 = 32GB (256-Gb) | | | 96 GB | 128 GB |

  • Apple has reduced the memory bus of the M3 Pro from 256-bit to 192-bit. You may ask why? Cost cutting? Nah, Apple actually increased the RAM from 16GB to 18GB by ordering 48-Gb memory chips from Micron.
  • Apple could have used two 48-Gb chips to make 12GB standard on the M3, but it didn't. Why? One reason is that Apple is waiting for LPDDR6; another is that the 12-16-24 progression of LPDDR5 capacities is not linear.
  • The M3 Max is a different SoC, with a 128-bit memory bus to each memory chip compared with 64-bit on the M3 and M3 Pro. That's why Apple is able to support a 512-bit memory bus with only 4 memory chips. (A quick sketch of the capacity arithmetic follows these notes.)
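To make the table above concrete, here is a minimal sketch (in Python, my own illustration rather than anything from Apple) of how the totals fall out of packages × stacked dies × die capacity:

```python
# Minimal sketch: total unified memory = memory packages x dies stacked per package x GB per die.
# The M3 generation uses 16Gb (2GB) LPDDR5 dies, per the table above.

def total_ram_gb(packages: int, dies_per_package: int, die_gb: int = 2) -> int:
    """Total RAM in GB for a given package count and stack height."""
    return packages * dies_per_package * die_gb

print(total_ram_gb(2, 4))   # M3, two x4 stacks        -> 16 GB
print(total_ram_gb(3, 3))   # M3 Pro, three x3 stacks  -> 18 GB
print(total_ram_gb(4, 8))   # M3 Max, four x8 stacks   -> 64 GB
print(total_ram_gb(4, 16))  # M3 Max, four x16 stacks  -> 128 GB
```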

Apple's LPDDR6 Solution

[Image: Apple-M3-chip-series-unified-memory-architecture-M3-231030]


Shown above is the die shot of the M3 SoC from Apple. The M3 is connected to 128-bit LPDDR5-6400, giving roughly 102GB/s of bandwidth.
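For anyone who wants to check the bandwidth figures quoted throughout this post, this is the standard arithmetic (bus width × transfer rate), sketched in Python:

```python
# Peak bandwidth (GB/s) = bus width in bits / 8 (bits per byte) x transfer rate in MT/s / 1000.

def bandwidth_gbs(bus_bits: int, mt_per_s: int) -> float:
    return bus_bits / 8 * mt_per_s / 1000

print(bandwidth_gbs(128, 6400))  # M3:             102.4 GB/s
print(bandwidth_gbs(192, 6400))  # M3 Pro:         153.6 GB/s
print(bandwidth_gbs(384, 6400))  # M3 Max (384b):  307.2 GB/s
print(bandwidth_gbs(512, 6400))  # M3 Max (512b):  409.6 GB/s
```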

LPDDR6 Standard: 24-bit channel width

Apple is going to implement something that looks weird yet makes sense if you understand the logic behind it. For the upcoming M4 series, Apple is going to introduce a 96-bit LPDDR6 memory bus made up of 24-bit channels, as shown below:

| Memory Density 32Gb (4GB) | M4+ ? | M4 Pro | M4 Max | M4 Max |
| LPDDR6 Memory Bus | 96-bit | 192-bit | 288-bit | 384-bit |
| Memory BW | ? GB/s | ? GB/s | ? GB/s | ? GB/s |
| Memory Chips | 1 pc | 2 pcs | 3 pcs | 4 pcs |
| x3 = 12GB | 12 GB | 24 GB | 36 GB | 48 GB |
| x5 = 20GB | 20 GB | 40 GB | | 80 GB |
| x7 = 28GB | 28 GB | 56 GB | | |
| x8 = 32GB | | | 96 GB | 128 GB |
| x12 = 48GB | | | | 192 GB |
| x16 = 64GB | | | | 256 GB |

First of all, you have to understand that Samsung, the biggest memory maker, is making a 24Gb (3GB) die as its standard LPDDR6 die. Therefore, every LPDDR6 memory chip will contain multiple stacked memory dies; the most common will be a x4 stack, equal to 12GB.


Update: Mark Gurman has mentioned that the M4 series is going to support up to 512GB, so it seems Apple is going even bigger with LPDDR6. They would be asking Samsung to manufacture 32Gb (4GB) dies. I have updated the table to show the new memory sizes.
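As a sanity check on the updated table, here is a minimal sketch of how the capacities and the unknown "? GB/s" cells could work out. The LPDDR6 data rate here is purely my assumption (the spec was not finalized when this was written), so treat the bandwidth numbers as illustrative only:

```python
# Capacity: packages x dies per stack x GB per die (assuming the 32Gb = 4GB die from the update above).
def total_ram_gb(packages: int, dies_per_stack: int, die_gb: int = 4) -> int:
    return packages * dies_per_stack * die_gb

print(total_ram_gb(1, 3))    # M4+ ?,  one x3 stack    -> 12 GB
print(total_ram_gb(4, 16))   # M4 Max, four x16 stacks -> 256 GB (an Ultra would double this to 512 GB)

# Bandwidth: same formula as before, but the LPDDR6 data rate is an assumption, not a spec figure.
ASSUMED_LPDDR6_MTS = 10_667  # my guess at a launch speed; could easily be different

def bandwidth_gbs(bus_bits: int, mt_per_s: int = ASSUMED_LPDDR6_MTS) -> float:
    return bus_bits / 8 * mt_per_s / 1000

for label, bus in [("M4+ ?", 96), ("M4 Pro", 192), ("M4 Max (288-bit)", 288), ("M4 Max (384-bit)", 384)]:
    print(f"{label}: ~{bandwidth_gbs(bus):.0f} GB/s")  # roughly 128 / 256 / 384 / 512 GB/s under this assumption
```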





The Advantages of LPDDR6

[Image: DDR6.png]

[Image: SamsungCost.jpg]


The roadmap above states that LPDDR6 will be available in 2026, not by the end of 2024. Then why do I think Apple will bundle LPDDR6 by the end of 2024? Because the standard will be certified by Q3 this year, and Samsung has been manufacturing 24Gb, 6.4 Gbps memory dies using EUV since 2023. Samsung is also making 32Gb memory dies in 2024.

Unless Mark Gurman's leak about the 512GB memory size is wrong and the Mac mini is not updated within the same timeframe, Apple is gearing up for a super cycle of the Mac lineup.

Do you think Apple, among the first to ship a MacBook Pro with 512-bit LPDDR5, doesn't know the advantages of LPDDR6?

If my speculated RAM configuration is correct, Apple will be working closely with Samsung to build 32Gb memory dies for a 96-bit memory bus.

Mac volume is relatively small compared to iPhone sales volume. That's why I suspect Apple might reserve LPDDR6 for the Pro lineup, most likely using the 24Gb memory die that has been in production since 2023.

PS: There is already news that Qualcomm's upcoming 8G4 will use LPDDR6 by the end of this year. Do you think Apple will miss out on the biggest change in RAM in years?


| Speculated RAM | Memory Chips | Memory Size | Memory Die |
| A17 Pro | 1 | 8 GB | 4 (2GB x 4) |
| A18 Pro | 1 | 8 GB | 4 (2GB x 4) |
| + % | Same | Same | Same |
| M3 | 2 | 8 GB | 4 (2GB x 4) |
| M4+ ? | 1 | 12 GB | 3 (4GB x 3) |
| + % | - 50% | + 50% | - 25% |
| M3 Pro | 3 | 18 GB | 9 (2GB x 9) |
| M4 Pro | 2 | 24 GB | 6 (4GB x 6) |
| + % | - 33% | + 33% | - 33% |
| M3 Max | 4 | 48 GB | 24 (2GB x 24) |
| M4 Max | 4 | 48 GB | 12 (4GB x 12) |
| + % | Same | Same | - 50% |
 

TigeRick

macrumors regular
Original poster
Oct 20, 2012
144
153
Malaysia
| M-Max Series | Date | Max Memory | M-Ultra's Max Memory |
| M1 Max | Q4 2021 | 64GB 512-bit LPDDR5 | 128GB 1024-bit LPDDR5 |
| M2 Max | Q1 2023 | 96GB 512-bit LPDDR5 | 192GB 1024-bit LPDDR5 |
| M3 Max | Q4 2023 | 128GB 512-bit LPDDR5 | NA |
| M4 Max | Q4 2024 ? | 256GB 384-bit LPDDR6 | 512GB 768-bit LPDDR6 |
| ? | 2029 | 384GB ? | 768GB ? |



| Apple Silicon | Memory Bus | Memory Size | Memory BW | Mac | iPad |
| M1 | 128-bit LPDDR4X | 8 GB | 68 GB/s | Q4 2020 | Q2 2021 |
| M1 Pro | 256-bit LPDDR5 | | 200 GB/s | | |
| M2 | 128-bit LPDDR5 | 8 GB | 100 GB/s | Q2 2022 | Q4 2022 |
| M2 Pro | 256-bit LPDDR5 | | 200 GB/s | | |
| M3 | 128-bit LPDDR5 | 8 GB | 100 GB/s | Q4 2023 | NA |
| M3 Pro | 192-bit LPDDR5 | | 150 GB/s | | |
| M4 | 128-bit LPDDR5 | 8 GB | 120 GB/s | NA ? | Q2 2024 |
| M4+ ? | 96-bit LPDDR6 | 12 GB | | | |
| M4 Pro | 192-bit LPDDR6 | | | | |



| A&M series | Date | Node | Transistors | SLC | Memory | CPU | GPU |
| A16 | Q3 2022 | N4P | 16 B | 24 MB | 6GB 64-bit LPDDR5 | 2P + 4E | 5 |
| A18 | Q3 2024 | N3E | | | 8GB 64-bit LPDDR5 | 2P + 4E | 5 RT |
| A17 Pro | Q3 2023 | N3B | 19 B | 24 MB | 8GB 64-bit LPDDR5 | 2P + 4E | 6 RT |
| A18 Pro | Q3 2024 | N3E | | | 8GB 64-bit LPDDR5 | 2P + 4E | 6 RT |
| M3 | Q4 2023 | N3B | 25 B | 8 MB | 8GB 128-bit LPDDR5 | 4P + 4E | 10 RT |
| M4 | Q2 2024 | N3E | 28 B | | 8GB 128-bit LPDDR5 | 4P + 6E | 10 RT |
| M4+ ? | Q4 2024 | N3B | | 12 MB ? | 12GB 96-bit LPDDR6 | 4P + 8E ? | ? |
| M3 Pro | Q4 2023 | N3B | 37 B | 12 MB | 18GB 192-bit LPDDR5 | 6P + 6E | 18 RT |
| M4 Pro | Q4 2024 | N3B | | 24 MB ? | 24GB 192-bit LPDDR6 | 8P + 8E ? | ? |
| M3 Max | Q4 2023 | N3B | 92 B | 48 MB | 36GB 384-bit LPDDR5 | 10P + 4E | 30 RT |
| M4 Max | Q4 2024 | N3B | | 36 MB ? | 36GB 288-bit LPDDR6 | ? | ? |
| M3 Max | Q4 2023 | N3B | 92 B | 64 MB | 48GB 512-bit LPDDR5 | 12P + 4E | 40 RT |
| M4 Max | Q4 2024 | N3B | | 48 MB ? | 48GB 384-bit LPDDR6 | ? | ? |



| | A16 | A17 Pro | A18 | A18 Pro | M4 | M4+ ? |
| Node | N4P | N3B | N3E | N3E | N3E | N3B |
| Transistors | 16 b | 19 b | ? | ? | 28 b | ? |
| CPU | 2+4 | 2+4 | 2+4 | 2+4 | 4+6 | 4+8 ? |
| Features | Base | 10% Faster CPU | ML Accelerators | ML Accelerators | ML Accelerators | |
| RAM | 6GB LPDDR5 | 8GB LPDDR5 | 8GB LPDDR5 | 8GB LPDDR5 | 8GB LPDDR5 | 12GB LPDDR6 |
| Memory Bus | 64-bit | 64-bit | 64-bit | 64-bit | 128-bit | 96-bit |
| Memory BW | 50 GB/s | 50 GB/s | 60 GB/s | 60 GB/s | 120 GB/s | ? |
| GPU Cores | 5 | 6 RT | 5 RT | 6 RT | 10 RT | ? |
| GPU Speed | 1340 MHz | 1400 MHz | 1490 MHz | 1490 MHz | | |
| FP32 (TFLOPS) | 1.79 TF | 2.15 TF | 1.91 TF | 2.29 TF | | |
| GB6 Metal | 22,577 | 27,047 | | 32,848 | 53,252 | |
| RT | NA | Yes | Yes | Yes | Yes | Yes |
| Features | Base | 20% Faster GPU, 4X Faster RT | | | | |
| NPU (TOPS) | 17 | 35 | 35 | 35 | 38 | ? |
| Ports | USB2 | USB3 | USB2 | USB3 | TB3 | TB5 ? |


| | M3 | M4 | M4+ | Intel Core Ultra 7 268V | AMD Ryzen 9 8945HS | AMD Ryzen AI 9 HX370 | Qualcomm X-Elite | Mobile RTX-4050 |
| Node | N3B | N3E | N3B | N3B + N6 | N4 | N4P | N4P | 4N |
| Transistors | 25 B | 28 B | | | 25.4 B | | | 18.9 B |
| TDP | | | | 17 - 37W | | 15 - 54W | | 50 W |
| Memory Bus | 128-bit LPDDR5-6400 | 128-bit LPDDR5-7500 | 96-bit LPDDR6 | 128-bit LPDDR5x-8533 | 128-bit LPDDR5x-7500 | 128-bit LPDDR5x-7500 | 128-bit LPDDR5x-8448 | 96-bit GDDR6 |
| Memory BW | 100 GB/s | 120 GB/s | | 136 GB/s | 120 GB/s | 120 GB/s | 135 GB/s | 192 GB/s |
| GPU Cores | 10 RT | 10 RT | | 8 | 12 CU | 16 CU | | 20 SM |
| GPU Speed | 1400 MHz | 1600 MHz | | 2000 MHz | 2800 MHz | 2900 MHz | | 1755 MHz |
| FP32 (TFLOPS) | 3.58 TF | 4.1 TF | | 4.1 TF | 4.3 TF (SI) | 5.94 TF (SI) | 4.6 TF | 8.986 TF |
| GB6 Metal | 47,480 | 53,252 | | | | | | |



| | M2 Pro | M3 Pro | M4 Pro | STX Halo | M3 Max | M4 Max | Mobile RTX-4070 | Mobile RTX-4080 |
| Node | N5P | N3B | N3B | N3E - iGPU | N3B | N3B | 4N | 4N |
| Transistors | 40 B | 37 B | | | 92 B | | 22.9 B | 35.8 B |
| TDP | | | | | | | 115 W | 110 W |
| Memory Bus | 256-bit LPDDR5-6400 | 192-bit LPDDR5-6400 | 192-bit LPDDR6 | 256-bit LPDDR5X-8533 | 512-bit LPDDR5-6400 | 384-bit LPDDR6 | 128-bit GDDR6 | 192-bit GDDR6 |
| Memory BW | 200 GB/s | 150 GB/s | | 273 GB/s | 400 GB/s | | 256 GB/s | 432 GB/s |
| GPU Cores | 19 | 18 RT | | 40 CU | 40 RT | | 36 SM | 58 SM |
| GPU Speed | 1400 MHz | 1400 MHz | | 2900 MHz ? | 1400 MHz | | 1696 MHz | 1665 MHz |
| FP32 (TFLOPS) | 6.81 TF | 6.45 TF | | 14.85 TF (SI) | 14.34 TF | | 15.62 TF | 24.72 TF |
| GB6 Metal | 81,732 | 78,451 | | | 154,559 | | | |




Cortex-X5: ARM's Upcoming Prime CPU: Faster IPC/PPC Than M3 ?

| | A14 | A15 | A16 | M3 | A17 Pro | M4 | A18 | A18 Pro | 8G3 FG | X-Elite | 8G4 | D9400 | D9300 |
| Prime CPU | | | | | | | | | Cortex-X4 | Phoenix | Phoenix | Cortex-X5 | Cortex-X4 |
| CPU Counts | 2+4 | 2+4 | 2+4 | 4+4 | 2+4 | | | | 1+3+2+2 | 12 | 2+6 | 1+3+4 | 4+4 |
| GB6 1T | 2090 | 2310 | 2566 | 3084 | 2908 | | | | 2300 | 2780 | | 2700 | 2225 |
| Clock Speed | 2.99 GHz | 3.23 GHz | 3.46 GHz | 4.06 GHz | 3.78 GHz | | | | 3.4 GHz | 4 GHz | | 3.35 GHz | 3.25 GHz |
| Perf Per Clock | 699 | 715 | 742 | 760 | 769 | | | | 676 | 695 | | 806 | 685 |
| + % | +15% | +13% | +9% | +6% | +5% | | | | +19% | +16% | | Base | +18% |
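For reference, the "Perf Per Clock" and "+ %" rows above are simple derived values; my reading is that "+ %" shows how much higher the Cortex-X5 (D9400) perf-per-clock is than each chip's. A quick sketch of that arithmetic using the table's own numbers:

```python
# Perf-per-clock (PPC) = Geekbench 6 single-thread score / clock speed in GHz.
# The "+ %" row then compares the Cortex-X5 (D9400) PPC against every other chip.

scores = {  # (GB6 1T, clock in GHz), taken from the table above
    "A14": (2090, 2.99), "A15": (2310, 3.23), "A16": (2566, 3.46),
    "M3": (3084, 4.06), "A17 Pro": (2908, 3.78), "8G3 FG": (2300, 3.4),
    "X-Elite": (2780, 4.0), "D9400": (2700, 3.35), "D9300": (2225, 3.25),
}

ppc = {name: round(gb6 / ghz) for name, (gb6, ghz) in scores.items()}
x5_ppc = ppc["D9400"]

for name, value in ppc.items():
    advantage = (x5_ppc / value - 1) * 100
    print(f"{name}: PPC {value}, Cortex-X5 advantage {advantage:+.0f}%")
```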
 

donawalt

Contributor
Sep 10, 2015
1,284
630
Good analysis. One minor correction that I don't think impacts your thesis - his original quote in his subscription letter (which everyone quoted to varying degrees of accuracy) did not say the entire line would be out within a quarter time period. It will still be close to a year. He said in his newsletter:

Here are the Mac models Apple is working on and when it expects to introduce them (as always, release plans could shift):

  1. A low-end 14-inch MacBook Pro with the M4, coming around the end of 2024.
  2. A 24-inch iMac with the M4, also expected around the end of the year.
  3. New 14-inch and 16-inch high-end MacBook Pros with M4 Pro/Max chips, due between the end of 2024 and early 2025.
  4. A Mac mini in both M4 and M4 Pro configurations, coming between the end of 2024 and early 2025.
  5. New 13-inch and 15-inch MacBook Airs, slated for around spring 2025.
  6. A Mac Studio with a high-end M4 chip, coming around the middle of 2025.
  7. A Mac Pro with an M4 Ultra chip, due in the second half of 2025.
 

dmccloud

macrumors 68040
Sep 7, 2009
3,138
1,899
Anchorage, AK
Already heard last year, did not happen (12GB min).

I can see a couple of reasons why it is more likely this time around. Unlike M3, Apple is not launching 3nm SoCs for the first time. Also, as RAM manufacturers are increasing capacities, that will by default trickle down to the system builders. As far as the timeline for when those configurations may hit the market is concerned, that could differ from what Gurman outlined for reasons beyond Apple's control.
 

StoneJack

macrumors 68030
Dec 19, 2009
2,728
1,982
I can see a couple of reasons why it is more likely this time around. Unlike M3, Apple is not launching 3nm SoCs for the first time. Also, as RAM manufacturers are increasing capacities, that will by default trickle down to the system builders. As far as the timeline for when those configurations may hit the market is concerned, that could differ from what Gurman outlined for reasons beyond Apple's control.
These reasons seem to be wishes
 

leman

macrumors Core
Oct 14, 2008
19,516
19,662
That is an interesting idea! Does LPDDR6 offer provision for 96-bit buses? Do you have any additional information on this? I would be curious.

One potential difficulty I see is supply. Apple is often behind the cutting edge on RAM simply because they need volumes that dwarf everything else in the industry. Apple's market might be smaller than some other brands, but every single Apple product is essentially a flagship product. This makes it much more difficult for them to produce the chips in the quantities needed. I was talking to an Apple manager close to iPhone production some time ago and he flat out told me that the main reason why Apple uses relatively small RAM capacities on the iPhone is because otherwise they would crash the DRAM market.

Next year, we should have at least 4 more players entering ARM PC market with Cortex-X5, which has faster IPC than M3/M4.

Do you think there is a high chance of that happening? Cortex-X4 doesn't reach the IPC of M1. Not that M3 IPC is any higher, of course.
 

JustAnExpat

macrumors 65816
Nov 27, 2019
1,009
1,012
I actually believe it. From another post I wrote:

"RAM Defaults in Macs?

What was the default RAM in Macs, how long did it stay, and when did it change? For simplicity, I’ll start from the Steve Jobs era, and I will only focus on the consumer portables.

32MB. This was used in the iBook and iBook Special Edition in July, 1999 until it was discontinued in September, 2000. It lasted 14 months.

64MB. This was used in the Firewire iBooks from September 2000 until the Dual USB iBooks in October 2001. It lasted 13 months.

128MB. This was used in the late 2001 iBooks released in October 2001 through the iBook G4, discontinued in April, 2004. It lasted 32 months

256MB. This was used in the Early 2004 iBook G4 released in April, 2004 through the late 2004 iBook G4 released in October 2004, sold until July, 2005. It lasted 15 months.

512MB. This was used from the Mid 2005 iBook G4 released in July, 2005, until the Mid 2007 MacBook, which was sold until November, 2007. It lasted 28 months.

1GB. This was used in the Late 2007 MacBook released on November, 2007 until the MacBook Late 2008, discontinued in January, 2009. It lasted 25 months.

2GB. This was used in the Late 2008 AL MacBook released in October, 2008, until the MacBook Air 11”, Mid 2011, which was discontinued in June, 2012. This lasted 44 months.

4GB. This was used in the MacBook Air 13”, Mid 2011 released in July, 2011 until the MacBook Air, 13” Early 2015, which was discontinued in June, 2017. This lasted for 71 months.

8GB. This was used as standard on MacBook Air, 2017, released on June, 2017 until now, in November, 2023. 77 months and counting.

Clearly, the amount of time machines stayed at 1GB was half the time for 2GB, which in turn was almost half the time for 4GB. I expect 16GB to become standard in another 40 months or thereabouts, if trends continue. "

So, 16GB as default should happen in about 60 months, or 5 years. But there's no reason why Apple can't increase it by 50%, or 12GB of RAM, and then do another 4GB increase in another 2 1/2 years.

It'll meet the timeline and past trends.
 

fakestrawberryflavor

macrumors 6502
May 24, 2021
423
569
Some of this makes a lot of sense. We will need more RAM and more memory bandwidth for on-device AI. That's a fact.

Smaller memory buses mean less die space used for memory controllers, which equals more die space for things that matter, like cache and the NPU for AI.

Fewer memory chips are required to exceed the previous capacity and speed, saving money on the BOM.

I think LPDDR6 is 'too new' for Apple, which is fairly conservative. But that's my opinion.

It's also possible that the bigger chips (Max/Ultra/Extreme, whatever) adopt HBM on the ultra high end to compete with the memory bandwidth of Nvidia GPUs (but that's super speculation territory).
 

Populus

macrumors 603
Aug 24, 2012
5,928
8,404
Spain, Europe
I wholeheartedly hope you are right, and we get bigger and faster RAM with the M4s and don't have to wait until the M5 generation.

By the way, I don't understand your update of 4GB per die. Would that mean that we get stuck with 8GB (x2), 16GB (x4), 32GB (x8) per die?
 

Confused-User

macrumors 6502a
Oct 14, 2014
850
983
This is extremely unlikely. The standard won't be released until Q3. It would be extremely atypical for Apple to support both LPDDR5 and LPDDR6 with its next memory controller, which normally they'd implement once and then use everywhere (Ax/Mx chips of the same generation).

There is *zero* chance LPDDR6 will be available in sufficient quantities for the iPhone 16, despite Tiger's obsession with this technology here and elsewhere. It's ridiculous to even suggest it.

It is vaguely possible Apple would reuse the memory controller from the current generation on the A18, then move to a new one for the M4. I doubt it. Apple is pretty conservative with new tech, and when they're not (mostly just with TSMC nodes) it's bitten them hard at least once in the last few years.

If LPDDR6 ships at all this year, you're likely to see it in a few phones that sell in low volume, and people like Tiger will go wild about it, but then not follow up when analyses show that all that extra bandwidth nets around 2-3% better performance on typical workloads.

It's true that AI and GPU cores want a lot more bandwidth, so it's not crazy to speculate that LPDDR6 will see faster uptake than the previous generation. It's not going to be this year, though.
 

leman

macrumors Core
Oct 14, 2008
19,516
19,662
I think LPDDR6 is 'too new' for Apple, which is fairly conservative.

Apple is not conservative at all. It's just that Apple needs ridiculous production volumes. It is much easier for Samsung to offer a single low-volume flagship phone with cutting-edge RAM than it is for Apple to source this RAM for their 200+ million iPhones per year.
 

theluggage

macrumors G3
Jul 29, 2011
8,009
8,442
Apple has reduced the memory bus of M3 Pro from 256-bit to 192-bit. You may ask why? Cost cutting? Nah, Apple actually increased the RAM from 16GB to 18GB by ordering 48-Gb memory chips from Micron.
I wouldn't read too much into individual spec differences between the M3 Pro and M2 Pro, cost-cutting or not.

The M3 Pro and M3 Max are now two distinctly different dies, rather than the M1/M2 Pro being the Max die with half the memory buses and GPU cores "chopped off". Arguably, the M3 Pro is somewhat less powerful than a more direct progression from the M2 Pro/Max concept might have been, but it has actually made the choice between M3 Pro and Max a lot clearer, with all-round upgrades rather than just GPU and RAM. Downsizing the memory bus and going from 4 to 3 chips could just be part of that re-positioning.

...and, once you've gone for a 3-chip solution, the capacity has to be a multiple of 3, so the alternative would have been to reduce the base memory from 16GB to 12GB - which would have been a joke (whether or not 4GB chips were available).

I'm not saying you're wrong, but I'm not sure the evidence is compelling. Sooner or later, though, Apple will have to up the base RAM spec of all but the lowest-end MacBook Air from their increasingly indefensible current levels.
 

theluggage

macrumors G3
Jul 29, 2011
8,009
8,442
8GB. This was used as standard on MacBook Air, 2017, released on June, 2017 until now, in November, 2023. 77 months and counting.
Or: the cheapest "Retina MacBook Pro" in 2012:
Apple MacBook Pro "Core i5" 2.5 13" Retina 2012 ($1699) – 8GB

vs. the cheapest (it's all Retina now) "MacBook Pro" in 2024:
Apple MacBook Pro "M3" 8 CPU/10 GPU 14" 2023 ($1599) – 8GB

OK, ok, I'd take that comparison with a pinch of salt - it's pretty subjective deciding which 2012 model is "equivalent" to which 2024 model, but you can say the same of MacBook Airs over the years. The point is that, even 12 years ago, 8GB wasn't a huge amount of RAM for a laptop costing the thick end of $2000 (and consumer electronics prices don't follow inflation). Since then - well, you can see that the minimum number of CPU cores has increased by a factor of 4, the GPU power has increased hugely, and the benchmarks have gone up by over 8x - all of which means that the processor/GPU can process far more data in a given time and therefore needs more RAM to keep it fed - especially in an era when phones can shoot 4K HDR video and every web page contains a dozen animated adverts...

...and that has already been reflected by pretty much every computer manufacturer not called "Apple" (or "Microsoft" in case anybody wanted to cherry pick the MS Surface) with most premium ultrabook-style LPDDR5 laptops now starting at 16GB. 8GB base in a > $1000 laptop becomes less and less defensible with every passing month.

If we were just talking about the $999 entry-level M2 MBA (that is fine for some people's needs) it wouldn't be so bad, but the $1600 MacBook Pro starting at 8GB in 2024 is just ridiculous.

Apple are making themselves totally reliant on customers who simply won't/can't consider buying a PC - any discussion with an existing PC user about the merits of Mac and Apple Silicon comes to a shuddering halt when they see how much Apple wants for a half-decent amount of RAM and SSD... and they've now probably missed the honeymoon period when Apple Silicon was head and shoulders above the competition.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
That is an interesting idea! Does LPDDR6 offer provision for 96-bit buses? Do you have any additional information on this? I would be curious.

6 × 16 = 96 bits, so it isn't hard to do.

It has more to do with how much cost-effective stacking can deliver than anything to do with the LPDDR6 protocol.


I was talking to an Apple manager close to iPhone production some time ago and he flat out told me that the main reason why Apple uses relatively small RAM capacities on the iPhone is because otherwise they would crash the DRAM market.

That is pretty dubious. Apple's margins get thinner because RAM is more expensive. But crash? Android phones have more RAM than iPhones and the sky isn't falling. Everyone and their monkey's uncle wants Nvidia AI training cards and the market hasn't 'crashed'. Those devices have gotten very expensive, but the market has not 'crashed' in the slightest (there is a higher level of vendor competition now than before). Nvidia is selling everything they produce. What is 'crashing' is folks who need those AI training cards and are short on cash (or don't have a revenue-generating cash flow).


Apple is conservative primarily because it protects their profit margins. Apple's RAM and SSD pricing is at least as much about margins as it is about technology. If Apple gave the memory providers an accurate demand projection ~3 years ahead of time and a 'rock solid' contractual obligation to buy that much, it would mostly be there. Apple doesn't want to sign up for the liability and doesn't want the margin risk. For example:



Apple is out there hard-ball haggling for lower prices. More than likely Apple is asking for LPDDR6 at LPDDR5 pricing.

When Apple went to single NAND SSDs in the MBA/MBP, that wasn't because of "crashing the market"; it was to keep the margins and pricing.

Next year, we should have at least 4 more players entering ARM PC market with Cortex-X5, which has faster IPC than M3/M4.

Do you think there is a high chance of that happening? Cortex-X4 doesn't reach the IPC of M1. Not that M3 IPC is any higher, of course.

If the M4 SoC allocates the bulk of the improvement R&D budget to NPU cores and GPU cores, and if it happens to be a re-spin onto N3E (which makes the die bigger, which again is going to impact 'new area allocations'), then an "AI hype train" move could pretty much mean the M3 cores re-spun with some clean-ups and perhaps some AMX boosting thrown on top. There may not be a revolution there either on regular Dick-and-Jane application code.


On the ARM side, lots of work and investment is being thrown at the Neoverse cores. Trickle-down improvements to the Xn series shouldn't be that hard. As ARM raises the average selling price of the cores it is licensing, it can likely do more uplift per generation. If ARM is going to compete with Qualcomm/Apple in the WinPC market, the 'normal' that the Xn series is aimed at isn't going to be just smartphones.

That said, I suspect "IPC" and performance are being mixed up. Also, IPC on what apps? It is a metric that varies with the application measured. From January:

".. Leaks suggest that the upcoming Cortex-X5 is poised to surpass Apple in single-core performance, aligning with Arm's strategic goal of narrowing the performance gap between its official processor architecture and custom Arm platforms. The Blackhawk core achieves the most significant year-over-year IPC performance growth in five years, simultaneously delivering impressive LLM (Large Language Model) performance. ..."


IPC is up, but that isn't necessarily what is putting it over the top in performance.

Then, around February, there were reports that the X5 is also soaking up more power.


That sounds more like the 80W settings on the X Elite, where they are trying to push the implementation quite high to hit some tech-porn marketing targets (which would probably play decently well with the "more power is good" faction of the WinPC market). That is tractable when competing with Intel (and somewhat AMD), who are doing the same thing (shooting for lower Perf/Watt just to hit higher drag-racing scores and pitch 'higher clock' marketing).

The three-tier system in smartphones (Xn-700-500) will just get weirder. Arm's competition with RISC-V at the bottom (the 500-ish zone) and custom ARM architecture implementers at the top (Apple P cores, Ampere, Qualcomm) has Arm chasing in different directions, and that mix of more than a couple of core types gets entangled in different 'battlefronts'.
 