
sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
These are the die shrinks that Intel missed out on while it was stuck at 14nm from 2014 to 2020.

- 2016: 10nm - A11 Bionic & A10X Bionic in 2017
- 2018: 7nm - A12 Bionic & A12X Bionic
- 2020: 5nm - A14 Bionic & M1

Apple might not have switched away from Intel if Intel hadn't been so greedy.
 

Admiral

macrumors 6502
Mar 14, 2015
408
991
Early benchmarks on M2 Pro and Max indicate a 25-30% uplift in CPU performance over M1 — simply from some tweaks to the design and no material process improvement. If the GPU performance benchmark on Geekbench (I know) is worth anything, M2 Max will improve GPU performance by up to 70% with similarly mindblowing results to look forward to with M2 Ultra. And Apple is advertising a 40% improvement in the performance of the Neural Engine.

M2 is said to be a "stopgap" generation to hold the fort until M3 on TSMC's 3nm process is ready, which some people say could be as early as September.

But the thrill is gone! Gone, I say!
 

Yebubbleman

macrumors 603
May 20, 2010
6,024
2,616
Los Angeles, CA


What the article failed to see is how big a revolutionary change it has been going from Intel chips to Apple chips over the past 2+ years.

The article was talking about product marketing. Also the fact that, as was the case when going from those first Core Duo Intel Macs to those first Core 2 Duo Macs in late 2006, the second generation of Apple Silicon is more evolution than revolution. Incidentally, anyone with a rational head on their shoulders could've predicted that would be the case.


So I will list it out for your convenience:
  • 2014-2020 14nm Intel chips (6 years) to Nov 2020-today 5nm Apple chips (2+ years)
  • Intel with a low performance IGP vs Apple with a high performance dGPU as an IGP
  • Intel chips with a minority of ICs & functionality on-chip, which increases latency and leaves resources underused, vs Apple chips with a majority of ICs & functionality on-chip, so latency decreases and resources are maximized
  • Intel chips being power hogs & heat generators like a wall-powered desktop vs Apple chips sipping power & staying barely warm like a battery-powered smartphone
Even in 2023 Intel has not sold any product that uses a 5nm process, while Apple is already internally prototyping TSMC's 3nm process for the iPhone 15 Pro in Sep 2023, with a possible 3nm M3 chip as early as Oct/Nov 2023 or as late as Jan 2024.
Apple leapfrogged Intel by 6 years of tech improvements, applying what it learned from smartphones to its laptops & desktops.

Intel specializes in x86, not ARM. Incidentally, their primary problem from 2014 to the present has been with manufacturing, not engineering. This is not news. It's not that AMD and Apple are kicking Intel's ass. It's that TSMC has been kicking Intel's ass. Intel is now having to (a) do business with TSMC to catch themselves up and (b) revitalize their manufacturing business.

Also, Apple SoCs do not have a "high performance dGPU as an IGP". That's not how that works. They have an integrated GPU that functions entirely differently than Intel's graphics ever have. To lump it in the same category as Intel IGPs is like saying that a Honda Fit and a Ferrari are both cars; it's true, but it misses the point entirely.

Also, the jump from 5nm to 3nm isn't going to be as ground-breaking as you think it will be. It will result in cooler running computers that do more things faster, but it's not going to be the world-shattering difference in performance that you had when going from the last Intel powered 13-inch MacBook Pros to the first M1 powered ones. That was the big jump. We're not likely to have another one of those for a long while, if not until Apple switches architectures again.

If you are concerned about efficiencies, performance per watt, power consumption and waste heat in your devices then you will buy Apple products.

This assumes that most people who buy computers are concerned with such things. I can tell you with confidence, having worked (and continuing to work) extensively with users to help them pick hardware, that they don't. You have people on these forums and elsewhere buying Macs because they are pretty, but it otherwise comes down to which platform and ecosystem you want to buy into. Someone looking to buy a (new, not refurbished) computer will be torn between a Mac and a PC, and the power efficiency of Apple Silicon isn't going to be anywhere near as significant a concern.


If you are concerned about Windows programs, computer games & ease of repairs/upgrades then you will buy Intel/AMD products.

That's basically my previous point. But again, it's also about ecosystem and which OS someone likes working with more. You sit me at a Mac and I'm just as at home as I am if you sit me at a PC. The only time I have a personal preference is if the Mac you sit me at is running Catalina or newer and I want to play a 32-bit Intel game (in which case, it's Windows time) or if it's work and I'm working an IT job wherein I'm supporting a mixed environment wherein my Mac tools work fine on Windows (in which case, I might as well have a Windows machine).

On my part I am jumping from a 2012 iMac 27" 22nm to a 2023 iMac 27" 5nm that I hope will be out by Jun 2023 during WWDC 2023.

You might have to settle for an M1 Max Mac Studio with a Studio Display. Most signs point to a 27-inch iMac not happening. Apple discontinued it in favor of the Mac Studio + Studio Display combo, priced the pairing accordingly, and it honestly makes perfect sense. Hell, you could get an M2 Mac mini with a Studio Display and that'd still run rings around a 2012 27-inch iMac.

What gets me excited about this upgrade isn't just the industrial design, the raw performance, and macOS Ventura singing on Apple chips, but the drop in power consumption from >200W to <100W for the same screen size at 2x the Retina resolution, and less waste heat adding load to my air-con as it tries to maintain 24°C.

There's so much marketing nonsense in this that I honestly can't even...

It would have been awesome if the M2 family of chips was on 4nm or even 3nm but we will have to wait for the M3 for a 3nm chip.

What is it that you want from 3nm that you're not getting from 5nm? Personally, I'm thinking the 13-inch MacBook Air chassis and 14-inch MacBook Pro chassis were built more for 3nm and that the current 13-inch and 16-inch MacBook Pro bodies are the more optimal fits for the M1/M2 and M1/M2 Pro/Max respectively. But it's also the case that the 16-inch chassis (and the 13-inch MacBook Pro chassis, assuming that Mac isn't outright replaced by the 15-inch MacBook Air as has been repeatedly rumored) will also benefit and still remain the better cooled Mac.

Other than a better experience for the (13-inch) Air and the 14-inch Pro, I don't see why you're expecting such a huge jump when they go to 3nm.

I expect the performance to blow my mind.

I expect you will be greatly disappointed.

Going forward Apple has the leading die-shrink process advantage over Intel/AMD/Qualcomm unless they fumble the iPhone and can no longer order more than a quarter billion iPhone chips annually.

You do know that AMD and Apple both have their chips manufactured by the same company, right? Apple has no significant advantage that AMD doesn't also have.

Microsoft is moving to Arm too. Intel knows they’re in big trouble and have less than two years to get their act together before Dell and Microsoft move to Arm.

Microsoft isn't MOVING to ARM. They're expanding upon ARM as an alternative option to x86. x86 as a Windows (and Linux) hardware platform isn't going anywhere.

Windows on ARM has been rather stagnant. Microsoft isn't moving at the pace Apple did when moving from Intel to Apple chips.

That's because with Apple it's a forced transition; x86 Macs are getting discontinued with only a finite amount of OS support remaining in favor of Apple Silicon (ARM64) Macs being the new hardware platform on which the Mac platform will be based.

Microsoft isn't forcing people to develop for Windows for ARM64. Maybe they ought to. But, for now, they're not. It's an option, not a mandate. Apple tends to deal in mandates, not options.

When Microsoft/Qualcomm/ARM figure out how to accelerate to Apple's pace, Intel/AMD will end up depending on legacy support as their selling point.

That assumes that the entire point of computing is to be running the absolutely fastest running and most efficient processor out there. It's not. Otherwise, Apple would've only given us Intel Macs with Xeons and the only people building PCs would be those doing so with Epycs, Threadrippers, and Xeons. Not the case, as it turns out! When ARM advances enough and there's enough native compatibility with Windows for ARM64, we'll probably see more workloads shifted accordingly, especially in Cloud (if not the datacenter too), but it won't fully replace x86 the way the Mac platform is poised to do with Apple Silicon.

Android chips are on a more advanced node than AMD's.
Not sure that's accurate. Even if it is, I'm not sure it matters. Process nodes aren't everything.
 

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
I simplified and condensed the key points.

It does not help that you do not see it as such and go off on a tangent to muddy the waters with superfluous details that do not add any value to the post.

There are parts of the world where $/kWh is much higher than where you live. As such, halving the power consumption of a device that is turned on at least 12 hrs/day, 365 days a year for the next decade is helpful in lowering power bills. If you only see it as marketing nonsense then you are not as smart as you think you are.
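To put rough numbers on that, here's a back-of-the-envelope sketch in Python. The ~100W saving comes from the >200W vs <100W iMac figures earlier in the thread; the 12 hrs/day figure is from above; the $0.30/kWh tariff is just an assumed example rate, not anyone's actual bill:

```python
# Back-of-the-envelope savings from halving a desktop's power draw.
# Assumptions (illustrative): ~100 W saved (>200 W Intel iMac vs <100 W
# Apple Silicon), 12 hours/day of use, and an example rate of $0.30/kWh.
watts_saved = 100
hours_per_day = 12
days_per_year = 365
dollars_per_kwh = 0.30  # varies a lot by country

kwh_per_year = watts_saved * hours_per_day * days_per_year / 1000
print(f"Energy saved:  {kwh_per_year:.0f} kWh/year")                  # 438 kWh
print(f"Bill savings:  ${kwh_per_year * dollars_per_kwh:.0f}/year")   # ~$131
print(f"Over a decade: ${kwh_per_year * dollars_per_kwh * 10:.0f}")   # ~$1314
```

At higher tariffs the decade figure scales proportionally, which is the point about parts of the world with expensive electricity.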

You are very unpleasant. In future you may want to write with the aim to be pleasant.
 

robertosh

macrumors 65816
Mar 2, 2011
1,142
967
Switzerland
Flagship Intel/AMD CPUs are normally used in high-end gaming machines, where you will have a high-end graphics card that eats even more power. So you will need a 1000W power supply anyway. This is what helps them in that regard.

I'm happy with the Apple Silicon transition; I think it was the best thing Apple has done in many years. Not only did they manage to pack in a tremendous CPU performance/consumption ratio, but also great graphics performance, so it's really awesome.
 

The Mercurian

macrumors 68020
Mar 17, 2012
2,159
2,442
I mean.... if Apple didn't jump to Apple Silicon, Intel would probably still be pushing minor spec bumps of 14nm architecture each year.....:rolleyes:
 

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
I would not be surprised if Intel were still on 14nm, making up some BS explanation for why they're still there while Apple ships 2nm iPhone chips in 2024.

Intel stayed at 14nm to save on manufacturing cost. From 2006-2020 they had all PC OEMs as customers. Where's the incentive to improve when you have a monopoly? Shareholders will be happy as their dividends go up and Intel management gets their bonuses.

Apple's quarter-billion annual iPhone chip orders to TSMC are the only reason the M1 chip was possible. AMD/Intel ship about a quarter billion PC chips annually, as they did at the time when Apple was still using Intel chips.

Intel is in danger of being rendered impotent if Windows 11 on ARM gets momentum. Imagine the second-best chips after the iPhone's in a Windows machine.

The only reason you'd go x86 is the legacy software. Give it a decade after new Windows machines are 80% ARM and Intel becomes a shadow of its former self.
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
Using various tricks (Arm64, unified memory, better TSMC node, hardware accelerators), Apple has demonstrated that it is capable of making arguably the best SoC for a notebook. But it has not demonstrated that it can do the same with a desktop SoC that performs on par with high-end Intel/AMD CPUs and high-end Nvidia GPUs.

There are a few questions that Apple has not answered. Does Apple have enough tricks left to improve its desktop SoCs? Will TSMC's N3 node allow Apple to show off those new tricks? Can Apple scale its GPU to achieve performance similar to that of high-end Nvidia GPUs within an "affordable" price range? Will Apple need a node advantage to outperform the competition?
 

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
The M1 Ultra shows that they can, and did, in the month of its release, at a power consumption of less than 215W.

You forgot to add "reduced latency by putting relevant ICs onto the chip," like putting the dGPU into the chip so it becomes an overachieving iGPU minus the latency penalty.

M2 Ultra and a possible M2 Extreme will push improved 5nm further.
 

DeepIn2U

macrumors G5
May 30, 2002
13,051
6,984
Toronto, Ontario, Canada
Windows on ARM has been rather stagnant. Microsoft isn't moving at the pace Apple did when moving from Intel to Apple chips.

When Microsoft/Qualcomm/ARM figure out how to accelerate to Apple's pace, Intel/AMD will end up depending on legacy support as their selling point.

Android chips are on a more advanced node than AMD's.

Well Microsoft CAN'T YET accelerate.

Dev kits for Qualcomm's ARM-based Microsoft units became available just a few short months ago.
Specifically, Microsoft Visual Basic for ARM is still in its infancy, along with other mission-critical tools.
Microsoft's forward-looking policy is to amplify computing and OS needs by connecting to Azure services in the cloud ... which carries the VERY stark prospect of continually paying for Windows OS use as a service on top of a local OS installation.
 

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
Windows on ARM started in 2011.

Microsoft had a dozen years' head start, and Apple is about to end its transition by introducing a Mac Pro with Apple silicon before 2024.

Would not be surprised if Apple got the idea to move to ARM chips from Microsoft. lol.
 

Zest28

macrumors 68030
Jul 11, 2022
2,581
3,932
Another benefit of M1 chips are the form factors they enable. Something like the Mac Mini or even the Mac Studio would never be possible with Intel chips simply because of all the heat they produce.

Yes it can. Watch what happens when Intel switches to TSMC's 3nm too. Currently Intel is on 10nm with the 13th-gen chips, but Intel has bought a lot of capacity at TSMC at 3nm for the next generation of Intel chips.

Intel already has the performance crown; they simply need TSMC to give them the efficiency, and they are going to do it.

If Apple didn't have TSMC and was also stuck using the same 14nm Intel was using, the M1 would not have been efficient either.
 

DeepIn2U

macrumors G5
May 30, 2002
13,051
6,984
Toronto, Ontario, Canada
Windows on ARM started in 2011.

Microsoft had a dozen years' head start, and Apple is about to end its transition by introducing a Mac Pro with Apple silicon before 2024.

Would not be surprised if Apple got the idea to move to ARM chips from Microsoft. lol.

Windows on ARM may have started in 2011, but it was VERY limited and initially ONLY supported ARM-only apps. So I'm not referencing that, yet I should've been very clear there.

Apple stated the transition would take about 2 years for the entire lineup; that is long past due, with the Mac Pro being delayed amid mixed rumors.

PS: Apple had a head start on the industry with custom ARM chips within the iPhone ;) that was just genius by their engineers.

The idea to move to ARM chips most likely came from their ARM expertise in the iPhone lineup, leading the industry year after year, especially going 64-bit ARM a full year early and surprising Qualcomm and the rest of the entire smartphone industry. I think THAT was the bug in their engineers' minds that started them thinking macOS could go full-on ARM.
 

Think77

macrumors regular
Apr 14, 2015
187
170
Another benefit of M1 chips are the form factors they enable. Something like the Mac Mini or even the Mac Studio would never be possible with Intel chips simply because of all the heat they produce.
Agreed. My M1 Mac Mini is 100% silent to my sensitive ears in my home music studio. It takes a huge frustration away from my creative process (coming from an Intel 2010 Mac Pro, and before that, Intel and AMD Windows workstations). And I can EASILY take the Mini with me, which I often do. So yes, Apple Silicon is about form factor, noise reduction and efficiency - and savings on energy.
 

ThunderSkunk

macrumors 601
Dec 31, 2007
4,075
4,560
Milwaukee Area
MS ditches Intel and moves to ARM, and every business and company of every kind in the entire world just happily throws out all their intel computers and 40 years of purchased, proprietary & custom software and kisses all its data goodbye. Sure, why not.
 

AllThingsTruth

macrumors newbie
Nov 2, 2022
14
232


Thank you for posting this. I saw that headline yesterday and clicked on it before quickly jumping off of it. I did not want to read and suffer through someone's complaining, negative, and short-sighted outlook on all this.

Personally, I am still giddy about these chips. In general, I am thankful for all of the cool tech making its way into our hands and pockets. I think people quickly forget how different tech was just 10 years ago. Pick up an iPhone 4 and you can clearly see how much technology has changed in such a short time. It is amazing and something to be thankful for. When people post a negative outlook, with undertones of complaining, which are the vibes that title totally gives off, it's so short-sighted and exhausting. It draws us away from being thankful.

The Roaring '20s were ushered in by the previous decade, which saw the end of the second industrial revolution. We know how the Roaring '20s ended, how that time of prosperity came to a screeching halt. I wonder how many people, in the aftermath of it all, wished they had been more thankful for the great increase in technology they had readily available in the Roaring '20s? The similarities are striking, and the complaints about "minimal" gains or improvements, losing sight of how far we have come by failing to take a step back, are not good, all things considered.

While the complaints are burdensome, I truly thank you for posting a more positive outlook about this all. It is a breath of fresh air and I am glad someone said something about it.

I am excited about the current line-up of products and tech in them and I am also excited about what we are hearing is planned for the future. It is truly something to be thankful for.

Thank you for posting your excitement and optimism!
 

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
I am just thankful to Intel for being so greedy that it forced Apple into a 4th transition.

Apple was at the right place, right time, right tech, right market cap & right chip talent to execute this so well.

Without the iPhone, Apple would not have enough chip volume to justify the M1 chip. Most of the R&D spend on the Mac chip was already sunk by iPhone R&D efforts. So whatever Mac-specific cores or other features cost is just a bit more money that can easily be subsidized by Mac sales.

It blows my mind that the last Intel Mac was on a 14nm chip in Aug 2020 and weeks later the 1st Mac with Apple silicon would be using a 5nm chip.

- power consumption improvement
- performance per watt improvement
- waste heat improvement
- physical size improvement
- latency improvements

Yes, future Mac chips will not get a revolutionary jump in improvements like in Nov 2020, but so long as Apple gets first stab at any and all future die-shrink improvements, we'll always be on the leading edge.
 

Zest28

macrumors 68030
Jul 11, 2022
2,581
3,932
MS ditches Intel and moves to ARM, and every business and company of every kind in the entire world just happily throws out all their intel computers and 40 years of purchased, proprietary & custom software and kisses all its data goodbye. Sure, why not.

And it's not like other ARM chips are fast. The only ARM chip that is fast for consumers is the iPhone chip (M1 is the iPad version of the A14 so it's really the same chip in the end).

Other consumer ARM chips are slow as hell.
 

iPadified

macrumors 68020
Apr 25, 2017
2,014
2,257
M1X was amazing and the M2X points to a 20-40% improvement without a node-size shift. If that is the approximate pace at which Apple is going to improve the SoCs YoY, we will soon need some more demanding software, and that is thrilling.

The MX chips seem unbeatable in laptops, AIOs, and non-internally-expandable "minis", which constitute a very large proportion of (Apple) computer sales. I think it was a wise choice to address a large segment rather than a halo product.
 

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
I never considered the use of ARM thrilling, or that Mac computers are less exciting. I guess I'm one of those oddball users who picks computers based on needs. I love Apple's design and they pulled off some amazing feats of engineering with the M1, so don't get me wrong, but I was not all giddy about the M1 or Apple's embrace of ARM processors for its computers.

I'm not disappointed, or down on Apple; it's a great move, but it's a journey, and the M2 is a minor update. Intel and AMD go through the same thing. One processor may bring significant innovation and the subsequent versions build upon that.
 

Zest28

macrumors 68030
Jul 11, 2022
2,581
3,932
M1X was amazing and the M2X points to a 20-40% improvement without a node-size shift. If that is the approximate pace at which Apple is going to improve the SoCs YoY, we will soon need some more demanding software, and that is thrilling.

The MX chips seem unbeatable in laptops, AIOs, and non-internally-expandable "minis", which constitute a very large proportion of (Apple) computer sales. I think it was a wise choice to address a large segment rather than a halo product.

Let's take a closer look.

M2 Pro = 3.5 GHz
M1 Pro = 3.2 GHz

% clock speed ≈ 10%

M2 Pro CPU cores = 12
M1 Pro CPU cores = 10

% CPU cores = 20%

M2 Pro GPU cores = 19
M1 Pro GPU cores = 16

% GPU cores ≈ 19%

There might not be a node shift, but Apple got the "M2 Pro" speed by making it a "M1 Pro Max hybrid": it is a significantly bigger piece of silicon run at higher clock speeds. I'm not impressed by it.
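Combining those deltas explains essentially all of the advertised uplift. Here's a naive sketch, assuming performance scales linearly with cores x clock, which real workloads never quite do (memory bandwidth, thermals and scheduling all get in the way):

```python
# Naive upper-bound estimate: throughput ~ core count * clock speed.
# Real-world scaling is worse, so treat these as ceilings, not predictions.
m1_pro = {"clock_ghz": 3.2, "cpu_cores": 10, "gpu_cores": 16}
m2_pro = {"clock_ghz": 3.5, "cpu_cores": 12, "gpu_cores": 19}

cpu_uplift = (m2_pro["cpu_cores"] * m2_pro["clock_ghz"]) / \
             (m1_pro["cpu_cores"] * m1_pro["clock_ghz"])
gpu_uplift = m2_pro["gpu_cores"] / m1_pro["gpu_cores"]  # core count only

print(f"CPU ceiling: +{(cpu_uplift - 1) * 100:.0f}%")  # ~ +31%
print(f"GPU ceiling: +{(gpu_uplift - 1) * 100:.0f}%")  # ~ +19%
```

So the quoted 20-40% gains sit right where core count and clock alone would put them.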
 

iPadified

macrumors 68020
Apr 25, 2017
2,014
2,257
You are actually only counting cores and MHz... As far as I understand, the power draw is the same or less despite the larger chip and the increased performance. Then who cares how large the chip is? In the end we got a 20-40% performance increase between two generations (there must be some GPU architecture advances involved to explain this). Read Intel's history and then you will understand how amazing this is.
 