
deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
But for Windows Arm to get traction, Microsoft has to get serious about it: broker a settlement between Qualcomm and Arm regarding Nuvia, and commission a custom chip with a hardware-assisted Rosetta-like feature that makes Windows users indifferent as to whether they have an Arm or x86-64 chip inside.

This has a bucketload of assumptions built into it that largely boil down to "Microsoft has to do exactly what Apple did". The huge problem with that is that Microsoft isn't Apple. Apple silicon has one, and only one, customer. Microsoft deals with dozens of system vendors. Microsoft is in the business of providing systems covering everything for everybody; Apple is not.

A proprietary chip that is forked off of Arm really isn't a long-term viable path. Windows needs to run on standard Arm instruction set chips from multiple vendors so that an even larger set of system vendors can make an order-of-magnitude broader variety of products than Apple makes. The 'fan out' is in a completely different direction.

Arm itself is stepping up the performance of the X-series cores. [ RISC-V is eating away at the embedded market. The server and PC market with higher price points is where they pretty much have to go. ]

Microsoft doesn't need Qualcomm specifically. If Arm, MediaTek, or Nvidia roll out a solution over the next 2-3 years, that would work just fine over the long term. The root-cause issue is not to simply reuse mainstream, cost-optimized smartphone chips as the solution.

The Rosetta-like feature isn't an issue. Microsoft has layered virtualization issues to cover that Apple punted on. The path their emulator is on is for mixed-binary apps, which Rosetta doesn't do at all. Windows has a different set of customers (for better or worse). A substantial number of those customers have a problem with being told to throw their 32-bit apps out the window, or that all their plug-ins are toast.

Likewise, Rosetta ignores modern AVX. There is no way Windows can follow that path either.

Microsoft is not going to radically dump x86-64 Windows as fast as possible. Microsoft can't dictate that to the entire ecosystem of system vendors. Windows 11 finally killed off the 32-bit kernel. Several years after Apple, but that isn't really a seriousness issue; it is one of timing coordination with the user base and system vendors. How many years did it take to dump BIOS boot... (oops, it is still there.)



Officially sanctioning Windows Arm on Apple Silicon Macs was a nice first step,

First step? Windows was on Arm before Apple shipped anything. One of the issues that Microsoft had is that they were super duper serious about 32-bit app emulation on Arm when Apple did nothing (and fundamentally punted the issue in their software stack). It isn't 'seriousness'. It has been different focus areas.

Microsoft has been trying to shoehorn Windows into Windows Phone and Windows RT for much longer than Apple has been trying to move macOS over. Those were serious attempts. They didn't work extremely well, but MS was serious.

Windows on Arm in a VM on Apple Silicon is a minor sideshow that isn't strategic for either company. Windows on Arm was going to run in a VM anyway, even if Apple didn't move. Microsoft has been tapped into server Arm chips for more than several years. There have been lots of hiccups along that evolutionary path, but Microsoft has been following along the whole way. The notion that it never crossed their minds to run Windows in a VM on those server chips until Apple publicly disclosed Apple Silicon is comical.


but there aren’t enough Mac users total, much less Mac users running Windows, for that to be more than a minor bump for Windows Arm.

It was an even more minor bump for Windows x86. What significantly matters for Windows is being sold bundled with the hardware. These loosey-goosey image sales are the minor bump regardless of processor instruction set implementation.
 
Reactions: esou and pshufd

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
At this point, it has been several years. Apple's support horizon is not all that vague. The Vintage and Obsolete classification is generally 5-7 years after the product stops being sold. A very clear pattern has developed over the last 3 years that there is no 'slop' on the lingering Intel products. It is basically just 5 years.

Apple didn't drop the upper-end MacBook Pro until 2021. And it was late 2021 (October), so those Intel laptops 'caught' the macOS beta of 2021 (June-September). 2021-22 + 5 is about 2026-2027. That is slipping out of the zone of 'several' years at this point and more pragmatically down to a 'couple' (~2) of years.

Compounded on top of that, Apple dragged their feet on the "Mini Pro" (they were still selling Intel Minis through 2022) and the Mac Pro. The volume there is not large enough to make a difference in the supported Intel Mac pool size. The very high volume Macs (MBA 13", MBP 13", and the smaller iMac + consumer Mini) have all been off the list since mid-2021. Their ancestors have been aggressively dropped for around 3 years now. At this point tens of millions of Intel Macs are dropping off the support list each year. The Intel stuff being sold in 2022 wasn't counterbalancing that in the slightest.


However, if there were a 'tie breaker' between choosing 2026 or 2027... 2027 would likely produce less grumbling and friction. It isn't 5 years from the end of all Intel sales, but it is pretty close.


Finally, there is an extremely high chance that Apple's original plan was to finish the transition in two years because ... well... they explicitly said "about 2 years". So 2022 + 5 -> 2027. [ That they finished in 2023 was likely more the 'pandemic' and external factors screwing up the schedule. ]

2027 is 3 years away, so that could be 'several years'. Apple may pull a Scrooge McDuck though, where that is when the security updates end, not the new OS updates. Apple's 'n-2' security updates have been getting increasingly 'lame' over the last several years. "New" updates I suspect stop in 2026. That would make 2027 "n-1" coverage on just security updates. And 2028 a 'maybe' if the scope of the security updates is narrow enough that it doesn't make a substantive difference on resource load. (They may only need a restricted subset of the Intel macOS libraries for the Rosetta stuff anyway. If the security 'holes' were outside of that, then they could just skip them. The 'apps' and drivers that get bundled with the OS are more likely just 'dead'.)


P.S. Apple introduced DriverKit as a replacement for kernel extensions (kexts) at WWDC 2019.


2019+7 is 2026. Kexts are probably dead past 2026 (if not sooner). At some point Apple is going to close up the security holes that kexts represent. I doubt Apple is going to want to put that removal work into the Intel version of the core OS. Again, this is one of those "change is coming" warnings issued several years ago. So it is not going to be "several years more" until the end comes.
 
Reactions: pc297 and Chuckeee

KPOM

macrumors P6
Oct 23, 2010
18,308
8,320
Windows on Arm in a VM on Apple Silicon is a minor sideshow that isn't strategic for either company. Windows on Arm was going to run in a VM anyway, even if Apple didn't move. Microsoft has been tapped into server Arm chips for more than several years. There have been lots of hiccups along that evolutionary path, but Microsoft has been following along the whole way. The notion that it never crossed their minds to run Windows in a VM on those server chips until Apple publicly disclosed Apple Silicon is comical.
To date Microsoft has approached Windows Arm quite a bit differently from x86 Windows. They don't just make it available generally so that any OEM can bundle it with hardware. They have licensed it to a select group of OEMs and sell it primarily for their own Surface devices. The small volume of Apple Silicon Macs running Windows Arm doesn't mean that Microsoft couldn't have threatened Corel with a cease-and-desist letter. The fact that they didn't perhaps indicates more openness to making Windows Arm available on a broader scale. But for Windows Arm to succeed, it needs to run legacy apps better. Microsoft has made decent strides in improving the compatibility of Windows Arm, but it doesn't match the performance of Rosetta on macOS, likely for the reasons you mentioned.
 
Reactions: pc297

KPOM

macrumors P6
Oct 23, 2010
18,308
8,320
At this point, it has been several years. Apple's support horizon is not all that vague. The Vintage and Obsolete classification is generally 5-7 years after the product stops being sold. A very clear pattern has developed over the last 3 years that there is no 'slop' on the lingering Intel products. It is basically just 5 years.

However, if there were a 'tie breaker' between choosing 2026 or 2027... 2027 would likely produce less grumbling and friction. It isn't 5 years from the end of all Intel sales, but it is pretty close.

If it is 2026, we'll probably get an announcement at WWDC this year that support for x64 code will be deprecated "in a future release of macOS," and macOS 15 will probably alert users the first time they open an Intel-based app. Even today some commercial Mac software is still Intel-only. TurboTax is one example.
 
Reactions: AAPLGeek

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
If it is 2026, we'll probably get an announcement at WWDC this year that support for x64 code will be deprecated "in a future release of macOS," and macOS 15 will probably alert users the first time they open an Intel-based app.

It's unlikely Apple is going to preannounce the end. At some WWDC there just won't be an Intel version of the new beta. That really is not a deprecation announcement. (E.g., at WWDC 2020 the GPU drivers for AMD/Intel just were not there anymore on the macOS-on-Apple-Silicon side of the matrix.)

x86 code was pragmatically announced as deprecated back in 2020! Apple didn't announce that they were going to ship x86 devices long term. They said there were a few (very limited) new x86 systems coming in 2020, but that was it. They were transitioning away from x86. As 'new systems' they were deprecated; no new x86 systems have come. There is no long-term "future" for x86 macOS.

Apple really doesn't view the Mac as hardware that happens to have software, or software that happens to have custom hardware. They tend to treat it (look at your licensing terms, for example) as a whole system. So if the hardware 'half' is 'dead', then the software 'half' is extremely likely just as 'dead'. The lack of new x86 hardware is going to kill off the x86 software.

In the past, Apple sold the OS upgrades and the software on a disk. Apple doesn't sell detached macOS software anymore. (Yet another major difference from Microsoft; the companies are NOT doing the same thing. Microsoft is drifting more toward hard-bundled licenses that have a lifetime tied to the hardware, but the variety of vendors and the inertia in their business market are orders of magnitude higher. Apple isn't making major moves to rent Macs in a cloud either; Xcode Cloud really is not that in any general sense.) Apple takes the upgrade money up front. You pay for macOS upgrades when you initially pay for the system. That is the pot of money paying for upgrades. If there are no new sales, then there are no new upgrade fees feeding that line. As your system ages out, the money for upgrades for that system is consumed. It is a finite resource pool with an end. Not a Ponzi scheme.

Even today some commercial Mac software is still Intel-only. TurboTax is one example.

Who is the bigger fool? The Fool or the Fool who follows the Fool?

No one with a serious Mac software development team is doing that. No universal app 4 years after tens of millions of M-series Macs have shipped? There is something significantly wrong with a development team that hasn't finished the effort to get to a universal build by now. Either:

i. Zombie dev team: nobody really there, with little or no updates. If there aren't enough people buying/paying for the software's development, then Apple is not going to drag them along.

ii. A code base that is so fragile/brittle that they don't want to make any changes (to follow new APIs or changes). That app is in trouble whether there is an AS-native version or not.

iii. A code base dependent upon some 3rd party (porting tool or utility) that is in state i or ii above.

iv. Apps with a large amount of code/structure committed to OpenCL/OpenGL/etc. (some API that Apple has tossed on the deprecated list). Apple's "use Metal or go away" attitude is going to lose some apps from the platform. Most, though, are going to 'cash cow' their way off the platform.




Turbo Tax ... Chuckle. The FTC just banned them from advertising.


The 'commerce' focus in that shop is on hustling folks as opposed to responsibly doing their job. TurboTax is probably trying to cater to folks running older versions of macOS. Over time the number of folks down in those n-4, n-5 versions shrinks, and then they move forward. In a couple of years, 2021 is going to be in the n-5 range, and at that point not covering Apple Silicon is more than a little nutty. Even the 'slow poke' developers that are actually moving forward over time are going to get there.



There is always "commercial" software that is falling out of commerce (no longer substantively being sold). There will always be something. What is really at issue is how much new software is being sold, and at what relative revenue levels. Old stuff that folks might use can go into a VM running an old macOS version. A VM is pretty much where 'stuck in time' software should go over the long term.



P.S. I scrolled through


and some of the entries there, like Ableton Live and QuarkXpress, are stale (native versions have shown up relatively recently).

The website "doesitarm.com"


lists about 11% (10.7%) as being in Rosetta status. Pretty sure that has substantially shrunk since Q4 2021 levels. Some Wayback links (some apps come out of the "unknown" class and into the other 3 over time):


March 2021: 25%
Oct 2021: 15%
March 2023: 11.7%
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Quite a bit is. I have around 450 apps on my machine and I tend to keep them up to date, but more than 100 of those are Intel-only.

22% is less than 1/4. If a Congressman had a 22% approval rating, would 'quite a bit' of folks think he was doing a good job?

If this is the count from the System Report's Applications section, there is an easy way to sort the apps that is informative here: "Last Modified", so that the oldest is first. Take note of the density of "Intel" over the first 'page' or two of the list. Then flip it to newest-first and look at the first 1-2 pages for the appearance of "Intel".

If you get the same density of Intel appearances, then "keep them up to date" is a factor. If it is a completely different density, then "keep them up to date" isn't.
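If you'd rather not eyeball the list, the same sort-and-filter can be scripted against `system_profiler -json SPApplicationsDataType`. A minimal sketch; the field names (`arch_kind`, `arch_i64`, `lastModified`, `_name`) are assumptions based on what recent macOS versions appear to emit, not documented API:

```python
# Filter a parsed `system_profiler -json SPApplicationsDataType` dump
# down to Intel-only apps, oldest "Last Modified" first. The key names
# ("arch_kind", "arch_i64", "lastModified") are assumptions, not
# documented API.
def intel_only_apps(profile: dict) -> list:
    apps = profile.get("SPApplicationsDataType", [])
    intel = [a for a in apps if a.get("arch_kind") == "arch_i64"]
    intel.sort(key=lambda a: a.get("lastModified", ""))
    return [(a.get("lastModified", "?"), a.get("_name", "?")) for a in intel]

# Tiny inline sample standing in for a real system_profiler dump:
sample = {"SPApplicationsDataType": [
    {"_name": "NewTool", "arch_kind": "arch_arm_i64", "lastModified": "2023-04-01"},
    {"_name": "OldTool", "arch_kind": "arch_i64", "lastModified": "2019-01-15"},
]}
print(intel_only_apps(sample))  # -> [('2019-01-15', 'OldTool')]
```

On a real machine you would feed it `json.load` of the `system_profiler` output; if the hits cluster at the old end of the list, the Intel apps are simply stale rather than actively maintained.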

The other attribute to look at is where this stuff is stored. When in "oldest first" mode, click on some of those apps and notice whether they are somewhere like /Applications/ or /Library/... versus ~/Library/Application Support/ (where ~ is your home folder). A fair number of apps will have an Application Support folder. For example:

~/Library/Application Support/RandomApp/

and there may be one or two subfolders down there. If the app later changes its mind as to where the 'root' for the helper apps is (from RandomApp/foo to RandomApp/bar), then your home directory tends to collect zombie apps. 'Drag to trashcan' as an uninstall method also tends to leave zombie apps (the Application Support folders don't get cleaned up). Sloppy installer programs will just put the 'new stuff' in the right place and not 'clean up' obsolete binaries that might be on different path names that the updated app won't reference.

Apple's Migration Assistant tends to do a good job of tagging 'obsolete' apps that are in /Applications but missing stuff in individual user folders. (I have looked at several Macs and found 32-bit apps lingering around in crufty crevices that still show up in the System Report. When Apple said 32-bit apps were dead, they were dead.)
 

KPOM

macrumors P6
Oct 23, 2010
18,308
8,320
Turbo Tax ... Chuckle. The FTC just banned them from advertising.


The 'commerce' focus in that shop is on hustling folks as opposed to responsibly doing their job. TurboTax is probably trying to cater to folks running older versions of macOS. Over time the number of folks down in those n-4, n-5 versions shrinks, and then they move forward. In a couple of years, 2021 is going to be in the n-5 range, and at that point not covering Apple Silicon is more than a little nutty. Even the 'slow poke' developers that are actually moving forward over time are going to get there.



TurboTax is still one of the best products as far as actually computing taxes and integrating with financial software goes. The FTC claims were because they were pushing their “free” product, which doesn’t work for most taxpayers. Why they haven’t ticked the box in Xcode to make it universal is beyond my comprehension. Tax software should be functional for at least 3 years, and ideally 6 years, based on the statute of limitations for the IRS to initiate an audit. “Sorry IRS, I can’t open my old tax return because macOS 18 doesn’t run TurboTax 2024 anymore” won’t fly.
 
Reactions: AAPLGeek

theorist9

macrumors 68040
May 28, 2015
3,880
3,059
22% is less than a 1/4. If a Congressman had a 22% approval rating would 'quite of bit' of folks think he was doing a good job?
That's not the correct way to look at the 22% figure. @Nermal was using 100/450=22% to indicate the reduction in the number of apps that would run natively if they switched to AS.

A 22% reduction can be a decent amount. For example, if you had a 22% pay cut (say from $100k to $78k), or a 22% loss in the value of your 401(k), most folks would indeed say that was "quite a bit".

Logically, you're conflating the use of a percentage to indicate how much you have (where, for example, 20-40% is not much) with how much you've lost (where 20% – 40% can be a lot). Those are apples and oranges.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
To date Microsoft has approached Windows Arm quite a bit differently from x86 Windows. They don't just make it available generally so that any OEM can bundle it with hardware. They have licensed it to a select group of OEMs and sell it primarily for their own Surface devices.

Is that Microsoft restricting them? Or the vendors self-selecting out of buying from Qualcomm? Samsung and Lenovo did systems, but they also buy Qualcomm chips for cell phones. Toss in the system vendors for Chromebooks that use Arm chips and add those to this "select group". And how many in the select group are left out?


Additionally, some of the rumblings around the X Elite have been trickle-down from system vendors complaining about having to use Qualcomm power management chips as opposed to the favorite PMICs that they use for Wintel boxes. Is that a mandate by Microsoft? No.

The Windows market has a substantial number of 'race to the bottom', marginal system vendors. Microsoft trying to force those folks into doing Arm systems when they don't want to is a highly dubious plan. (E.g., extreme race-to-the-bottom Chromebooks were a double-edged sword for a long time. Strategically they hurt as much as they helped.)


Very similar issues explain why Microsoft was primarily working only with Qualcomm. Was there a faster, higher-performance Arm SoC vendor they could have chosen? Nope. So I'm not sure why some grand conspiracy needs to be invented as to why the other possible vendors got excluded. If you throw "who has the best cellular modem and the best Arm SoC" onto the requirements pile, it gets even closer to a "Duh? Who else are they going to choose?" situation.

Surface is only useful for Microsoft in nudging their system vendor partners to get out of the 'race to the bottom' and to try random, quirky features to differentiate themselves.





But for Windows Arm to succeed, it needs to run legacy apps better.

If the Windows emulation subsystem has a 10% overhead 'hit' and the Arm processor lags the Intel/AMD alternative processor by 35%, which one is the bigger hit to running legacy apps better? If you improve the emulation subsystem to a 5% overhead 'hit'... which one is still the bigger hit?

Microsoft worked with what they had, not what they wished they had.


If you get a system where the Arm processor has a 10% uplift over some subset of the Intel/AMD lineup and the emulation is a 10% hit, then it is an entirely different ball game.
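That back-of-envelope arithmetic can be made concrete. The percentages below are this post's illustrative figures, not benchmarks:

```python
# Net legacy-app speed relative to the x86 baseline: the Arm chip's
# relative speed multiplied by the emulator's efficiency. All numbers
# are illustrative figures from the discussion, not measurements.
def net_speed(arm_vs_x86: float, emu_overhead: float) -> float:
    return arm_vs_x86 * (1.0 - emu_overhead)

print(net_speed(0.65, 0.10))  # chip 35% behind, 10% emulation hit -> ~0.585
print(net_speed(0.65, 0.05))  # a better emulator barely moves it  -> ~0.6175
print(net_speed(1.10, 0.10))  # a 10% faster chip changes the game -> ~0.99
```

The point of the multiplication: halving the emulator overhead buys a few points, while closing the 35% hardware gap buys tens of points.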

Handwaving at the licensing terms isn't going to solve that issue.


Handwaving at Apple's SoCs isn't going to solve that issue either. Apple isn't selling to anyone else. It is not a viable option.

What Microsoft had were options that did better on battery life and 'always on' connectivity. They focused on that after spending some time messing around with Windows RT.

They need performance to get to 'legacy'/competitive CPU/GPU performance. That means moving past 32-bit Arm, not getting quagmired in a maximally 32-bit (or less) app focus. Arm64 performs better than Arm32.




Microsoft has made decent strides in improving the compatibility of Windows Arm, but it doesn't match the performance of Rosetta on macOS, likely for the reasons you mentioned.

Rosetta 2 leaves 32-bit Mac apps behind. Virtualized apps: behind. Drivers/kexts: behind. AVX: behind. It isn't about maximum legacy app coverage at all.

The performance comes from two parts. First, Rosetta compiles to Arm binaries (and stores them on disk, except for some narrow special cases of JIT code, e.g., JavaScript). That speed is basically the same as native Arm code execution speed. Rosetta really isn't doing anything magical here; primarily what you are getting is an ahead-of-time universal translation of the app. As far as how fast it does basic stuff like add, subtract, multiply, divide, or compute a branch test... that is all just native Arm code execution speed.

Second, Rosetta leverages a special mode in the translated code to cover up the semantic gap between Arm's loosey-goosey memory ordering and x86's stricter, more transactional semantics. That helps, but it is also not 80+% of the contribution to the overall performance. If those additions were added to an A9 and stuffed into a Mac... there would be no magical 'win'.
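As an aside, you can observe this machinery from the process side: Apple documents a `sysctl.proc_translated` key that reports whether a process is running translated under Rosetta 2. A small sketch that shells out to `sysctl` (per Apple's Rosetta docs, children of a translated process also run translated, so the child is a reasonable proxy):

```python
import subprocess
from typing import Optional

def running_under_rosetta() -> Optional[bool]:
    """True if running translated under Rosetta 2, False if native,
    None where the sysctl key doesn't exist (Intel Macs, pre-Big Sur
    macOS, or non-macOS systems)."""
    try:
        out = subprocess.run(
            ["sysctl", "-n", "sysctl.proc_translated"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
    except (OSError, subprocess.CalledProcessError):
        return None
    return out == "1"

print(running_under_rosetta())
```

An x86-64 binary launched on an Apple Silicon Mac reports translated; the same universal app relaunched natively reports not translated, which is exactly the "same code, two execution modes" split described above.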
Google has been dragging their feet on getting a native Chrome done. (Slightly nutty, given that MS Edge is Chromium-based, so the vast bulk of the 'Chrome core' is native. All they need to work on is the relatively narrow range of Google-specific bits of Chrome. The rest has been done for years.)


Surprise, surprise, surprise... now that a reasonably fast native SoC is about to show up... they are now interested in doing the work. If there are no native apps that shine... the legacy stuff isn't going to matter. An OS running 90% legacy apps for an extended period of time is a dinosaur waiting for the meteor to hit.

The Microsoft converter is fundamentally the same as Rosetta... it compiles the x86 code to Arm binary code.

A customer that is 90+% laser-focused on running 10-15 year old hyper-legacy 32-bit x86 binaries can just buy Windows on x86. Apple is selling zero x86 systems, so they need an incrementally better converter because they are burning all the x86 bridges behind their users. Apple is primarily trying to move folks forward... not keep them happy in an isolated time bubble for the maximum amount of time.
 

pshufd

macrumors G4
Oct 24, 2013
10,145
14,571
New Hampshire
TurboTax is still one of the best products as far as actually computing taxes and integrating with financial software goes. The FTC claims were because they were pushing their “free” product, which doesn’t work for most taxpayers. Why they haven’t ticked the box in Xcode to make it universal is beyond my comprehension. Tax software should be functional for at least 3 years, and ideally 6 years, based on the statute of limitations for the IRS to initiate an audit. “Sorry IRS, I can’t open my old tax return because macOS 18 doesn’t run TurboTax 2024 anymore” won’t fly.

I really don't have a choice in tax programs. I have complex returns and TurboTax saves me about 30 hours of tax prep work and it has found me tens of thousands in credits and deductions that other programs didn't find for me in the past. Fidelity also gives it to me for free every year.

Performance is good enough on my M1 Max via Rosetta 2.

I could also run it on my 2015 iMac but I think that my M1 Max will outperform it, even having to go through Rosetta 2.

I don't understand why Intuit doesn't just do the small amount of work to make a native image, but the company has been pushing hard for the past 8 years to migrate their customers to the cloud, and it may be that that's where the bulk of their development goes.
 
Reactions: theorist9

pc297

macrumors 6502
Sep 26, 2015
336
207
Compounded on top of that, Apple dragged their feet on the "Mini Pro" (were still selling Intel Mini's through 2022) and Mac Pro.
Precisely, and this gives us a rough timeframe if history tells us anything: in the last two transitions (68k->PPC and PPC->Intel), any new macOS version ditching a previously supported architecture has come about 3 years after said architecture was dropped from all new models (not sure about 6502->68k, though).

So we're probably looking at fall 2026 for an Apple Silicon-only version of macOS; until then, even though Intel macOS will ultimately be meant for 2019 Mac Pros only (which sold until spring last year), we're likely to have OCLP support for those versions on most EFI64 machines.

So yeah, come 2026 we should closely watch the betas coming out, examine the Intel content in the binaries, and try to get them to work on Intel if possible, as was done years later for 10.6 and PPC (https://forums.macrumors.com/threads/snow-leopard-on-unsupported-powerpc-macs.2232031/). That way we can try to have another year of Intel "support" 🤞
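Checking a binary for Intel content can be partially automated: a universal ("fat") Mach-O file starts with a header listing its architecture slices. A minimal sketch using the standard Mach-O constants; it handles only the classic 32-bit fat header (thin binaries and the FAT_MAGIC_64 variant are out of scope):

```python
import struct

FAT_MAGIC = 0xCAFEBABE  # universal (fat) Mach-O magic, stored big-endian
CPU_NAMES = {0x01000007: "x86_64", 0x0100000C: "arm64"}  # CPU_TYPE_* values

def macho_archs(data: bytes) -> list:
    """Return the architecture slices listed in a fat Mach-O header.
    Thin (single-arch) binaries and the 64-bit fat format return []."""
    if len(data) < 8:
        return []
    magic, nfat = struct.unpack(">II", data[:8])
    if magic != FAT_MAGIC:
        return []
    archs = []
    for i in range(nfat):  # each fat_arch record is 20 bytes
        cputype, = struct.unpack(">I", data[8 + 20 * i : 12 + 20 * i])
        archs.append(CPU_NAMES.get(cputype, hex(cputype)))
    return archs

# Synthetic two-slice header standing in for a real universal binary:
demo = struct.pack(">II", FAT_MAGIC, 2)
demo += struct.pack(">IIIII", 0x01000007, 3, 0x1000, 0x100, 12)  # x86_64 slice
demo += struct.pack(">IIIII", 0x0100000C, 0, 0x2000, 0x100, 14)  # arm64 slice
print(macho_archs(demo))  # -> ['x86_64', 'arm64']
```

On a real Mac you would feed it the first bytes of an executable such as /Applications/SomeApp.app/Contents/MacOS/SomeApp (a hypothetical path); `lipo -archs` does the same job via Apple's official tooling.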
 

pshufd

macrumors G4
Oct 24, 2013
10,145
14,571
New Hampshire
So we're probably looking at fall 2026 for an Apple Silicon-only version of macOS; until then, even though Intel macOS will ultimately be meant for 2019 Mac Pros only (which sold until spring last year), we're likely to have OCLP support for those versions on most EFI64 machines.


And 2020 iMac 27s. I think that those were sold as late as 2022 at Costco. Microcenter still sells them new.
 

AdamBuker

macrumors regular
Mar 1, 2018
121
185
Precisely, and this gives us a rough timeframe if history tells us anything: in the last two transitions (68k->PPC and PPC->Intel), any new macOS version ditching a previously supported architecture has come about 3 years after said architecture was dropped from all new models (not sure about 6502->68k, though).
The 6502->68k wasn't merely a processor transition but a complete shift from the Apple II line of computers to the Mac. Ever since the Mac's release in 1984, Apple focused more of its marketing on the Mac line rather than the Apple II line, despite it taking 4-5 years before the Mac would begin to outsell the Apple II. Apple never really announced any kind of formal transition away from the Apple II line. Rather, they just let it die a slow death of neglect. No new models of the Apple II were announced after the Apple //c Plus in 1988. The IIgs had been on the market since 1986 with only minor changes. There were rumors of an updated version of that computer that would have included a built-in 3.5" floppy, an internal HD, and support for standard DIMM RAM expansion.

As Apple neglected the development of the Apple II line, many schools in the '90s were starting to replace them with generic PC clones. To stem further market share losses, Apple introduced the IIe expansion card that could plug into the LC line of Macs so that educational customers would not have to completely abandon their Apple II software investments to move to the Mac. This was really about the closest thing Apple did to formalizing any kind of transition from the Apple II to the Mac, but by the time of the card's discontinuation in 1995, most consumers and businesses had already moved on to PCs or Macs years before. Not to mention that Apple had even started the transition to PowerPC by that point.
 

pshufd

macrumors G4
Oct 24, 2013
10,145
14,571
New Hampshire
The 6502-68k wasn't merely a processor transition but a complete shift from the Apple II line of computers to the Mac. Ever since the Mac's release in 1984, Apple focused more of its marketing on the Mac line than on the Apple II line, despite it taking 4-5 years before the Mac would begin to outsell the Apple II. Apple never really announced any kind of formal transition away from the Apple II line; rather, they just let it die a slow death of neglect. No new Apple II models were announced after the Apple //c Plus in 1988, and the IIgs had been on the market since 1986 with only minor changes. There were rumors of an updated version of that computer that would have included a built-in 3.5" floppy drive, an internal HD, and support for standard DIMM RAM expansion. As Apple neglected the development of the Apple II line, many schools in the '90s were starting to replace them with generic PC clones. To stem further market share losses, Apple introduced the IIe expansion card that could plug into the LC line of Macs so that educational customers would not have to completely abandon their Apple II software investments to move to the Mac. This was about the closest thing Apple did to formalizing any kind of transition from the Apple II to the Mac, but by the time the card was discontinued in 1995, most consumers and businesses had already moved on to PCs or Macs years earlier. Not to mention that Apple had even started the transition to PowerPC by that point.

No mention of the Lisa?
 

AdamBuker

macrumors regular
Mar 1, 2018
121
185
No mention of the Lisa?
Well, the Lisa is really notable for being the first computer Apple released that had a GUI, for its $10k price tag, its buggy Twiggy drive, and for being a massive flop. It didn't even have a chance to develop the massive user base that the Apple II did, as it was discontinued in 1985. Many books and articles covering Apple history give the Lisa a similar amount of screen time as the Apple II line, but I would argue that the Apple II had much more of an impact. The Apple II also financially carried the company through the failure of the Apple III, the Lisa, and the first few years of Mac sales, which is really impressive considering how Apple promoted its other computer products well beyond anything it did for the Apple II.

I also think the open nature of the Apple II line, in terms of hardware and software design, was anathema to the appliance model that the Mac represented. There's still quite a dedicated community actively developing hardware and software for it.
 

AdamBuker

macrumors regular
Mar 1, 2018
121
185
One thing that does strike me as amusing is that the Arm processor took a massive amount of inspiration in its initial design from the 6502, particularly in the way it handled interrupts and aspects of memory addressing. Since there are no 32- or 64-bit versions of the 6502, the Arm architecture is probably the closest thing to a modern 6502. Or at least I like to pretend that it is.
 
  • Like
Reactions: spiderman0616

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Compounded on top of that, Apple dragged their feet on the "Mini Pro" (they were still selling Intel minis through 2022) and the Mac Pro.

Precisely, and this gives us a rough timeframe if history tells us anything; in the last two transitions (68k->ppc and ppc->intel), any new macOS version ditching a previously supported architecture has happened about 3 years after said architecture was dropped from all new models (not sure about 6502->68k though).

Precisely? You are quoting a counter-example to what has historically happened. Apple finished their transitions relatively quickly the other two times. The Intel transition finished in about 18 months, six months ahead of the self-imposed deadline, not six months after it. So saying 'it is happening just like last time' is ungrounded. This time Apple missed the deadline badly (at least on the Mini).

The large-screen iMac's replacement didn't show up until 14 months after the initial wave shipped, and almost two years after the announcement.

One reason the 'upper end' Intel minis stuck around so long is that the M1/M2 Mini was a substantial backslide in RAM capacity (the Intel model offered 64GB, which the M2 Pro variant still doesn't match; the M3 Pro won't either at 36GB). That didn't happen historically either.

For the x86 transition, Intel had a complete processor line-up for whatever Apple needed in a transition Mac. All of them. And a roadmap for the second and third generations. The Mac Pro is a bit 'stuck', not much differentiated from a Mac Studio Ultra in computational horsepower (there is a large I/O feature/value gap, but there is also more overlap). Apple really doesn't have a complete line-up. There is a glaring 'something bigger than Ultra' gap in the Mac Pro.


In previous eras, Apple asked users to pay for OS upgrades. In the current era, Apple takes the upgrade money upfront with the system sale. There is a larger pot of money for upgrades because it collects upgrade funds from every user, but it is also a fixed pot that is scheduled to be drained. That is a completely different funding scheme; there is no "just like last time" there either. If you take folks' money to do work and then don't do the work, that is usually a pretty good way to get yourself sued later. Taking the money after you have done the work is a different entanglement.


The x86 user base is about an order of magnitude larger than in the previous transitions (there were about 100M x86 Macs at the transition). That is going to be pretty hard to 'burn off' in just 3 years of upgrade sales. Hence the legacy/obsolete 5-7 year countdown clock has shifted down to a stricter 5 (or less in the MP 2019 case, since it was on an extended lifecycle before its sales were terminated).


In 2006-2008, Apple was expanding the ports of the Mach kernel from PPC to x86 and Arm. Going from one to three architectures. In 68K -> PPC it was a shift from one to two, and back down to one for the OS core. Here, Apple was already on Arm before the transition. In fact, the Arm side was paying the lion's share of the kernel development funds; it was LARGER than the Mac contribution. The filesystems had been merged (APFS) years before the transition, and 32-bit apps were chucked out the window years before the transition. This transition was going to a platform that Apple was already on. That looks nothing like the previous transitions. (The Newton and 'prehistoric' Arm are nothing like the current situation of shared iOS/iPadOS/macOS infrastructure versus macOS and NewtonOS.)


For x86, Apple didn't own the core of Rosetta. In this iteration they do. So it's not like 'last time' there either.



So we're probably looking at fall 2026 for an Apple silicon-only version of macOS; until then even though Intel macOS will ultimately be meant for 2019 Mac Pros only (that sold until spring last year), we're likely to have OCLP support for those versions on most EFI64 machines.

Apple isn't going to do a macOS release for Mac Pros only. It is too small a base to do anything on its own. The MP 2013 went obsolete faster than the normal 5-7 year schedule even while Intel was still supported. The MP 2019, whose lifecycle was abnormally extended, will likely die off short of 5 also. That has nothing particularly to do with the "3 years" history of previous transitions. It was already unusually 'old' when it got 'retired'. That is the root-cause issue, not the x86 transition.
 
  • Like
Reactions: pshufd

pc297

macrumors 6502
Sep 26, 2015
336
207
Precisely? You are quoting a counter-example to what has historically happened. Apple finished their transitions relatively quickly the other two times. The Intel transition finished in about 18 months, six months ahead of the self-imposed deadline, not six months after it. So saying 'it is happening just like last time' is ungrounded. This time Apple missed the deadline badly (at least on the Mini).

I am talking about the time between the end of a transition and macOS no longer supporting the previous architecture. For PPC->Intel this was about 3 years (Aug 2006, PM G5 discontinued - Aug 2009, 10.6 released). For m68k->PPC it was 2 1/2 years (April 1996, last m68k models discontinued - October 1998, Mac OS 8.5 released). So we're just trying to make a rough guesstimate as to when macOS will be Apple silicon-only here; "precisely" referred to the point being made, i.e. that Apple dragged its feet when it came to discontinuing the Intel mini, not to a precise timescale. And yes, the m68k->PPC transition itself took two years (March 1994 - April 1996).
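To put rough numbers on those gaps (dates as given above; this is a back-of-the-envelope extrapolation, not a prediction), a quick sketch:

```python
from datetime import date

def years_between(start: date, end: date) -> float:
    """Gap between two dates, in years rounded to one decimal."""
    return round((end - start).days / 365.25, 1)

# last old-arch model discontinued  ->  first macOS dropping that arch
print(years_between(date(1996, 4, 1), date(1998, 10, 1)))  # 68k->PPC: 2.5
print(years_between(date(2006, 8, 1), date(2009, 8, 1)))   # PPC->Intel: 3.0

# The same ~3-year gap applied to the Mac Pro 2019 (sold until mid-2023)
print(date(2023, 6, 1).year + 3)  # 2026
```

Which is how you land on fall 2026 as the rough guesstimate.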

One reason the 'upper end' Intel minis stuck around so long is that the M1/M2 Mini was a substantial backslide in RAM capacity (the Intel model offered 64GB, which the M2 Pro variant still doesn't match; the M3 Pro won't either at 36GB). That didn't happen historically either.

Tell that to anyone who tried to use m68k apps on the 6100 - most early PPC systems were outperformed by Quadras back in the day, as m68k apps stuck around for a long time AND compatibility was achieved via emulation, not via a binary translation layer like Rosetta. Let alone that the first Intel iMacs, minis and MacBooks (Pro) were 32-bit when the iMac and Power Mac G5 were 64-bit systems. Only the MP had an x64 CPU, BUT kernel mode was 32-bit due to EFI32, and that didn't change until Pike's EFI patch long after Lion.

For x86, Apple didn't own the core of Rosetta. In this iteration they do. So it's not like 'last time' there either.

Have they officially bought the rights from IBM - as in from PowerVM LX86, which IBM bought from Transitive Corp. as QuickTransit (which MS also used for x86 emulation on the Xbox 360, incidentally), which was licensed to Apple and which they subsequently adapted to do PPC->Intel (and now Intel->Arm)? All they likely had to do was modify the existing source for the SPARC->x86 binary translation layer into PPC->Intel for Rosetta 1, and in all likelihood port the x86->PPC layer to Arm, changing DWORDs to QWORDs in the process to support x64 instructions. I guess that a large chunk of the original QuickTransit team now working at Apple explains the seamless and rapid development of Rosetta 2, but I don't know that they actually own the rights per se.

Incidentally, you'll be amazed that PowerVM LX86 actually works on G5 systems under SUSE ppc; for all we know Apple could very well have ported it to Mac OS X PPC, but forwards compatibility isn't their middle name now, is it (neither is backwards compatibility, all things considered).

Point is, who knows: with the right frameworks, we might be able to prolong Rosetta 2's lifecycle on Apple silicon macOS once they pull the plug on Intel binaries, much like it is actually possible to run 32-bit apps on Catalina with Mojave libs and frameworks, or like it might very well have been possible to carry Rosetta over to Lion with the universal 10.6 libs and frameworks. All I hope is that the community will try to get the first Apple silicon-only betas to work on Intel as they pour out; i.e. if they still have some Intel code in them then it might be possible, as was done for 10.6 ppc, and I will certainly take part in any such endeavour. The final 10.6.8 kernel was still x86+x64+ppc, so let's hope this holds true come Apple silicon-only macOS.




In 2006-2008, Apple was expanding the ports of the Mach kernel from PPC to x86 and Arm.

Wasn't this (infamously) a BA thesis, but in 2010, for Arm Mac OS X? Fair enough, iOS is Darwin down to its core, but that work would have been done beforehand (as in long before the iPhone was showcased in Jan 2007); and Marklar dates back to 2002...


Apple isn't going to do a macOS release for Mac Pros only. It is too small a base to do anything on its own.

The Intel mini was still sold until January last year, and I'm sure you'll find it's a fairly common Mac in the consumer market.
 
Last edited:

Basic75

macrumors 68020
May 17, 2011
2,095
2,446
Europe
In 2006-2008, Apple was expanding the ports of the Mach kernel from PPC to x86 and Arm. Going from 1 to 3 architectures. In 68K -> PPC it was a shift from 1 to 2 and back down to 1 for the OS core.
Of course NeXTSTEP (including Mach) was already cross-platform, as it had been running on 68k, x86, SPARC and PA-RISC. Apple probably never cared much for SPARC or PA-RISC but, beyond this list already including 32-bit x86, supporting such diverse processor architectures must have meant that the code base was generally quite portable. Surely a huge head start when adding Arm support. What AFAIK wasn't covered by NeXTSTEP was 64-bit support. That might have been a can of worms, depending on how bad the code base was regarding anti-patterns like using int and long as interchangeable data types.
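That int/long anti-pattern is worth spelling out, since it is exactly what bites when moving from ILP32 to LP64. A small illustrative sketch (not NeXTSTEP code; it uses Python's ctypes just to read off the C data-model sizes on the machine it runs on):

```python
import ctypes

# Data-model sizes on this machine.  On LP64 (64-bit macOS/Linux):
# int=4, long=8, pointer=8.  On the ILP32 targets NeXTSTEP ran on,
# all three were 4 bytes, which is why treating int, long and
# pointers as interchangeable used to "work".
print("int:", ctypes.sizeof(ctypes.c_int),
      "long:", ctypes.sizeof(ctypes.c_long),
      "ptr:", ctypes.sizeof(ctypes.c_void_p))

def truncate_to_int(addr: int) -> int:
    """What storing an address in a 32-bit int keeps (unsigned view).

    Mirrors the effect of the classic C anti-pattern `(int)ptr`.
    """
    return addr & 0xFFFF_FFFF

hi = 0x1234_5678_9ABC  # a plausible 64-bit-era address
assert truncate_to_int(hi) != hi   # pointer silently mangled on LP64
lo = 0x0000_1234
assert truncate_to_int(lo) == lo   # fine while addresses fit in 32 bits
```

The C-side fix is to use `uintptr_t`/`intptr_t` (sized to hold a pointer on every data model) instead of int or long, which is presumably the kind of cleanup the 64-bit effort would have required.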
 
  • Like
Reactions: pc297

zorinlynx

macrumors G3
May 31, 2007
8,351
18,577
Florida, USA
The 6502-68k wasn't merely a processor transition but a complete shift from the Apple II line of computers to the Mac. Ever since the Mac's release in 1984, Apple focused more of its marketing on the Mac line than on the Apple II line, despite it taking 4-5 years before the Mac would begin to outsell the Apple II. Apple never really announced any kind of formal transition away from the Apple II line; rather, they just let it die a slow death of neglect. No new Apple II models were announced after the Apple //c Plus in 1988, and the IIgs had been on the market since 1986 with only minor changes. There were rumors of an updated version of that computer that would have included a built-in 3.5" floppy drive, an internal HD, and support for standard DIMM RAM expansion. As Apple neglected the development of the Apple II line, many schools in the '90s were starting to replace them with generic PC clones. To stem further market share losses, Apple introduced the IIe expansion card that could plug into the LC line of Macs so that educational customers would not have to completely abandon their Apple II software investments to move to the Mac. This was about the closest thing Apple did to formalizing any kind of transition from the Apple II to the Mac, but by the time the card was discontinued in 1995, most consumers and businesses had already moved on to PCs or Macs years earlier. Not to mention that Apple had even started the transition to PowerPC by that point.

The death of the Apple II has always been an interesting topic for me. The truth is that by the time the early 90s rolled around, the Apple II was a dead-end platform. The architecture was just not suited to extending it any further without turning it into something else, and the "something else" already existed, the Mac.

As fond as I was of the Apple II at the time, I eventually realized it was over and it was time to move on to something new. My only wish is that Apple had just announced a definitive end for the Apple II and their intention to move their focus to the Mac a lot sooner, instead of letting it languish on the line and giving Apple II users false hope for several years.

The Apple IIc Plus shouldn't have existed. The IIgs should have been the end of the line. Maybe keep selling IIe to schools a bit longer since they had a big software library.

I know this is off-topic but I find this fascinating and can talk about it a lot. :)
 
  • Like
Reactions: Chuckeee

pc297

macrumors 6502
Sep 26, 2015
336
207
The death of the Apple II has always been an interesting topic for me. The truth is that by the time the early 90s rolled around, the Apple II was a dead-end platform. The architecture was just not suited to extending it any further without turning it into something else, and the "something else" already existed, the Mac.

As fond as I was of the Apple II at the time, I eventually realized it was over and it was time to move on to something new. My only wish is that Apple had just announced a definitive end for the Apple II and their intention to move their focus to the Mac a lot sooner, instead of letting it languish on the line and giving Apple II users false hope for several years.

The Apple IIc Plus shouldn't have existed. The IIgs should have been the end of the line. Maybe keep selling IIe to schools a bit longer since they had a big software library.

I know this is off-topic but I find this fascinating and can talk about it a lot. :)
Well, at least that's the silver lining in this discussion: Apple did offer Apple II compatibility cards up to the LC 575 and Apple II support until 1993, so this is at least one precedent where Apple did not immediately let owners of a widespread, discontinued architecture down! But this was Apple before Jobs came back, with the clones around and Apple on course toward failing shortly thereafter as a result. And then again we hit this 3-year period between a model being discontinued (Apple IIc Plus, 1990) and the official end of support (1993).
 

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
One thing that does strike me as amusing is that the Arm processor took a massive amount of inspiration in its initial design from the 6502, particularly in the way it handled interrupts and aspects of memory addressing. Since there are no 32- or 64-bit versions of the 6502, the Arm architecture is probably the closest thing to a modern 6502. Or at least I like to pretend that it is.
Sorry to rain on your parade, but arm64 just isn't that. It's a clean sheet redesign which abandoned everything that made arm32 weird, quirky, and vaguely related to the 6502. Since Apple's Arm chips don't support arm32 any more, there's not even a faint memory of 6502 influence.

(That's a good thing too. It's 2024, we've learned quite a few things about CPU ISA design since the 1970s, and 6502 just isn't a sound foundation to build on any more.)
 

AdamBuker

macrumors regular
Mar 1, 2018
121
185
Sorry to rain on your parade, but arm64 just isn't that. It's a clean sheet redesign which abandoned everything that made arm32 weird, quirky, and vaguely related to the 6502. Since Apple's Arm chips don't support arm32 any more, there's not even a faint memory of 6502 influence.

(That's a good thing too. It's 2024, we've learned quite a few things about CPU ISA design since the 1970s, and 6502 just isn't a sound foundation to build on any more.)
I don’t think you read my last sentence there.
 
  • Like
Reactions: jdb8167