They should have at least let 32-bit console apps keep running: remove the GUI libraries but keep the main system libraries for 32-bit. This will stop a lot of development shops from upgrading.
 

When you have a minute, I'd like to get some more detail on that. There's no issue as of now with Xcode or the Intel compiler/Code Studio running in pure 64-bit mode.
 
Here's why Apple is cutting 32 bit apps.

All 32 bit apps run on the 10.4 Tiger runtime. A simplified explanation of what that means is that when you are loading a 32 bit application, you're running a chunk of Tiger to support that application. That means any 32 bit code, including Apple's own 32 bit libraries, are written in Tiger era code.

The Tiger runtime is ancient and there are a lot of features it doesn't support. Obj-C 2.0 came out in 2006 and has a lot of nice features that aren't supported well in the 32 bit/Tiger runtime. Obj-C 1.0 applications also don't do well with changes to libraries, which means Apple has to be extra careful when maintaining the 32 bit libraries. It adds limits to how much Apple can upgrade Cocoa.
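
Here's a rough plain-C sketch of the kind of library-change problem being described; this is my own illustration (the struct and its name are invented), but the old 32-bit Obj-C runtime has essentially the same issue with instance variable layouts:

    /* fragile_layout.c - a plain-C analogy for the "fragile ivar" problem
     * in the old 32-bit Objective-C runtime. The struct and its name are
     * invented purely for illustration. */
    #include <stdio.h>

    /* Pretend this struct is defined by a system framework ("version 1"). */
    struct WindowLike {
        int x, y, width, height;
    };

    int main(void)
    {
        /* An app compiled against version 1 bakes sizeof(struct WindowLike)
         * and the field offsets into its own binary. If version 2 of the
         * framework adds a field before these, or the app appended its own
         * fields right after them, those baked-in numbers are all wrong and
         * memory gets silently corrupted - unless the app is recompiled
         * against the new headers. */
        printf("this app thinks the window struct is %zu bytes\n",
               sizeof(struct WindowLike));
        return 0;
    }

The 64-bit runtime avoids this with non-fragile ivars: offsets are resolved when the app loads rather than baked in at compile time, which is exactly the flexibility the Tiger-era 32-bit runtime lacks.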

64 bit applications are based on the Leopard 10.5 runtime. It may not sound like much, but a lot was added in 2 years, and it's a much easier runtime to upgrade and support. Carbon is also 32 bit only, which is another thing Apple would like to drop.

Could Apple build a modern 32 bit runtime? Yes. But changing the runtime badly breaks applications, so every 32 bit application would have to be recompiled anyway, which mostly defeats the point.

Dropping the Tiger runtime means the entire OS only has to run the Leopard runtime, which also means the OS uses less RAM and your computer might even run a bit faster. In Mojave, if you mix 64 bit and 32 bit apps, both runtimes and their libraries have to sit in RAM at the same time. It's like running an extra WINE-like layer to run 32 bit apps.

It's like Rosetta back in the day. Maybe Apple should have put in the work to keep 32 bit, but there are a lot of reasons to drop it. Microsoft does something similar on Windows called "Windows 32-bit on Windows 64-bit" which grafts older Windows on top of newer Windows. They've been willing to do the work, but they have the same issues to deal with.

They recently introduced a Mojave based runtime for Swift applications, and ARM apps could also get a newer runtime locked to whatever OS version adds ARM apps.

Fun note: 64 bit PowerPC apps used the 10.5 Leopard runtime, which actually made them more modern than the 32 bit Intel apps on the Tiger runtime.
 

And I will add two more hardware points to this. The T2 is a fact of life in the Mac going forward. It is a modified A10-class chip and, as such, 64-bit (and only 64-bit). If someone hasn't seen a bridgeOS error in their log, they are gifted or providence is upon them. Why? Because as long as there is context switching between two separate system-level chips, where one handles processing and the other handles I/O, specialized video processing, audio, and so on, and you then add a software OS layer and a 32-bit subsystem on top, that context switch carries the likelihood, even the promise, of an error. It's not a question of if, but of when it will occur. Second point: the strategy with Metal hinges on having large pipelines and data paths to function as designed. If you pull up one of the WWDC demos, in particular the ray-tracing one, and run it in 32-bit versus 64-bit mode, the performance is night and day, with the 32-bit version behaving like POV-Ray and the 64-bit version behaving like a midrange RTX-class card. It's a software/hardware strategy.
 

Well said, both of you.
Then, for customers, the choice is simple: either hang on to old software and/or old hardware, knowing that software updates/upgrades are not an option, or move to another platform (Windows/Linux) for a 32-bit alternative. I will be one of those who keep using the old macOS and MacBook for as long as I can. Everyone is happy in the end, I think.
 
nothing 32-bit will run, there's simply no architecture to support it.
Technically this is not true, but for practical purposes yes.

x86_64, amd64, or whatever you call it, is an extension of the "32-bit" i386 instruction set architecture. How it works is that it puts the CPU in a different mode where most instructions behave and look the same as in 32-bit mode; the differences are more registers and, of course, that they are now 64 bits wide. But the instructions look and work the same, except for a few with slightly different behavior.

So why are there different versions of software then? Different calling conventions are the main culprit. Calling conventions define how subroutines are called and how arguments and return values are passed. To take advantage of the extra registers in 64-bit mode and thus speed up calls by avoiding expensive writes to memory (32-bit calling conventions generally passed most data on the stack instead), a new ABI with a new kind of calling convention was introduced for 64-bit.
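
As a rough illustration of the calling-convention difference (my own sketch, nothing Apple-specific): the same C source produces two incompatible call sequences depending on which ABI you target.

    /* add3.c - the same C source, two incompatible call sequences.
     *
     * Built for i386 (the old 32-bit ABI), the caller pushes a, b and c on
     * the stack and add3 reads them back from the stack.
     * Built for x86_64, the caller puts them in the rdi, rsi and rdx
     * registers instead, and sizeof(long) grows from 4 to 8 bytes.
     * Same instruction family, mutually unintelligible binaries. */
    #include <stdio.h>

    long add3(long a, long b, long c)
    {
        return a + b + c;
    }

    int main(void)
    {
        printf("sizeof(long) = %zu, add3(1,2,3) = %ld\n",
               sizeof(long), add3(1, 2, 3));
        return 0;
    }

Build it twice (with -m32 and -m64, where a 32-bit toolchain is still available) and compare the disassembly of the call to add3: stack pushes in one case, register moves in the other. That difference, not the instruction set itself, is why a 64-bit libSystem can't serve a 32-bit app.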

This means there is incompatibility between binaries, even though the instruction set architecture is the same. Technically one could say that if a piece of software does not call any external functions, assembly written as 32-bit code will work whether it runs in 32-bit or 64-bit mode (a few instructions behave a bit differently in 64-bit mode, and some become no-ops, so there are caveats that make this exercise theoretical).

The way it is refused in practice is that the OS flags what is 32-bit and what is 64-bit, so pieces that are 32-bit are marked as not executable; in principle you could still copy the code into memory and JMP to it.
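
For the curious, here's a small sketch of what that flagging looks like at the file level (my own illustration, not how the kernel actually decides): every Mach-O binary starts with a magic number that says whether it's a 32-bit, 64-bit, or fat (multi-architecture) file, and that is the first thing anything loading it looks at.

    /* macho_arch.c - rough sketch: read the magic number at the start of a
     * Mach-O file to see whether it is 32-bit, 64-bit or a fat binary.
     * The real loader does far more than this; it's only an illustration. */
    #include <stdio.h>
    #include <stdint.h>
    #include <mach-o/loader.h>   /* MH_MAGIC, MH_MAGIC_64, MH_CIGAM, MH_CIGAM_64 */
    #include <mach-o/fat.h>      /* FAT_MAGIC, FAT_CIGAM */

    int main(int argc, char *argv[])
    {
        if (argc < 2) {
            fprintf(stderr, "usage: %s <binary>\n", argv[0]);
            return 1;
        }
        FILE *f = fopen(argv[1], "rb");
        if (!f) {
            perror("fopen");
            return 1;
        }
        uint32_t magic = 0;
        if (fread(&magic, sizeof magic, 1, f) != 1) {
            fclose(f);
            return 1;
        }
        fclose(f);

        switch (magic) {
        case MH_MAGIC:    case MH_CIGAM:    puts("32-bit Mach-O");           break;
        case MH_MAGIC_64: case MH_CIGAM_64: puts("64-bit Mach-O");           break;
        case FAT_MAGIC:   case FAT_CIGAM:   puts("fat (multi-arch) binary"); break;
        default:          puts("not a Mach-O file");                         break;
        }
        return 0;
    }

Pointing it at the executable inside an old app bundle is a quick way to see why Catalina refuses to launch it.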
 
Personally, I am of the stance that you use what works for you, and you buy equipment and invest in what meets your goals.
Whether it's 16/32/64-bit, Windows or Mac or whatever platform: do the research, find what meets the goal, and use that. I am not advocating ditching anything for something else just for progress's sake. I have a lot of faith in the strategy they are following right now, because it will provide the capabilities that suit my goals and it's not vaporware - they have been able to demonstrate where and how it will happen, not just project what could be. That's unfortunately a rarity in the current tech sector, and I applaud it.
 

You are correct, but there is no meaningful use for it, as per your own statement that x86_64 just puts the CPU in a different mode where most instructions behave the same, with more and wider registers.

They could do this, but it would again be a meaningless hack with no purpose and additional support requirements and costs. You're making the DOS argument, and that was settled a long time ago. The slight differences you mention are still limited by context switching and are unnecessary at this point for stability and performance purposes. Again, see the deprecation list and the ongoing list of 32-bit subsystems being replaced or refreshed with newer ones, as with AVFoundation vs. QuickTime. The time to make this argument was maybe 2010; it's 10 years later, and there has been ample opportunity for developers to adapt to those changes.
 
Hard to believe that after over a decade, 32-bit is still going fairly strong across the board. Maybe if AMD and Intel had cut 32-bit hardware support back in 2006, we would not be having this discussion today.
 

That's the thing. There is no discussion to be had; Catalina is the band-aid being ripped off and a fresh start on the architecture. Leaving behind the devs that aren't willing to do basic updates and recompiles isn't a high price to pay for the support and ecosystem improvements it will afford. It's why I have always advocated that it's the customers' responsibility to force the vendors to provide updates - it is their responsibility, not Apple's.
 
Indeed, though here comes the "if it ain't broke, don't fix it" mindset.
Given there are still many people who don't even realise what generation their phone is, I don't have high hopes of customers pushing software vendors to update their software to support 64-bit natively.
Are we architecting our own "demise"? I have no idea.
 

Short answer is yes. Longer answer is that nothing forces anyone to abandon anything they use now; all that has to happen is they stay where they are and continue as is. There was a similar debate about DOS and Windows 25 or so years ago. Microsoft bent the knee, and their ecosystem suffered for another 20 years before they forced the issue and moved on. This isn't a surprise change, just a shorter envelope: 10 years instead of 20.
 