On the whole, I agree with what Linus Torvalds has to say on the subject. His point being that there isn’t really a good path for Arm to win in the cloud/deployment space incrementally. The benefits of homogeneity make it difficult for alternatives to become competitive.

I completely agree with Linus on this. The biggest hurdle to ARM deployment is that the entire dev space runs on x86. Which is exactly my point - Apple just might have enough mass to change it. New Macs - that's your ARM development box right there. And if they perform well, other computer manufacturers will join.
 
Perhaps Apple has enough influence over the devops space to encourage this sort of shift, but it seems like a long shot to me. Microsoft is certainly working hard right now with WSL to lure developers back to Windows, and Linux on the desktop gets marginally better with each release. It’s anyone’s guess where things will land once the dust settles.

On the whole, I agree with what Linus Torvalds has to say on the subject. His point being that there isn’t really a good path for Arm to win in the cloud/deployment space incrementally. The benefits of homogeneity make it difficult for alternatives to become competitive.

edit to add: not sure why my link above isn’t working on mobile. If it doesn’t link to the post for you try:


WSL isn't as big a draw as Microsoft would have you believe though. According to this report, worldwide adoption is only around 150,000 people:


Part of this is because of the serious limitations of WSL, and Microsoft's insistence on controlling exactly what you can run through it (e.g., no GUI, limited functionality within the environment, no GPU access).


There are also multiple reports of Microsoft "Googlebombing" to pump up the reputation of WSL, even in the wake of a major data-destroying bug in the September 2020 update:


In short, Microsoft is inflating the numbers to make WSL appear to be a more attractive alternative for developers than it is in practice.
 
WSL isn't as big a draw as Microsoft would have you believe though.

I’m just basing my comments on my own experience. We’ve got several developers at my company who are using Windows desktops and WSL with some success. They seem happy enough with it, and are able to build our toolchain without any more difficulty than our macOS and Linux based developers have. I’ve played around with it on my gaming machine at home, and it seems “ok” to me, but I admittedly haven’t pushed it very hard.

I considered it for my last upgrade - the hardware pricing is certainly alluring - but in the end I went with a Mac Pro despite the significantly higher cost. I hope I can get a three-year buffer with the Mac Pro so that the macOS landscape will have settled (one way or the other) by the time this box is amortized and it’s time for my next upgrade. Minimally, I figure Linux will surely be better by then.

There’s basically a zero percent chance our production infrastructure will be anything other than x86 in that timeframe, although I don’t make those decisions so I couldn’t say for sure.
 
I completely agree with Linus on this. The biggest hurdle to ARM deployment is that the entire dev space runs on x86. Which is exactly my point - Apple just might have enough mass to change it. New Macs - that's your ARM development box right there. And if they perform well, other computer manufacturers will join.

I guess the dilemma we face at my company is how a developer could realistically advance that strategy. I don’t see a path for Arm to reach a critical mass of adoption incrementally. When would it ever make sense for a company to decide to transition their production infrastructure and their development environment all at once, en masse, in order to leap the migration hurdles involved? Arm would have to present a dramatically compelling advantage over x86, and that doesn’t really seem like where we are headed currently.
 
I guess the dilemma we face at my company is how a developer could realistically advance that strategy. I don’t see a path for Arm to reach a critical mass of adoption incrementally. When would it ever make sense for a company to decide to transition their production infrastructure and their development environment all at once, en masse, in order to leap the migration hurdles involved? Arm would have to present a dramatically compelling advantage over x86, and that doesn’t really seem like where we are headed currently.

It will most likely start with smaller companies and sub-projects. Incremental migration is possible in many cases - a lot of systems are modular. But I definitely agree with you that it won’t happen overnight - if it happens at all.

By the way, talking about compelling advantages... one of them might be getting to keep using Macs :) Apple made it very clear that Intel Macs are the end of the line. Developers who want to keep using Macs will start looking at ARM deployment.
 
Did you Google that just now?
Translation 'I got nothing.' :)

There’s basically a zero percent chance our production infrastructure will be anything other than x86 in that timeframe, although I don’t make those decisions so I couldn’t say for sure.

That doesn't surprise me, as I have personally seen companies hold on to old ways even when they were way past their use-by date. For example, one place I worked at was using a program that ran on a version of Unix that predated DOS (i.e., older than 1981) for no other reason than 'it worked' and the company had paid a pretty penny for it back in the day. There was a change-up at the upper levels and suddenly we were on the Windows 7 version, with about two months of training.

Another example comes from my master's thesis (1990s) regarding cheap museum computerization. One of the museums was using a PC with an Intel 8088 to categorize the collection. The joke there was that the computer shouldn't have been categorizing the collection but rather should have been in the collection. The only thing older was the museum that had a Kaypro, but it was so old even DOS 1.0 wouldn't run on it, and being a government museum they couldn't throw it away... and so they used it as a door stop.

As a side note, the software darling back then for museums was minsmy (sp), which was $100,000 a copy. There was a cheaper version at $10,000, but for small museums that was still too much. As a result I turned my thesis into a presentation (complete with booklet) for the Mountain Plains Museums Annual Meeting (10/05/1995) and the Museum Association of Arizona Annual Meeting (05/19/1996) on behalf of my university, NMSU. Each 90-minute presentation ran to two hours, as the museum representatives were desperate for cheap alternatives to what was available, and I showed them how, with off-the-shelf software on both Mac and PC, they could do what needed to be done without investing in specialty software. I even wound up sending a copy off to a museum that had heard of the presentation but wasn't part of either of those two groups (IIRC it was in Hawaii, of all places).

It was that experience that made me very wary of people saying they "need" some piece of specialty software.
 
Translation 'I got nothing.' :)

It wasn’t a serious suggestion so you didn’t get a serious response. Sorry if that disappoints you. I’m happy to have a conversation with a developer who is also invested in this technology, but it’s just not possible to have a productive discussion with a Google search result page. Livecode isn’t even the same class of software that we’ve been discussing here.
 
It wasn’t a serious suggestion so you didn’t get a serious response. Sorry if that disappoints you. I’m happy to have a conversation with a developer who is also invested in this technology, but it’s just not possible to have a productive discussion with a Google search result page. Livecode isn’t even the same class of software that we’ve been discussing here.
This just translates to either 'I can't provide real counter arguments' or 'I can't break it down so non experts can understand'.

The main reason I brought up LiveCode was it is the successor to Hypercard which was one of the pieces of software in my Mountain Plains Museums Annual Meeting (10/05/1995) and Museum Association of Arizona Annual Meeting (05/19/1996) presentations regarding how to computerize a museum on little to no budget.
 
Now I don't know how long Apple will support Swift's ability to generate x86 code, but with Universal 2 I would guess a reasonable amount of time.

Swift is an open source project built on LLVM and Clang, which are themselves open source projects. LLVM/Clang is used for all sorts of things: Google uses it to build Android and Chrome for Windows, and Linux developers on x86 rely on it all over the place.

Apple won't ever be killing x86 support for Swift, because it's made up of so many elements they don't directly control. They can't kill it. Swift, and specifically the LLVM/Clang toolchain, are used for so many x86 projects on so many platforms that it will never die.

Heck, LLVM and Clang still support MIPS and PowerPC output; I think Itanium is even supported. Even though Clang was started at Apple and Apple sponsors both projects, these tools support things way beyond Apple's needs.

None of this has anything in any way to do with containers though. Swift always does fresh builds for the target CPU. Nothing it does runs programs in a container.
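The front-end/IR/back-end split described above is why one target's fate doesn't drag the others down: every language front end lowers to one shared IR, and every target back end consumes that IR, so removing one back end never touches a front end. A toy Python model of that arrangement (all names here are illustrative, not LLVM's actual API):

```python
# Toy sketch of the front-end / IR / back-end split that LLVM uses.

def toy_frontend(source: str) -> list[str]:
    """'Lower' a fake one-line language into a fake IR (a list of ops)."""
    # e.g. "add 1 2" -> ["push 1", "push 2", "add"]
    op, *args = source.split()
    return [f"push {a}" for a in args] + [op]

BACKENDS = {
    # Each back end renders the *same* IR for a different target.
    "x86_64":  lambda ir: "; ".join(f"x86:{op}" for op in ir),
    "aarch64": lambda ir: "; ".join(f"arm:{op}" for op in ir),
    "mips":    lambda ir: "; ".join(f"mips:{op}" for op in ir),
}

def compile_for(source: str, triple: str) -> str:
    ir = toy_frontend(source)       # front end: source -> IR
    return BACKENDS[triple](ir)     # back end: IR -> target code

if __name__ == "__main__":
    for target in BACKENDS:
        print(target, "->", compile_for("add 1 2", target))
```

Because the IR is the only contract, deleting the "mips" entry wouldn't require changing `toy_frontend` at all - which is why an x86 back end can outlive any one sponsor's interest in it.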
 
Perhaps, but an in-depth treatise on my work environment isn't really on topic for this thread, and @Maximara's combative attitude hasn't really led me to think that he has any actual curiosity or interest beyond just shouting down my experiences. I didn't see much point to spending the effort explaining my situation for someone who clearly doesn't actually care.

Of course I'm quite familiar with multi-arch Docker images. I actually do make use of them routinely. But the reality is that you don't have to stray very far in a Docker workflow to run into performance and compatibility challenges, even when dealing with an "interpreted" language like Python. We've had issues with Python 3 code resisting cross-platform operation with Tensorflow libraries (which are C accelerated) as well as intractable performance issues that make it unreliable to have developers building and testing on one platform for eventual deployment on another platform. My organization is using, producing, and deploying docker images written in Golang, Python, C++, and Haskell. Probably a few other languages I'm forgetting. All being deployed to an x64 Kubernetes infrastructure.

Often in production our developers have a need to download the exact containers that are running in the cluster to pull locally for dtrace and debugging runs, to nail down peculiar and difficult to diagnose bugs.

Even if 95% of things can be made to work predictably and reliably in a multi-architecture environment, the impact of that last 5% is enough to push an organization to consolidate on a single architecture. It's cheaper with much less risk.
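That last 5% usually surfaces as runtime surprises rather than build failures. A minimal Python sketch of the kind of architecture guard that starts creeping into a multi-arch codebase (the backend names are hypothetical placeholders, not real libraries):

```python
import platform

def native_arch() -> str:
    """Normalize platform.machine() to a Docker-style arch name."""
    machine = platform.machine().lower()
    # Different OSes report different strings for the same silicon.
    aliases = {
        "x86_64": "amd64", "amd64": "amd64",
        "aarch64": "arm64", "arm64": "arm64",
    }
    return aliases.get(machine, machine)

def pick_accelerated_backend() -> str:
    """Choose a compute backend; the names are made up for illustration."""
    arch = native_arch()
    if arch == "amd64":
        return "avx2-kernels"      # an x86-only accelerated path
    if arch == "arm64":
        return "neon-kernels"      # a NEON path, if one exists at all
    return "pure-python-fallback"  # the slow path nobody benchmarks

if __name__ == "__main__":
    print(native_arch(), pick_accelerated_backend())
```

Every branch like this is a spot where one architecture gets exercised daily and the other only in CI, if at all - exactly the consolidation pressure described above.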

Suggestions that we just re-write everything in Swift are not really tethered to reality.

Thank you. That was very helpful. As you say, it's often the last 5% where all the effort goes...
 
This just translates to either 'I can't provide real counter arguments' or 'I can't break it down so non experts can understand'.

There’s an even simpler explanation. I put as much energy into my response as you put into the post suggesting livecode. If I had any faith that you were genuinely interested in a conversation I’d be more willing to put in the effort.

Again, how can you not know this?!
 
There’s an even simpler explanation. I put as much energy into my response as you put into the post suggesting livecode. If I had any faith that you were genuinely interested in a conversation I’d be more willing to put in the effort.
You're projecting. I clearly stated why I used the Livecode example, I have shown there are ARM Docker components, and I have shown that Swift is an alternative (especially if you want to develop for iOS, which is ~25% of world marketshare).

I mean, people don't take the time to write stuff like Xcode for Windows (12 Ways to Build iOS Apps on PC) unless there is a viable, reasonably profitable venture in doing it.
 
You're projecting. I clearly stated why I used the Livecode example, I have shown there are ARM Docker components, and I have shown that Swift is an alternative (especially if you want to develop for iOS, which is ~25% of world marketshare).

I mean, people don't take the time to write stuff like Xcode for Windows (12 Ways to Build iOS Apps on PC) unless there is a viable, reasonably profitable venture in doing it.

My job does not in any way involve writing user facing apps. Your advice is perhaps well intentioned, but is not helpful. I’d also note that in all of our dealings here in this thread you have not actually asked me a single question, only demanded that I explain why I disagree with you. I don’t owe you any explanations and you don’t actually appear to care. I feel like you are just treating this thread as verbal combat, and that is not something I will be spending my time on. You’ve only been aggressive and inflammatory with me.

You don’t want to learn, you just want to feel like you won.

Cheers.
 
My job does not in any way involve writing user facing apps. Your advice is perhaps well intentioned, but is not helpful. I’d also note that in all of our dealings here in this thread you have not actually asked me a single question, only demanded that I explain why I disagree with you. I don’t owe you any explanations and you don’t actually appear to care. I feel like you are just treating this thread as verbal combat, and that is not something I will be spending my time on. You’ve only been aggressive and inflammatory with me.

Found this in another thread:
Intel has been kind of a mess. Right now they support 128, 256, and 512 bits. Some architectures only support 256. Very old architectures only support 128, but some of them still exist in inexpensive units that are sold today. The 128-bit types have two kinds of instructions: the newer VEX-prefixed three-operand ones and the older two-operand ones. Intrinsics do an imperfect job of handling this, as the correspondence between VEX-prefixed and non-prefixed instructions isn't perfect, even for 128-bit widths.

This is the major downside of all that backward compatibility everyone is on about. ARM allows a break from all that insanity and a streamlining of code. Also, with PC hardware margins as thin as they are, ARM allows one to reduce (or, one would hope, eliminate) all the bloat and adware the average PC ships with. With the desktop market effectively stagnant, the money is in the laptop/mobile market, and for the mobile market ARM is where it's at, be it Android or iOS. In the laptop space, too, ARM has some clear advantages over Intel. Heck, I'm not even sure the desktop as we know it today is long for this world.

I am reminded of when Blockbuster turned down an option to buy Netflix. Which of those two companies is still around today? Sometimes you have to look beyond the here and now and realize the market you are aiming for has a limited shelf life and take steps to make the transition. Otherwise you risk joining the likes of Blockbuster.
 
Now I don't know how long Apple will support Swift's ability to generate x86 code, but with Universal 2 I would guess a reasonable amount of time.

Are we talking the build for Darwin or the OSS project? Since Swift uses LLVM for machine code generation, it’s also more about what Apple’s build of LLVM supports, rather than Swift itself.

I’d imagine that Apple’s toolchain will stop building the x86 back end around the time they fully drop support for x86 and OS releases stop coming out for it. The OSS projects, on the other hand, won’t go in and yank all this stuff out. There have been surprising community efforts to enable support for platforms (hosting and targeting) that Apple honestly doesn’t care about. Apple accepts those changes today, and the folks running the OSS project itself see Swift as a language front-end to clang and LLVM. It’s a little weird having to juggle Apple’s needs and supporting the community adding support for Windows, Android, PowerPC (not Mac) and even stuff like the Raspberry Pi. But they do it.
 
Apple accepts those changes today, and the folks running the OSS project itself see Swift as a language front-end to clang and LLVM. It’s a little weird having to juggle Apple’s needs and supporting the community adding support for Windows, Android, PowerPC (not Mac) and even stuff like the Raspberry Pi. But they do it.

Apple would have their own configuration, since LLVM is effectively just a set of compiler libraries and Clang is highly modular in itself. Apple isn't the only sponsor though.


Also, bits of Intel's influence have made their way into the LLVM IR. The following are almost copies of the Intel equivalents, and Intel is the only one that uses this particular implementation. It's partly a side effect of using 256-bit variables with only 16 available names. They try to fuse the first or last instruction with a memory operation.

 
I like that line! Permission to use it for similar internet “debates”?

@Maximara - my comment wasn't directed at you...so please don't take offence :) I just liked the comment because we all know it's often true.

We are all familiar with how online discussion can degenerate into a "contest" to prove a point... I'm as guilty as anyone of enjoying the satisfaction of being seen to be "right". But we would all do better to listen to other opinions, however incorrect or misguided, and try to make the experience an educational one.
 
@Maximara - my comment wasn't directed at you...so please don't take offence :) I just liked the comment because we all know it's often true.

We are all familiar with how online discussion can degenerate into a "contest" to prove a point... I'm as guilty as anyone of enjoying the satisfaction of being seen to be "right". But we would all do better to listen to other opinions, however incorrect or misguided, and try to make the experience an educational one.
That last part is so important. It is why I provided links/references to videos/articles to support my position. Perhaps LiveCode and Swift are not the best options for x86/ARM dual binaries, and the ARM version of the Docker resource pool (for lack of a better name) is not as large as the Intel one, but trying to effectively ignore their existence is not going to change the fact that they do exist as ways to transition from x86 to ARM.

Sure, they are kind of useless for internally made or highly niche software, but my view is: with the programs now available, is doing such things in-house, or using such highly niche software, really that good an idea anymore?
 
Apple would have their own configuration, since LLVM is effectively just a set of compiler libraries and Clang is highly modular in itself. Apple isn't the only sponsor though.

When talking about LLVM on Darwin yes. I did mention that.

I was talking about the Swift compiler in this quoted passage though. That project is run by Apple, and my point is that Swift is envisioned as another front end to LLVM, not *just* a language for Apple platforms. Although Apple’s priority is in the language itself and Apple platforms, along with Linux as the “other platform core engineers must not break”.
 
Sure, they are kind of useless for internally made or highly niche software, but my view is: with the programs now available, is doing such things in-house, or using such highly niche software, really that good an idea anymore?

The vast, vast, vast majority of software developers are writing software which is not intended for end users to run. When it comes to lines-of-code written, or the number of employed software developers it's not even remotely close. Internal, back-end, and what you call "niche" development is in fact the overwhelming majority.

Swift and Livecode are perfectly fine platforms if you want to take an application you've written and produce it for multiple platforms among a user base that has different kinds of devices or operating systems. That's the opposite of what my job is (and the majority of software developers working today). I have a homogenous, singular, unified "market" which is our internal production infrastructure and a developer environment which is chosen to be compatible with the production infrastructure. Any discussion of cross-platform development is the reverse of what you are imagining it to be. We would need to introduce cross-platform development not in the service of a varied deployment target, but rather to support variety among developer environments. There's a strong economic and business case to simply avoid that risk and expense by mandating x86 for developer machines. This means that Apple Silicon Macs will not be suitable for our developers because without performant x64 emulation there's no reasonable way to incorporate them into the development workflow we use. Any efforts to do so would incur risks and costs which far outweigh whatever marginal benefit we'd receive from those aspects of Arm which are technically superior to x64.

When it comes to cloud and in house production systems, x64 enjoys a near universal monopoly. Arm cloud resources are available, but they're like a fart in a tornado when it comes to mindshare and marketshare. That's not going to change any time soon because there's just not any pressure to do so.

Nobody is ignoring the tools you have discovered in your hurried armchair research. They're just not relevant to the argument you're trying to keep alive. You seem to have a strong emotional connection to this otherwise anodyne discussion about CPU architectures. I'd hope that most of us do not share this confounding influence. I don't have any loyalty or fondness for any CPU instruction set. I just have a job to do and a desire to choose the tools which place the fewest barriers in between me and success. Arm is not the right tool today.

Be curious, not judgmental.
 
The vast, vast, vast majority of software developers are writing software which is not intended for end users to run. When it comes to lines-of-code written, or the number of employed software developers it's not even remotely close. Internal, back-end, and what you call "niche" development is in fact the overwhelming majority.

Swift and Livecode are perfectly fine platforms if you want to take an application you've written and produce it for multiple platforms among a user base that has different kinds of devices or operating systems. That's the opposite of what my job is (and the majority of software developers working today). I have a homogenous, singular, unified "market" which is our internal production infrastructure and a developer environment which is chosen to be compatible with the production infrastructure. Any discussion of cross-platform development is the reverse of what you are imagining it to be. We would need to introduce cross-platform development not in the service of a varied deployment target, but rather to support variety

We have seen DRM methods so sensitive that simply changing the graphics card is enough to set them off. Let's face it, there is a lot of variety within the x86 architecture, so I seriously doubt a truly "homogenous, singular, unified" market is anything but niche.

Case in point is Apple's old Star Trek project. One of the many reasons it failed was that it was just too sensitive to hardware variety and would only run on a "homogenous, singular, unified" set of hardware.
 
We have seen DRM methods so sensitive that simply changing the graphics card is enough to set them off. Let's face it, there is a lot of variety within the x86 architecture, so I seriously doubt a truly "homogenous, singular, unified" market is anything but niche.

Case in point is Apple's old Star Trek project. One of the many reasons it failed was that it was just too sensitive to hardware variety and would only run on a "homogenous, singular, unified" set of hardware.

You're really reaching here. Stop trying to Gish Gallop me and just ask questions if you're curious. You're still looking at this subject from the opposite direction. Against my better judgement, I put time and energy into a thoughtful response and you apparently just skimmed it for keywords so that you could attack back with some thin, specious argument. It's disappointing.
 
You're really reaching here. Stop trying to Gish Gallop me and just ask questions if you're curious. You're still looking at this subject from the opposite direction. Against my better judgement, I put time and energy into a thoughtful response and you apparently just skimmed it for keywords so that you could attack back with some thin, specious argument. It's disappointing.
Its that "without performant x64 emulation" that stood out in all that. There is Rosetta 2 which is not "emulation" in the way the word tends to be used but rather translates x86_64 processor instructions to something the AS can use. Its akin to the difference between Sheepscaver to run old PowerPC code vs VirtualBox which doesn't change the CPU calls (if the replies to it being able to do its magic on the coming ARM Macs is any guide)
 
Its that "without performant x64 emulation" that stood out in all that. There is Rosetta 2 which is not "emulation" in the way the word tends to be used but rather translates x86_64 processor instructions to something the AS can use. Its akin to the difference between Sheepscaver to run old PowerPC code vs VirtualBox which doesn't change the CPU calls (if the replies to it being able to do its magic on the coming ARM Macs is any guide)
How does this relate to the subject we are discussing? Be specific.
 