What I worry about is: while 85% of the market is dominated by x86-based machines, why would all those big software companies want to join the party? I'm not a software engineer, but I don't believe that's an easy job at all.
You're over-estimating the amount of work required to support different CPUs under the same operating system - which is
tiny compared to the effort needed to port applications between operating systems, and Apple has made it easy by providing the compiler technology, universal binaries etc. If a developer already thinks it is worth supporting the Mac then they will make the change - any that drop out because of ARM were
probably on the verge of giving up anyway. The recent switch from 32-bit to 64-bit was probably a bigger deal.
Of course, there will be a few exceptions where
part of an application is somehow dependent on Intel and has to be re-thought, and maybe some log-jams where one developer depends on another making an ARM version of their library or plug-in - and inevitably a few casualties - but Apple have pulled this off... let's see... 6502 -> 68k**, 68k -> PPC, "Classic" MacOS -> OS X (forget CPU switches -
that was the biggie!), PPC -> x86-32, x86-32 -> x86-64...
five times, and although that doesn't guarantee that they can pull off #6, it is still a pretty good plausibility argument.
The Windows/x86 world is hamstrung by a huge, conservative corporate sector that expects their bespoke binaries from 1995 to run in 2020 - even the one big switch (from DOS/Windows to Windows NT) took years and NT had to bend over backwards to provide a compatibility layer, which still hasn't gone away. If it were just down to a few "Top Dogs" selling "public" applications then Windows on ARM, Alpha, Sparc... maybe even Itanium (although that had other problems) might already be a thing.
The Mac has "cleaned house" every 10 years or so - the legacy-support expectation just isn't there to the same degree as on Windows.
(** I justify including this by comparison with Windows/x86, that has an almost unbroken legacy stretching back to the days of CP/M, a contemporary of the Apple ][)
Where ARM can indeed save some complexity compared to x86: the decode logic (ARM instructions are fixed-length and have a more straightforward format), and no need to support legacy crap - x86 CPUs have to carry all the obsolete stuff like the 32-bit ISA, x87 floating-point operations and so on, whereas ARM simply has less work to do here.
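To see why fixed-length encoding makes the decoder cheaper, here's a toy sketch (hypothetical encodings, not real ARM or x86 formats): with a fixed 4-byte width, every instruction boundary is known up front, so a wide decoder can attack many instructions in parallel - with variable lengths, each boundary depends on decoding its predecessor first.

```python
def fixed_length_boundaries(code: bytes, width: int = 4) -> list[int]:
    # Every boundary is a multiple of the width - known in advance,
    # so all instructions can be decoded independently/in parallel.
    return list(range(0, len(code), width))

def variable_length_boundaries(code: bytes) -> list[int]:
    # Hypothetical encoding: the first byte of each instruction holds
    # its total length (must be >= 1). We have to walk the stream
    # serially, because boundary N depends on decoding instruction N-1.
    boundaries, pos = [], 0
    while pos < len(code):
        boundaries.append(pos)
        pos += code[pos]
    return boundaries

print(fixed_length_boundaries(bytes(16)))                    # [0, 4, 8, 12]
print(variable_length_boundaries(bytes([2, 0, 3, 0, 0, 1]))) # [0, 2, 5]
```

Real x86 decoders solve this with speculative decode at every byte offset plus length-marking logic - exactly the kind of extra silicon the paragraph above is talking about.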
All that extra complexity takes up space on the die and uses power (= heat, which rises rapidly with clock speed) - both huge factors in CPU design (even desktop/workstation chips have to worry about heat dissipation). It's not about some race between the ARM and x86 instruction sets in a theoretical scenario where everything else is equal (whatever that means) - in the real world, as long as ARM can be driven faster for the same power & thermal output, and can fit more cores, GPU shaders, hardware codecs and other acceleration technologies into the same space as x86, then ARM will have the speed advantage.
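For what it's worth, the "rises rapidly with clock speed" point follows from the standard first-order model of CMOS dynamic power (a simplification - it ignores static/leakage power):

```latex
P_{\text{dyn}} \approx \alpha \, C \, V^2 f
```

where $\alpha$ is switching activity, $C$ is switched capacitance, $V$ is supply voltage and $f$ is clock frequency - and since sustaining a higher $f$ generally requires raising $V$ as well, power in practice grows much faster than linearly with clock speed.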
The ARM wasn't designed as "just a mobile chip" - it was designed to kick Intel's backside around the room, and the first chip in 1987 did just that (the fact that it reputedly worked first time even though they'd forgotten to connect the main power supply was just a bonus). But, back then, not running x86 binaries was a deal-breaker outside of the embedded/mobile market - so it was forced into a niche.

What's happening today is that software is predominantly written in high-level, CPU-independent code that "just" needs recompiling, in "scripting" languages that ship as source code, or sometimes in bytecode that runs on 'virtual' CPUs (Java, Android, even MS's current C#/.NET stuff) - while Windows has (finally) been wiped out in the mobile sector and is facing serious competition from Linux and web technologies. So non-Intel processors are "thinkable" once more, and various groups are making desktop/server/workstation/supercomputer-class ARM chips again.