People seem to have gone wild over the new M1 processor being so much faster than its Intel counterparts. The way I see it, that is just great, but it should have been expected.
First, because Apple processors have evolved over the years, and the A14 inside both the iPhone 12 and the iPad Air already has great single-core performance, better than that of high-end Intel processors. Nobody would reasonably expect Apple to release an ARM processor for the Mac with slower single-core performance than the chip inside the iPhone. And with more cores and a larger thermal envelope to work with, multi-core performance would of course be superior. So everybody should have seen it coming.
Second, because, to go all the way and replace Intel processors, Apple had to have something really great in its hands. And so it did. If the M1 were only 25% faster than the Intel processors it replaces, Apple would not have bothered to swap the chips inside the Macs, develop Rosetta 2, convince developers to make the transition, and run the risk of losing the ability to run Windows through Boot Camp. If a trillion-dollar company chose to make this risky move, it is because it had reason enough to fully believe in the success of the transition.
These two reasons alone were enough for me to expect something huge in terms of performance improvement. But there is a third reason that most people do not seem to be aware of. I am no computer programmer or hardware specialist, but I always had the impression that Intel's x86 architecture was just a bad one.
I remember how many different architectures there were back in the 1980s and 1990s. The Motorola 68000 was a direct competitor to the x86, and many people said it was superior. The MOS 6502 was cheap and, although only 8-bit, was said to do more work per clock cycle than its rivals. And then, in the 1990s, came the RISC architectures, the promise of the future. The PowerPC, born of the IBM-Motorola-Apple alliance, was powerful, and so was Sun's SPARC.
DEC introduced the Alpha, one of the first 64-bit architectures, in 1992, and it was incredibly powerful, as I remember reading in magazines. But then Compaq bought DEC in 1998 and, being a company close to Intel, phased the Alpha out, ultimately selling the architecture to Intel in 2001. So it was game over for a major competitor.
Silicon Graphics was something else in the 1990s. Its computers were incredibly powerful, and it held a controlling interest in MIPS Technologies. The MIPS architecture was very powerful, and it supplied the chips for both the Sony PlayStation and the Nintendo 64, the consoles that would change the videogame landscape in the 1990s. But then, in 1998, Silicon Graphics decided to discontinue its mainstream line of MIPS processors in favor of Intel's Itanium, which was supposed to be super-powerful. But Intel over-promised and under-delivered, and that, along with management issues, contributed to Silicon Graphics' bankruptcy.
As for the 68000, Motorola ceased its development in 1994 in favor of the PowerPC. But not even IBM had the scale to keep investing in the PowerPC through the 2000s at a pace that could match Intel, especially with Microsoft declining to support it.
So, from where I watched, many promising architectures came and went, and, at least as I have been told, they were all far superior to Intel's x86. But Intel prevailed over the years, and that was largely because it had the sheer luck of being picked by IBM to power the PC. As the PC became popular in the 1990s, Intel had the scale to invest in its poor architecture and to make individual processors cheaper. And any contender faced a major challenge: in practice, only Intel processors would run Windows, so a challenger would have to build a competing platform and convince users to migrate, which was unfeasible. So, Intel took down the competing architectures, one by one.
Then, two competitors emerged producing chips based on the x86 architecture that would run Windows as well. AMD and Cyrix were much smaller companies that managed to reverse-engineer the x86 and produce clones. And the clones were a serious challenge to Intel, which, not being very efficient at competing even in its own arena against much smaller and less powerful rivals, tried to take both down in court.
So, Intel's dominance came from circumstance and, I would dare to say, hardly from Intel's own merits. Today, Intel is a major global company that invests billions and billions of dollars in improving its weak architecture. Even so, it still gets headaches from AMD, which, although still much smaller, has managed to compete.
The popularization of the smartphone finally provided the scale for someone to invest and compete with the performance offered by Intel chips. Qualcomm was a small, virtually unknown company that, in the 2000s, started making chips for mobile phones. It grew into a multi-billion-dollar business, and now it dares to challenge Intel. Qualcomm Snapdragon processors even power Windows laptops and manage to emulate applications designed for Intel's architecture. Emulated performance is far from great, but it shows how competition became viable.
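To get a rough feel for why emulation costs so much performance, here is a toy sketch in Python. It is nothing like the real translators in Windows on ARM or Rosetta 2 (the mini instruction set below is invented purely for illustration), but it shows the basic tax: a naive emulator pays fetch-decode-dispatch overhead on every single guest instruction, on top of the actual work.

```python
import time

# A made-up "guest" instruction set, invented for this sketch.
# The program computes the sum 0 + 1 + ... + (n - 1) in a loop.
def make_guest_program():
    return [
        ("MOV", "acc", 0),          # acc = 0
        ("MOV", "i", 0),            # i = 0
        ("LABEL", "loop"),          # loop:
        ("ADD", "acc", "i"),        #   acc += i
        ("ADDI", "i", 1),           #   i += 1
        ("BLT", "i", "n", "loop"),  #   if i < n: goto loop
        ("HALT",),
    ]

def emulate(program, n):
    """Fetch-decode-dispatch loop: the per-instruction overhead that
    makes naive emulation slow compared to native execution."""
    labels = {ins[1]: pc for pc, ins in enumerate(program) if ins[0] == "LABEL"}
    regs = {"n": n}
    pc = 0
    while True:
        ins = program[pc]
        op = ins[0]
        if op == "MOV":
            regs[ins[1]] = ins[2]
        elif op == "ADD":
            regs[ins[1]] += regs[ins[2]]
        elif op == "ADDI":
            regs[ins[1]] += ins[2]
        elif op == "BLT":
            if regs[ins[1]] < regs[ins[2]]:
                pc = labels[ins[3]]
        elif op == "HALT":
            return regs["acc"]
        # LABEL is a no-op; fall through.
        pc += 1

def native(n):
    """The same computation run directly, with no dispatch overhead."""
    acc = 0
    for i in range(n):
        acc += i
    return acc

n = 1_000_000
t0 = time.perf_counter()
emulated_result = emulate(make_guest_program(), n)
t1 = time.perf_counter()
native_result = native(n)
t2 = time.perf_counter()
assert emulated_result == native_result
print(f"emulated: {t1 - t0:.3f}s   native: {t2 - t1:.3f}s")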
And now, finally, we have someone to take Intel down. Phones provided the scale, and Apple certainly has the resources. If even AMD and Qualcomm can challenge Intel, you can imagine what Apple is capable of. In roughly a decade of designing its own chips, Apple has managed to trounce Intel. That speaks to how weak Intel's architecture is and to how inefficient Intel is as a company.
I am pretty sure that, once the door is open for ARM processors, other companies will overtake Intel as well. Qualcomm, Samsung, or even NVIDIA (who knows, now that it has agreed to buy ARM) may have a hard time beating Apple for the fastest processor of them all, but they should at least eat Intel for lunch. Intel's protection, which was the ubiquity of its architecture in PCs, seems to have fallen, and ARM looks like the better alternative if properly funded.
I even created a poll so you can cast your vote, and we can see how many people think Intel's x86 was ever great enough to deserve dominating the PC for so many years.