
Kostask

macrumors regular
Jul 4, 2020
230
104
Calgary, Alberta, Canada
What does "run Swift natively" even mean? Swift already runs natively. They could improve some parts of the cpu to speed up some of the instructions mostly used by Swift. But "run Swift natively" makes no sense.

This is what I explained above. The CPU's instruction set IS Swift. No more compiling, no more assembly language, just Swift. This has massive implications for the entire architecture. The first, and easy/minor part is to take the instruction decoder and replace it with something that can use Swift as the input language. I don't think this has been done before, but I could be wrong on that point. This is also why it is going to take a long, long time to come about, and why it is in advanced development. It is a concept right now, don't expect this in the second batch of AS Macs.
 

leman

macrumors Core
Oct 14, 2008
19,517
19,664
This is what I explained above. The CPU's instruction set IS Swift. No more compiling, no more assembly language, just Swift. This has massive implications for the entire architecture. The first, and easy/minor part is to take the instruction decoder and replace it with something that can use Swift as the input language. I don't think this has been done before, but I could be wrong on that point. This is also why it is going to take a long, long time to come about, and why it is in advanced development. It is a concept right now, don't expect this in the second batch of AS Macs.

How is that going to work practically? Running a high-level language "directly" is not feasible, it's a poor match to how the hardware actually works. High-level languages rely on optimizers to make the code run fast. In your model you are essentially delegating the compiler work to the CPU. This can be done, but I just don't see the benefit. The entire thing will be dead slow and inflexible. There is a reason why CPUs run specialized instruction sets and not high-level languages.
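For a sense of how much work you would be pushing into the decoder, here is a minimal sketch (file and function names are made up for illustration; exact output depends on the toolchain version) of the lowering stages the stock Swift compiler already walks through before anything reaches the CPU:

Code:
// add.swift - a trivial function to trace through the toolchain
func add(_ a: Int, _ b: Int) -> Int {
    return a + b
}

// Each invocation below dumps one stage of the lowering:
//   swiftc -emit-sil      add.swift   # Swift Intermediate Language (SIL)
//   swiftc -emit-ir       add.swift   # LLVM IR, after SIL-level lowering
//   swiftc -emit-assembly add.swift   # target machine code (arm64 / x86_64)
// A "Swift ISA" CPU would have to do all of this work in hardware, at decode time.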
 

Kostask

macrumors regular
Jul 4, 2020
230
104
Calgary, Alberta, Canada
Those points are valid, and are probably what Apple is figuring out, and why it is going to take another decade, at the very least, to come up with something.

Note, this is how hardware works NOW. Our CPUs run machine language, and we use compilers and optimizers, because that lets CPUs run most efficiently, not for developers' or users' benefit. It's not necessarily going to work that way in 10 years.

Let's not even speculate on what the ML/AI engine capabilities will be in 10 years, because there is no real way to know how quickly they will advance, but we do know they will advance.

The model is NOT mine. This is what I have heard. It could be a lot like other advanced Apple projects that never see the light of day, or it could turn into something. The prototype of the iPad existed in the 1980s, and look how long it took for that to turn into a real product. How many systems had touch screens in the 1980s?
 
Last edited:

wardie

macrumors 6502a
Aug 18, 2008
551
179
How many systems had touch screens in the 1980s?
Hey don’t diss the CASIO PB-1000 with 16 LCD touchscreen keys... iPad Pro eat your heart out.
[attached image: CASIO PB-1000]
 
  • Haha
Reactions: Boyd01

wardie

macrumors 6502a
Aug 18, 2008
551
179
I think it is safe to say that there is a reason it took so long for the iPad to come out.

Cool unit. If it is yours, does it still work?

Haven't got a clue. I remember my dad, an engineer, liked gadgets and had one, and I'm not young now :) Teenage nerdy-boy wow factor, I guess.
 

vigilant

macrumors 6502a
Aug 7, 2007
715
288
Nashville, TN
Those points are valid, and are probably what Apple is figuring out, and why it is going to take another decade, at the very least, to come up with something.

Note, this is how hardware works NOW. Our CPUs run machine language, and we use compilers and optimizers, because that lets CPUs run most efficiently, not for developers' or users' benefit. It's not necessarily going to work that way in 10 years.

Let's not even speculate on what the ML/AI engine capabilities will be in 10 years, because there is no real way to know how quickly they will advance, but we do know they will advance.

The model is NOT mine. This is what I have heard. It could be a lot like other advanced Apple projects that never see the light of day, or it could turn into something. The prototype of the iPad existed in the 1980s, and look how long it took for that to turn into a real product. How many systems had touch screens in the 1980s?

Native Swift support sounds “fancy” but it also sounds really limiting.

Having native support for fixed functions makes sense. HEVC is a very well-known quantity, and if anyone changes anything major in it, BOOM, hardware-accelerated support is lost.

If you do something like that for the actual underlying language, the hardware is now limited based on the version of the language it was architected for. And to what end?

Maybe I'm misunderstanding what this whole "native Swift" thing actually means.
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
This is what I explained above. The CPU's instruction set IS Swift. No more compiling, no more assembly language, just Swift. This has massive implications for the entire architecture. The first, and easy/minor part is to take the instruction decoder and replace it with something that can use Swift as the input language. I don't think this has been done before, but I could be wrong on that point. This is also why it is going to take a long, long time to come about, and why it is in advanced development. It is a concept right now, don't expect this in the second batch of AS Macs.

That's a terrible idea from a performance perspective. That's like going from RISC to CISC, then deciding that you haven't made things inefficient enough, so you go an extra 300 yards along the highway of bad ISA design.

Even UltraJava didn't really run Java as its ISA.
 

vigilant

macrumors 6502a
Aug 7, 2007
715
288
Nashville, TN
That's a terrible idea from a performance perspective. That's like going from RISC to CISC, then deciding that you haven't made things inefficient enough, so you go an extra 300 yards along the highway of bad ISA design.

Even UltraJava didn't really run Java as its ISA.

I expect blow back, but whatever.

I remember hearing about processors that handled Java bytecode natively...

I’m happy to be wrong... trust me I want to be wrong.

Limiting a processor to this just looks like a bad idea.

I can see a processor optimized for very low level code, and you can optimize and compile for that.

High level code? Trust me, I adore Rene Ritchie, but it doesn't make sense for a fixed-function thing.
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
I expect blow back, but whatever.

I remember hearing about processors that handled Java bytecode natively...

I’m happy to be wrong... trust me I want to be wrong.

Limiting a processor to this just looks like a bad idea.

I can see a processor optimized for very low level code, and you can optimize and compile for that.

High level code? Trust me, I adore Rene Ritchie, but it doesn't make sense for a fixed-function thing.

If Apple were going to do anything like this, they would implement LLVM IR in hardware (or a close relative of it). That way it would at least be language independent and *sort of* something that corresponds to real CPU designs. But it would still be a bad idea.
 

Kostask

macrumors regular
Jul 4, 2020
230
104
Calgary, Alberta, Canada
That's a terrible idea from a performance perspective. That's like going from RISC to CISC, then deciding that you haven't made things inefficient enough, so you go an extra 300 yards along the highway of bad ISA design.

Even UltraJava didn't really run Java as its ISA.

Try to understand what I said above: THIS IS NOT MY IDEA, NOR AM I PROMOTING IT. I am just passing along information that I have heard. It is probably the reason that it is considered an advanced project, not something pending. Efficiency has very little to do with anything these days; most computers spend over 95% of their time in idle states waiting for user input. In addition, performance that seems so limited today won't be in 10 years. Can you just imagine what CPUs will be like 10 years from now, especially the ML/AI cores? Of course, Intel will still be on their 14nm+++++++++++++++++++++++++++++++++ process, but I am talking about AMD and Apple's SoCs.
 
Last edited:

Boil

macrumors 68040
Oct 23, 2018
3,477
3,173
Stargate Command
I think you mean 14nm∞...! ;^p

Intel really should just cut a deal with Disney & get Tim Allen to do a "To Infinity & Beyond!" thing for them...
 

Waragainstsleep

macrumors 6502a
Oct 15, 2003
612
221
UK
My opinion only:
They want the Mac to be almost as much of a "walled garden" as is the iPad.
(which is why I don't use any iOS products)...

Is it your opinion though? Most people either extrapolate it from general paranoid libertarian principles or pick it up fifth hand from Android fanboys.

Uhhh...Apple has been ignoring the Professional community for years. Yes they have released a couple systems to not completely abandon them but they are clearly more focused on Phones, tablets, and laptops which get refreshed yearly, vs professional computers like the Mac Pro line which took 6 years to see an upgrade. Maybe that changes with the move to Silicon but if recent history is any indication I wouldn't bet on it.

It does actually.

Firstly, Apple's fleet of devices is rapidly acquiring enviable levels of built in security. For a large company, it should be any CEO's wet dream. Apple has gotten into such companies with iPhones and iPads; now they need in with Macs. When they have the best performance, best battery life and best security, they start to become quite compelling. They can also come down on price better without having to pay Intel $100+ per box. (Not saying they will for consumers, though I do expect some movement downwards, but if you're ordering in the 100s there will be much better leeway.)

Apple is also beginning to shift some focus into services. They acquired Fleetsmith which should just add value to their corporate and edu offerings. I can't say creative pros will be a high priority because people who need substantially more than to edit 3 simultaneous streams of 4K video (like an iPad can) are becoming an increasingly niche market.


As far as eGPUs, we can say with a lot of certainty the current eGPUs that are built around Intel chipsets won't work as is.

It's my understanding that as long as there is Thunderbolt (which Apple have said there will be), the limiting factor is drivers. Nvidia claims this has kept them out so far (I'm not convinced, tbh), so the same could easily happen to AMD.

I wonder if Apple will start to offer more options like the Afterburner cards that do certain jobs very well in order to bridge any gap between their on-board graphics and the giant PCIe beasts of Mac Pros gone by.


I guess there is another best-case scenario in that silicon is so powerful and awesome eGPUs aren't needed.

I wouldn't rule this out.
 

Polly Mercocet

macrumors 6502
Aug 17, 2020
258
290
LDN
Apple's fleet of devices is rapidly acquiring enviable levels of built in security.

This used to be the case but the competition has caught up, at least in the mobile space. If you look at the market value for exploits, an Android FCP* is worth more than an iOS FCP, and there are enough iOS zero-days out there that there's been talk of iOS exploits decreasing in value.

The easiest source to give for this is Zerodium since they have a public price list for various exploits and platforms. Currently an iOS FCP is $2m while an Android FCP is $2.5m and they've said they're getting so many iOS exploits they aren't even accepting some types anymore.

I really hope this changes in iOS 14. Apple has focused on squashing bugs, and exploits, generally speaking, come from buggy code. But as it stands now Apple's security isn't as rock solid as you might think.

And macOS certainly is not as secure as iOS. Again hopefully this changes in future releases. If Apple cares about the Mac it will.

* Full chain (remote code execution, privilege escalation, etc) with persistence
 

Krevnik

macrumors 601
Sep 8, 2003
4,101
1,312
Microsoft is getting less and less relevant in the consumer market with each year. They still have a strong foothold in the enterprise market, because businesses don't like change and have already dropped a lot of cash on software and support contracts and what have you, but consumers switch at the drop of a hat and Windows is a dying product in the consumer market.

Your argument glosses over why the market is doing what it's doing. It's splitting into two segments: One where tablets and smartphones can replace a full-fat laptop/desktop, and one where they can't. But even then, for folks like me who need a laptop/desktop to do things that tablets currently can't, we are buying them less frequently, which just makes sales numbers look even worse.

Since the post I was responding to was talking about laptop/desktop users, that first segment is irrelevant for the discussion. So it doesn’t matter if Microsoft is getting more or less relevant to the wider market, just to the market of users who want/need a laptop/desktop machine (i.e. what we currently think of as the x86 market).

But either way, sales isn’t the thing to be paying attention to here, but rather user base. What is the user base doing? What would it take to move them off x86/x64 and onto ARM or some other architecture? And the reality is that Microsoft enjoys a very large share of the active user base. They have 1 billion active users on Windows 10. PCs and consoles have about equal share of the gaming community user base, and on PC, Windows has the lion‘s share of gamers. Content creation has yet to be turned into a web app or brought to Android/iOS tablets in any real capacity beyond being a companion for a desktop app.

If desktops want to go ARM or some other architecture, the tools/platforms people are using have to enable that. That's the main point I'm getting at. And it's clear people care about that if you look at all the gnashing about compatibility that happens every time Apple changes ISA (will Adobe support the new thing, etc., etc.). As long as Microsoft continues their "toe in the water" approach to ARM and other architectures, companies that make the tools people are using their Windows machines for won't come along for the ride. Apple doesn't have the user share in the desktop market to force the issue alone either, nor does the ensemble of Linux distributions. As long as Microsoft continues to enjoy a 70+% user base share of desktops, any revolution will be entrenched in a corner of the desktop market.

Microsoft is smart enough to know this, which is why they tried so desperately to get into the smartphone and tablet markets, as that's where the consumers are heading. They failed to make a dent in the smartphone market even when they did an exclusive deal with Nokia and even later bought out Nokia's whole smartphone division for a few years. They practically couldn't give Windows Phones away. They failed to make a dent in the tablet market because everyone hated Windows 8, and while Windows 10 fixed all the problems of 8 for the desktop, it still makes for a poor tablet experience.

Believe me, I know full well the issues Microsoft has had in this space. I have a lot of experience in the Windows Phone space, and got to watch it all happen in real time, and facepalm at the decisions being made, and at the lack of progress. That said, Windows 8 was a poor imitation of Windows Phone, in my view, and people were right to hate on it.

The worst part of it is that Microsoft didn’t need to “get into” the smartphone market, it practically helped invent the thing back around 2003 or so. It just utterly failed to properly invest in it and see where the puck was going, just like Blackberry and Nokia when Google and Apple came in and turned the business smartphone into what we see today.

Try to understand what I said above: THIS IS NOT MY IDEA, NOR AM I PROMOTING IT. I am just passing along information that I have heard. It is probably the reason that it is considered an advanced project, not something pending. Efficiency has very little to do with anything these days; most computers spend over 95% of their time in idle states waiting for user input. In addition, performance that seems so limited today won't be in 10 years. Can you just imagine what CPUs will be like 10 years from now, especially the ML/AI cores? Of course, Intel will still be on their 14nm+++++++++++++++++++++++++++++++++ process, but I am talking about AMD and Apple's SoCs.

The problem is that the idea has no merit, so of course people are going to pile onto it. It’s a sort of “Cold Fusion” idea. It sounds nice, but doesn’t actually have any meat on the bones. It’s not about efficiency, it’s about the fact that compilation hasn’t been well suited to fixed-function hardware. There hasn’t even been much progress in GPGPU-style acceleration of compiling software. Going full ASIC or even FPGA on this is not a small engineering task in the least. Worse, it would mean that you couldn’t adopt new language features without new hardware. That’s throwing away a huge advantage that programming languages currently enjoy.
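To give one concrete, hypothetical illustration of what gets thrown away: async/await landed in Swift 5.5 purely through compiler and runtime changes, running on the same arm64/x86_64 silicon that had been shipping for years. A minimal sketch (the function itself is made up):

Code:
// A frozen "Swift ISA" chip designed before this feature existed could not
// decode it at all; today's CPUs run it because the compiler and runtime,
// not the hardware, do the lowering.
func fetchGreeting() async -> String {
    await Task.yield()   // a real suspension point, handled entirely in software
    return "hello"
}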

As cmaier mentions, using LLVM bytecode as an ISA would make more sense. But even then, there’s not much benefit to be had. I wouldn’t be surprised if someone like Apple considered LLVM bytecode as an ISA at some point, but it’d still be mostly complicating things for the sake of complicating things. I do think there’s been some actual research into this space too, so I wonder if this is just a game of telephone.

I do know folks associated with Apple have poked around using Swift for a Mac kernel (there's even a GitHub project for the prototype), and honestly, that sort of thing is more promising. Having done some hardware driver level work in Swift in the past, it's got a lot of promise for this space, unlike, say, the rumored Microsoft project to use .NET for the kernel.
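For flavor, here is the kind of low-level work I mean, as a minimal hypothetical sketch (the register address and bit layout are invented; real MMIO also needs volatile access semantics that plain Swift pointers don't promise):

Code:
// Hypothetical device-status check, written in ordinary Swift.
let statusRegisterAddress: UInt = 0xFFFF_F000   // made-up MMIO address

func deviceIsReady() -> Bool {
    // UnsafeMutablePointer lets Swift poke at raw addresses much like C would.
    guard let reg = UnsafeMutablePointer<UInt32>(bitPattern: statusRegisterAddress) else {
        return false
    }
    return reg.pointee & 0x1 != 0   // bit 0 = "ready" in this invented layout
}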
 

Yebubbleman

macrumors 603
May 20, 2010
6,024
2,616
Los Angeles, CA
No idea why they still run the 8th gen Intel on these things. Dell and everyone SMALLER can update yearly with new products but Apple is two years old on the cpu. Half of me thinks they’re going ARM to eliminate comparisons. Every Gen is the latest even if it’s a couple years old.

The fact that Apple won't have people comparing its MacBook Pros and iMacs to PC equivalents will surely be nice for Apple (as that helped the notion of "Pentium Crushing Power" two decades ago).

That said, the answer to the question of "why are they still using 8th Gen" that I most recently heard, albeit in specific reference to the 8th Gen-based MacBook Pro (13-inch, 2020, Two Thunderbolt Ports), was that a 10th Gen equivalent part (since there wasn't really ever a 9th Gen equivalent part) would've cost more money than it was worth, the performance gains weren't substantial enough (albeit for the parts specific to the two-port 13" MacBook Pro), and Apple just didn't want to bother with it (very likely because this transition was around the corner).

That is surely why the Mac mini didn't get the update to 10th Generation chips this year and ESPECIALLY why the 21.5" iMac didn't get any internal update alongside the 27" iMac, which did. (Apple has never updated only one iMac [save for education specific models] at any point during the Intel Mac era like this.)

As for your post title, no, Apple isn't going to Apple Silicon "just to be different". They're doing it because Intel has been a lousy partner since 2015 and because Apple is making advances in their own SoCs exponentially faster than Intel is with their x86 processors. As Tim Cook said, they're doing it "to make better products". AMD might've been able to save the x86 ship (on Macs) from sinking, but Apple's performance gains down the road are likely to be way faster even than the Ryzen's best and brightest x86 future. Plus, there really is a huge benefit to them (and ultimately to developers) to have all of their platforms running on the same processor architecture.

I am sure they can make a Silicon Mac that can surf the internet and check email. Can they make a Silicon Mac that can render 4K video as well as my Mac mini with an eGPU? I'll believe it when I see it.

Did you not see the WWDC2020 demo where they showed the DTK running multiple streams of 4K simultaneously? That's on an A12Z with no eGPU. We are surely seeing something more powerful launch on an Apple Silicon 13" MacBook Pro that can outperform the A12Z this year.

Three years ago Apple went out of their way to make the iMac Pro because professional users were starting to leave the Mac ecosystem when Apple dropped the ball. Apple just released the Mac Pro last year as a further commitment to professional users. It might not be until the end of the 2 year transition period but Apple are absolutely going to have Macs powerful enough for the high end professional users.

Also, we have no reason to believe that third party GPUs can’t still be used with ARM Macs.

Watch the WWDC video on the Apple Silicon Mac architecture; they outright stated that the GPUs will be integrated into the SoC. That doesn't necessarily preclude third party GPUs and/or eGPUs from being added to something like the Mac Pro, but it makes it way less likely that we'll see something on anything that isn't a Mac Pro.

Are you talking about Mac Mini? Apple doesn’t update it too often, it’s a basic office computer and on lower update frequency. All other product lines use up to date CPUs.

Uhh...have you forgotten about the 2020 Intel based two-port 13" MacBook Pro? That thing may have had scissor switch keyboards and a dedicated escape key return to it, but there is still an 8th Gen CPU under that hood.

Same for the 21.5" iMac which is also rocking 8th Gen (unmodified in that respect since Spring 2019). So, that's THREE Intel Macs that are still running 8th Gen despite minor refreshes this year. And that's not insignificant seeing as those are all entry-level Macs.

Make no mistake: I don't like this transition. I came into the Mac world shortly after they transitioned to x86, and I am application-driven. If the programs I use most heavily come over to ARM then I'll be along for the ride; if not, I'll have to seriously consider going back to Windows, which will also break some of the "magic" of having other products in Apple's ecosystem. But then, Apple probably knows that this seems less likely to happen, too. My desktop used to be the #1 most-used electronic device for me, and everything else was secondary. These days, largely because I am busier with other areas of my life, the portable iOS devices get more overall screen time than my computer. The computer is still needed for "heavy lifting" applications, and overall I find it easier to work on, but it's no longer my central electronic device.

Honestly, the Mac is the one Apple product that one needs the least in terms of said "magic" of being in the Apple ecosystem. iCloud for Windows covers the absolute necessities. Otherwise, you can absolutely swap a Mac out for a Windows 10 machine, but still otherwise enjoy everything else being Apple with nary an issue. Hell, I'm pretty much making that move right now.

x86 will live in Datacenters and Cloud the same way old IBM systems are seen today.

It requires repeating, let's stop the arguments of RISC over CISC. Let's stop the arguments of x86 vs ARM. It requires repeating, Intel saw x86 as a big enough liability that they created Itanium, which everyone else called the Itanic. Everyone knows about the legacy issues.

Let the big businesses figure out their data centers, as everyone is moving to the Cloud. It's not your problem.

If we can get predictable, stable, and consistent chipsets that work, that's a HUGE win. We could argue about this stuff all day. What we should be arguing about is which platform after ARM that Apple will move us to. Because they run the entire stack in two years, the transition should be much cleaner.

x86 will remain for Windows and Windows will remain for 85% of the market. That much will not change. For the former to change, Microsoft needs to evangelize Windows 10 for ARM64 and do a good enough job at doing so to convince developers that they ought to be producing ARM64 compatible binaries alongside (if not in place of) x86-64/x64/AMD64 binaries. Apple Silicon Macs could be a unique opportunity for them to be able to do just that. But, until then, x86 isn't going anywhere on the Windows side of things.

And while yes, the PC market has been in an overall decline (with Apple managing better than PC manufacturers), most of that decline is for home user PCs. The enterprise is where it really counts and most PC manufacturers in that market are thriving just fine.

Not true.

All of this is another good reason to make the move to Apple Silicon.


It's a moot point as Apple Silicon Macs will outperform their direct Intel Mac predecessors. Certainly, any Mac that has an 8th Generation Intel Core i chip inside will have its Apple Silicon replacement outperform it.
 

vigilant

macrumors 6502a
Aug 7, 2007
715
288
Nashville, TN
The fact that Apple won't have people comparing its MacBook Pros and iMacs to PC equivalents will surely be nice for Apple (as that helped the notion of "Pentium Crushing Power" two decades ago).

That said, the answer to the question of "why are they still using 8th Gen" that I most recently heard, albeit in specific reference to the 8th Gen-based MacBook Pro (13-inch, 2020, Two Thunderbolt Ports), was that a 10th Gen equivalent part (since there wasn't really ever a 9th Gen equivalent part) would've cost more money than it was worth, the performance gains weren't substantial enough (albeit for the parts specific to the two-port 13" MacBook Pro), and Apple just didn't want to bother with it (very likely because this transition was around the corner).

That is surely why the Mac mini didn't get the update to 10th Generation chips this year and ESPECIALLY why the 21.5" iMac didn't get any internal update alongside the 27" iMac, which did. (Apple has never updated only one iMac [save for education specific models] at any point during the Intel Mac era like this.)

As for your post title, no, Apple isn't going to Apple Silicon "just to be different". They're doing it because Intel has been a lousy partner since 2015 and because Apple is making advances in their own SoCs exponentially faster than Intel is with their x86 processors. As Tim Cook said, they're doing it "to make better products". AMD might've been able to save the x86 ship (on Macs) from sinking, but Apple's performance gains down the road are likely to be way faster even than the Ryzen's best and brightest x86 future. Plus, there really is a huge benefit to them (and ultimately to developers) to have all of their platforms running on the same processor architecture.



Did you not see the WWDC2020 demo where they showed the DTK running multiple streams of 4K simultaneously? That's on an A12Z with no eGPU. We are surely seeing something more powerful launch on an Apple Silicon 13" MacBook Pro that can outperform the A12Z this year.



Watch the WWDC video on the Apple Silicon Mac architecture; they outright stated that the GPUs will be integrated into the SoC. That doesn't necessarily preclude third party GPUs and/or eGPUs from being added to something like the Mac Pro, but it makes it way less likely that we'll see something on anything that isn't a Mac Pro.



Uhh...have you forgotten about the 2020 Intel based two-port 13" MacBook Pro? That thing may have had scissor switch keyboards and a dedicated escape key return to it, but there is still an 8th Gen CPU under that hood.

Same for the 21.5" iMac which is also rocking 8th Gen (unmodified in that respect since Spring 2019). So, that's THREE Intel Macs that are still running 8th Gen despite minor refreshes this year. And that's not insignificant seeing as those are all entry-level Macs.



Honestly, the Mac is the one Apple product that one needs the least in terms of said "magic" of being in the Apple ecosystem. iCloud for Windows covers the absolute necessities. Otherwise, you can absolutely swap a Mac out for a Windows 10 machine, but still otherwise enjoy everything else being Apple with nary an issue. Hell, I'm pretty much making that move right now.



x86 will remain for Windows and Windows will remain for 85% of the market. That much will not change. For the former to change, Microsoft needs to evangelize Windows 10 for ARM64 and do a good enough job at doing so to convince developers that they ought to be producing ARM64 compatible binaries alongside (if not in place of) x86-64/x64/AMD64 binaries. Apple Silicon Macs could be a unique opportunity for them to be able to do just that. But, until then, x86 isn't going anywhere on the Windows side of things.

And while yes, the PC market has been in an overall decline (with Apple managing better than PC manufacturers), most of that decline is for home user PCs. The enterprise is where it really counts and most PC manufacturers in that market are thriving just fine.



It's a moot point as Apple Silicon Macs will outperform their direct Intel Mac predecessors. Certainly, any Mac that has an 8th Generation Intel Core i chip inside will have its Apple Silicon replacement outperform it.

So let's hit on a few of these things here. First, Microsoft is clearly focused on getting Windows on ARM to be a thing. The Surface Neo is a very cute product. It's got 4 Atom cores and a single Core-series core. It's cute. There's no way this is going to perform well next to an entry-level iPad, especially once they get A14s.

I have no doubt that Microsoft is focusing largely on working with Qualcomm chips to get them up to snuff. I have no doubt they are keenly focused on getting that to where it needs to be, for no other reason than that they can't get the performance they need, in the form factors they need, to be competitive using Intel. It's a non-starter.

I'm saying that as someone that has stopped myself no fewer than 3 times from ordering a Surface Pro X for work.

None of this is going to happen overnight. Period. Just not happening. But Microsoft can continue to do what they've done with the Surface product line: use it as a way to stimulate growth in areas where other companies struggle.

The other key thing to keep in mind is NVidia looking to buy ARM. NVidia isn't looking to buy ARM to continue making chips for smart cars and the Nintendo Switch. I'm sure that helps, but NVidia has wanted for YEARS to put Intel on notice. I hope NVidia does buy ARM, and in doing so causes everyone at Intel to realize they are on notice.

I expect Microsoft to look at NVidia as a company they can also partner with to get more diverse SoCs out in the marketplace. I expect, and want, the resulting combination to be a force to be reckoned with, and to keep Apple in check.

Sure, WinTel will probably always be around. But to what end? Sure, they'll release the "cute" Surface Neo. It will sell well, I'm sure, but the reason it's using that "cute" processor is because A) Microsoft is still beholden to legacy workloads, B) x64 to ARM64 translation isn't working yet, and C) Microsoft knows they can't release an aspirational product like the Surface Neo unless it can load whatever legacy applications people need, and emulation isn't there yet.

Apple is all in on ARM, AWS is ramping up use of ARM workloads, NVidia is looking to buy ARM, Microsoft wants to advance the ARM agenda. Period.

Is this going to happen overnight? Hell no. But make no mistake, this year we heard the first drops of this storm. Next year we'll have some sprinkles, the following year the clouds, thunder in the distance, and a downpour. The following year, you'll wonder if it's too late to get your plants in.

Maybe I’m wrong on timing. Maybe it’s shorter or longer. But the storm is happening.

Sure, Intel will always be around, but how long will it be before companies treat Intel the way they treat IBM? I know that when I go on site to see a customer that has IBM, they are normally in a world of hurt.
 
  • Like
Reactions: Boil

Kostask

macrumors regular
Jul 4, 2020
230
104
Calgary, Alberta, Canada
...............

The problem is that the idea has no merit, so of course people are going to pile onto it. It’s a sort of “Cold Fusion” idea. It sounds nice, but doesn’t actually have any meat on the bones. It’s not about efficiency, it’s about the fact that compilation hasn’t been well suited to fixed-function hardware. There hasn’t even been much progress in GPGPU-style acceleration of compiling software. Going full ASIC or even FPGA on this is not a small engineering task in the least. Worse, it would mean that you couldn’t adopt new language features without new hardware. That’s throwing away a huge advantage that programming languages currently enjoy.

As cmaier mentions, using LLVM bytecode as an ISA would make more sense. But even then, there’s not much benefit to be had. I wouldn’t be surprised if someone like Apple considered LLVM bytecode as an ISA at some point, but it’d still be mostly complicating things for the sake of complicating things. I do think there’s been some actual research into this space too, so I wonder if this is just a game of telephone.

I do know folks associated with Apple have poked around using Swift for a Mac kernel (there’s even a GitHub project for the prototype), and honestly, that sort of thing is more promising. Having done some hardware driver level work in Swift in the past, it’s got a lot of promise for this space, unlike say, the rumored Microsoft project to use .NET for the kernel.

I am sure that whoever is running this project in Apple Advanced Projects would be happy to hear your thoughts on this. However, at this point, you are shooting the messenger, and the messenger doesn't really have anything to do with the message, he is just passing it on. He has no way of knowing whether the message is realistic, gibberish, or totally true.

However, I would like to point out that functions and even internal operations can be updated even on today's CPUs. There have been microcode updates to Intel CPUs for things like the Spectre and Meltdown mitigations that actually change the internal operation of the CPU. Extending that slightly may make adding new Swift language extensions possible.
 
Last edited:

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
Try to understand what I said above: THIS IS NOT MY IDEA, NOR AM I PROMOTING IT. I am just passing along information that I have heard. It is probably the reason that it is considered an advanced project, not something pending. Efficiency has very little to do with anything these days; most computers spend over 95% of their time in idle states waiting for user input. In addition, performance that seems so limited today won't be in 10 years. Can you just imagine what CPUs will be like 10 years from now, especially the ML/AI cores? Of course, Intel will still be on their 14nm+++++++++++++++++++++++++++++++++ process, but I am talking about AMD and Apple's SoCs.
As a CPU designer starting in the 1990s, I can definitely imagine what CPUs will be like in 10 years.

And it won’t look like this.
 
  • Like
Reactions: chabig

Kostask

macrumors regular
Jul 4, 2020
230
104
Calgary, Alberta, Canada
How about like this:


Still think this is out of reach? Note the stated timeline in the video. Note the stated timeline for the Swift ISA SoC. And to preempt a lot of "can't be done", "not possible" etc. stuff, like I am getting right now, even though this is not my idea or thought, look at his other videos first. Also note who he has a conversation with.

As a CPU designer in the 1990s, did you see the coming of ML/AI cores? Did you see CPUs 100X-1000X more powerful than what you had running on less than 10W?
 

the8thark

macrumors 601
Apr 18, 2011
4,628
1,735
As a CPU designer in the 1990s, did you see the coming of ML/AI cores? Did you see CPUs 100X-1000X more powerful than what you had running on less than 10W?
Did you see the future improving? Of course. We all did. We just had zero idea what that future would be. Just like now, we have zero idea of what the future in 30 years will be.
 

Kostask

macrumors regular
Jul 4, 2020
230
104
Calgary, Alberta, Canada
So how come people are saying that a CPU with a Swift ISA is impossible? If Apple thinks they can do it, more power to them. Even if they can't, they may learn some things along the way that can be applied elsewhere. I applaud the fact that they are willing to a) think in new creative ways (I'd like to say outside the box, but I don't think Apple has any idea of what the box is about), and b) invest in projects that may never result in solid products, but they invest anyway. This is the only way forward.
 

leman

macrumors Core
Oct 14, 2008
19,517
19,664
Uhh...have you forgotten about the 2020 Intel based two-port 13" MacBook Pro? That thing may have had scissor switch keyboards and a dedicated escape key return to it, but there is still an 8th Gen CPU under that hood.

Yeah, but Intel doesn't offer any "9th-gen" 15W CPUs. And the 15W and 30W Ice Lake parts are fairly close together in performance, so if Apple were to upgrade the lower-end MBP to Ice Lake, the higher tier would become fairly pointless. There is a similar effect with the Comet Lake CPUs - on the high-end desktop they offer more CPU cores, which makes them a reasonable (if disappointing) upgrade, but on the lower end (and mobile) they are virtually identical to their predecessors.

To make it clear: I am not justifying Apple's behavior here, just arguing that Intel's asymmetric product portfolio does not align very well with Apple's imagined product palette. Of course, Apple could just do like everyone else and fit their products to what Intel offers, but it's Apple we are talking about :)
 

leman

macrumors Core
Oct 14, 2008
19,517
19,664
So how come people are saying that a CPU with a Swift ISA is impossible? If Apple thinks they can do it, more power to them. Even if they can't, they may learn some things along the way that can be applied elsewhere. I applaud the fact that they are willing to a) think in new creative ways (I'd like to say outside the box, but I don't think Apple has any idea of what the box is about), and b) invest in projects that may never result in solid products, but they invest anyway. This is the only way forward.

I am not a chip designer and I can't predict the capability of future CPUs. But I do know some things about designing and implementing programming languages and the theoretical foundations of computation. I see no way of efficiently executing a high-level programming language on any currently known, practically realizable computing architecture. The semantics of high-level languages simply are not a good fit for how the hardware operates. You need multiple transformation passes to lower the abstract concepts present in high-level languages to something that hardware runs well. I can certainly imagine hardware that will accelerate some things frequently used in Swift or other languages (like reference counting).
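As a small sketch of that last point (the class and function are made up, and the exact SIL you get depends on compiler version and optimization level), the reference-counting traffic is a narrow, fixed pattern, unlike the language as a whole:

Code:
// Compile with `swiftc -emit-sil` to see the reference-counting operations
// ARC inserts around the uses of `node` - the kind of frequent, uniform
// primitive hardware could plausibly help with.
final class Node {
    var value: Int
    init(value: Int) { self.value = value }
}

func passAlong(_ node: Node) -> Node {
    let copy = node
    return copy
}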

But as you say, new architectures and approaches will emerge, and these might completely change how we have been programming for the last 80 years. But we certainly won’t be using Swift then :)
 
  • Like
Reactions: chabig and cmaier

vigilant

macrumors 6502a
Aug 7, 2007
715
288
Nashville, TN
So how come people are saying that a CPU with a Swift ISA is impossible? If Apple thinks they can do it, more power to them. Even if they can't, they may learn some things along the way that can be applied elsewhere. I applaud the fact that they are willing to a) think in new creative ways (I'd like to say outside the box, but I don't think Apple has any idea of what the box is about), and b) invest in projects that may never result in solid products, but they invest anyway. This is the only way forward.

I know I've said a lot on this, but I do want to be clear I'm not trying to neg anyone at all. For me personally, I just don't see the point. High-level languages are incredibly fluid, and there's not very much you can do once a chip is fabbed to get around that.

I am more than happy to be wrong, trust me. I don't see how it would add any value either.
 