ARM chips will supplement Intel CPUs, becoming active for low-demand activities to save power; the Intel CPU will kick in when performance is required.

So there will be a switcheroonie between Intel and ARM CPUs? How's that gonna work then? At the very least you'd have two different architectures running in parallel. Sounds like a technical non-starter, not to mention requiring a doubling of the OS footprint for two OS X variants.
 
Footprint may be an issue, but technically I'd consider it similar to switching between the integrated and dedicated GPU, which is now completely smooth on OS X (after some teething problems).
 
These are two different problems (GPU switching and CPU switching).
Why? That's also two different architectures. Yes, I'm aware of the context-switching and cache issues (anyone remember the PPC boards for the Amiga, R.I.P.?).

But it's more than 20 years later now, and with things like Grand Central Dispatch, Apple has very powerful tools available in the system to help here.
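For a rough idea of what such tooling looks like: with GCD, work already declares how demanding it is via QoS classes, which is exactly the kind of hint a heterogeneous system could use to decide where things run. A minimal sketch using the standard Dispatch API (the queue labels and tasks are just made up for illustration):

Code:
import Dispatch

// The QoS classes are the real GCD mechanism by which the system
// already learns how demanding a piece of work is; the labels are invented.
let group = DispatchGroup()
let housekeeping = DispatchQueue(label: "com.example.maintenance", qos: .background)
let videoExport  = DispatchQueue(label: "com.example.export", qos: .userInitiated)

housekeeping.async(group: group) {
    // e.g. indexing, sync, cleanup: fine on a slow, power-efficient core
    print("background maintenance running")
}

videoExport.async(group: group) {
    // heavy number crunching: wants the fastest silicon available
    print("video export running")
}

group.wait()   // keep the demo alive until both blocks have run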
 

Because you'd be running two different kernels, each of which would need device drivers able to suspend/resume and somehow sync state when the context switch between CPUs occurs. And that doesn't even touch on apps running under the Intel kernel suddenly needing an ARM kernel running ARM code.
 
I don't think they'd need to switch mid-execution. An Intel app/thread stays on Intel, an ARM app/thread stays on ARM. Both share the resources and cycle time, e.g. via round-robin scheduling. The rest is similar to switching between the dGPU and iGPU (both of which also need their own drivers etc.); see the sketch below.

Perhaps it's even similar to the coprocessor collaboration in the old Amiga (with shared resource access, e.g. ports and RAM), except that the system tools available back then were far less sophisticated and powerful. But they managed that co-existence pretty well even so.

Surely still not trivial, but IMHO no rocket science these days either. And the general experience with core switching in recent iOS devices surely doesn't hurt.
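To make the "an app stays on its architecture" idea concrete, here's a toy sketch; the ISA tag, the Job type and the place() function are all invented for illustration and don't correspond to any real kernel interface:

Code:
// Purely hypothetical model: each binary is tagged with its ISA at load
// time, and the scheduler only ever places it on matching cores.
enum ISA { case intel, arm }

struct Job {
    let name: String
    let isa: ISA            // fixed when the binary is loaded
}

func place(_ job: Job) -> String {
    // Round-robin / time slicing would happen *within* each pool;
    // the only cross-pool rule is "never migrate between ISAs".
    switch job.isa {
    case .intel: return "\(job.name) -> Intel core pool"
    case .arm:   return "\(job.name) -> ARM core pool"
    }
}

print(place(Job(name: "video render", isa: .intel)))
print(place(Job(name: "mail background fetch", isa: .arm)))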
 

I think you're way off in left field. The Amiga copper was part of the Agnus chip, which executed instructions (copper lists) based on the position of the video beam on the display.

A coprocessor is entirely the opposite of what you were originally proposing. Your original assertion was for the Intel or the ARM chip to be in low-power mode while the other is executing.
 
The Amiga's coprocessors were Agnus, Denise and Paula. Paula handled peripherals and audio, Denise handled video, and Agnus covered several other tasks. I wasn't thinking solely of the copper, but of the whole system: those chips had to synchronize with each other and with the CPU, e.g. for access to RAM, ports etc.

While Denise was busy displaying the screen, it may have had to synchronize with Agnus for pending copper-list instructions. But the coprocessors basically worked independently of each other (which contributed big time to the Amiga's magic), and despite their very different silicon, they could work together, just like an Intel and an ARM chip could.

In a hypothetical Intel/ARM combination, the ARM chips could execute the low-demand tasks, such as system background maintenance, user input etc. If the user started a big, demanding program, such as video editing, that thread could be executed by the Intel chip, while the ARM would still take care of the non-demanding tasks it's already dealing with (see the toy sketch further below).

Once the demanding program ends, the Intel chip could go back to deep sleep, while the ARM carries on using comparably little power. Apple has been doing something like this for several years already: the M-series motion coprocessors in iPhones exist precisely to avoid waking the main CPU for low-demand tasks.

Once the whole system has migrated to ARM (we're talking about years here), a high-power ARM variant could do the heavy lifting (perhaps combined with a low-power core for the low-demand tasks, as drafted above), while a medium-class Intel chip could still be offered for compatibility reasons.

Hence the idea of offering it as an option in the high-end Macs: at that (hypothetical) point in time, not everyone would need an Intel CPU, just as not every user needs a discrete GPU today. Yes, there would be discussions, just as there were when Apple removed the dGPU option from an increasing number of machines. But eventually most people would be okay with the changes.
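As a toy model of that division of labour (all names hypothetical, there is no such API): light work always goes to the ARM side, and the Intel side is only awake while something demanding is running:

Code:
// Hypothetical routing rule for the proposed split: ARM always on for
// light work, Intel woken only while something demanding is running.
enum Chip { case armAlwaysOn, intelOnDemand }
enum Demand { case low, high }

final class HybridRouter {
    private(set) var intelAwake = false

    func route(_ demand: Demand) -> Chip {
        switch demand {
        case .low:
            return .armAlwaysOn        // maintenance, user input, I/O bookkeeping
        case .high:
            intelAwake = true          // wake the big chip, e.g. for a video export
            return .intelOnDemand
        }
    }

    func demandingWorkFinished() {
        intelAwake = false             // back to deep sleep, the ARM carries on
    }
}

let router = HybridRouter()
print(router.route(.low))              // armAlwaysOn
print(router.route(.high))             // intelOnDemand
router.demandingWorkFinished()
print(router.intelAwake)               // false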

A coprocessor is entirely the opposite of what you were originally proposing. Your original assertion was for the Intel or the ARM chip to be in low-power mode while the other is executing.
I was thinking more of the synchronization mechanisms in that coprocessor system. When two coprocessors want to access the same resource at the same time, they have to agree on which one goes first and which one waits. Due to the limitations of '80s technology, the waiting chip would keep running at full power (which was no problem, as the Amiga was a desktop machine with "unlimited" energy available). With the technological improvements I mentioned, in 2018 the waiting chip would instead drop into a sleep state while it waits.

And with a sophisticated system, the waiting phases could be adjusted dynamically: for example, if the demanding program running on the Intel chip is waiting for user interaction, it could go to sleep until the ARM chip has dealt with the input, such as typing in a name for a file to be saved. Or until the disk I/O is done. Or until the user has finished moving the mouse and some intense calculation has to start (see the sketch below).

As human users tend to be slow by comparison, the waiting phases for the Intel chip could be pretty long (a couple of seconds is already an eternity for modern silicon), and all those small waits would add up to less energy consumption and less heat. The former matters primarily for mobile systems, the latter also for thermally constrained desktops like the iMac or a potential hockey-puck-sized mini.
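As a small sketch of the "sleep while waiting for the slow human" part: a thread blocked on a semaphore burns essentially no cycles until it is signalled, which is the behaviour you'd want from the waiting chip. This uses only the standard Dispatch/Foundation APIs; the scenario and timings are invented:

Code:
import Dispatch
import Foundation

// The "demanding program" blocks here instead of spinning; a blocked
// thread lets the core (or chip) it was running on drop to low power.
let userDone = DispatchSemaphore(value: 0)

DispatchQueue.global(qos: .userInitiated).async {
    print("heavy task: waiting for the user to pick a file name")
    userDone.wait()                    // no busy-waiting, no wasted cycles
    print("heavy task: resuming the expensive work")
}

// Meanwhile the low-power side handles the typing and eventually signals.
DispatchQueue.global(qos: .utility).asyncAfter(deadline: .now() + 2) {
    print("light task: user finished typing a name")
    userDone.signal()
}

// Keep the demo process alive long enough for both sides to run.
Thread.sleep(forTimeInterval: 3)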
 
Building a hybrid ARM/Intel machine is a waste of time.

Many applications are transitioning to being web-based, and iOS has a huge ARM software library that could easily be ported.

If you want to run PC apps... well... the market for that is shrinking, and building a Mac to run PC apps is not going to get people off the PC.

I get it, I'm one of the people who runs/ran VMs on a Mac, but really... it's not required.
 