Internet connections get faster over time. Anyone who needs to rent Apple Silicon in the cloud would have good internet.

Today, you can stream AAA games from the cloud, use Microsoft Windows hosted in the cloud, and even accelerate Google Chrome from the cloud. Just imagine in 5 years.

Apple isn't going to ignore all the advantages of the cloud. And when they want to offer more cloud services like the above, they aren't going to use Intel/AMD chips. They're going to use Apple Silicon. Hence, it's not a matter of if, but when Apple Silicon gets put in the cloud.
I’m aware of all of that, but a lot of those cloud services are horrible with mediocre internet - allegedly 42 million Americans don’t even have access to broadband internet. Of course it will get better with time, but that’s my point. Until it does, I don’t see it being super widespread for consumers.
 
It's quite clear that more and more tasks that used to be done locally will move to the cloud. Apple can't escape this trend.

Apple STARTED this trend with iDisk instead of shipping floppy drives.

I'm sure someone can link the video of Steve Jobs being asked about networking in the 1990s, where he basically went off on a tangent describing how he has a T3 at home and just has someone else look after it, and that was the future.

He was basically describing cloud computing 15-20 years before people started selling/using it.
 
Apple STARTED this trend with iDisk instead of shipping floppy drives.

I'm sure someone can link the video of Steve Jobs being asked about networking in the 1990s, where he basically went off on a tangent describing how he has a T3 at home and just has someone else look after it, and that was the future.

He was basically describing cloud computing 15-20 years before people started selling/using it.
That's so long ago, and the people in charge are different. So I hope that Apple doesn't fall too far behind when it comes to taking advantage of cloud compute.
 
I’m aware of all of that, but a lot of those cloud services are horrible with mediocre internet - allegedly 42 million Americans don’t even have access to broadband internet. Of course it will get better with time, but that’s my point. Until it does, I don’t see it being super widespread for consumers.
Of those 42 million Americans, how many do you think will rent a powerful Apple Silicon Mac in the cloud?

The point is, Apple can launch the service and get demand today. Plenty of people have internet good enough. And then as more people get faster internet, the market grows larger.
 
The point is, Apple can launch the service and get demand today.
A more important point is - what is the demand for "Mac in the cloud"?

Obviously there is some demand - several firms run racks of Mac Minis for hire - but the question is whether there is a big enough demand to get Apple out of bed.

The big selling points of Macs are their UI and sleek-looking hardware, which are irrelevant to a cloud "backend" server.

Going forward, the de-facto standard seems to be a backend running on Linux (which even Microsoft are now embracing) - often using open-source tools - and a client-side frontend running in a browser, or client-specific App. There's no particular appeal to making your backend Mac-specific (or Windows-specific), when a Linux version lets you shop around all of the cloud providers for the best hosting deal (...and, since MacOS is Unix, would probably run on Mac anyway, with minor tweaks).

If we're talking about "virtual desktop PCs" running a GUI and accessed via something like virtual desktop - there's a particular demand for that in the Windows world given Windows' dependence on its ability to run 20 year-old legacy software, and its huge presence in the corporate sector (...essentially, custom database frontends where the data is already in the cloud). I can see that becoming immensely popular for home working (avoids employees having company data stored in their homes, and centralises all of the regulatory compliance box-ticking) and being the best long-term solution for Mac users who need to run Windows because of that one weird app that they need for work.

Not sure that the case for a "Virtual Mac" - versus proper client/server applications - is so strong: (a) Mac isn't big on running legacy code, with mass software extinctions every 5-10 years (Classic, PPC, Carbon, x86-32, to be followed by x86-64 in a few years, and lots of smaller extinctions due to security model changes etc...); (b) "Pro" Mac users are biased towards media production, which means frequent access to huge local files (fancy uploading all your .raw files to the cloud before you start editing?) and dedicated local hardware - sure, there are some (collaborative) scenarios where media-production-in-the-cloud makes sense, but that's not necessarily served by running FCP on a VM via Remote Desktop. Finally, (c) I can just picture Apple letting people run MacOS-in-the-cloud via an RDP client on their Windows PC, Android tablet, Chromebook etc. Not. Apple could restrict it to clients running on Apple devices but, really, that defeats a lot of the object of having Mac-in-the-cloud.

Also, so far, we just have the M1, which is a system-on-a-chip designed for ultraportables and tablets. Sure, short-term it thrashes Intel's mobile offerings and is tempting "pro" Mac users, but it is not really a pro workstation or server chip. There's a reason why server hardware runs on Intel Xeon or AMD Epyc rather than laptop/desktop chips. There are ARM-based server-optimised chips out there - and you can get ARM servers in the cloud - but they're not Apple Silicon and won't run MacOS. M1X still has to prove itself, and still isn't really a server chip.

So, as I said, it's not just whether there is interest, it is whether there is enough interest to make it worth Apple's while developing server-grade Apple Silicon chips.
 
it's not just whether there is interest, it is whether there is enough interest to make it worth Apple's while developing server-grade Apple Silicon chips.
The chips for those Mac Pros will certainly be capable of server workloads. Most macOS server demand comes from CI for iOS/macOS development, and right now that runs on racks of Mac minis. Apple's own CI, Xcode Cloud, is currently running on Intel Xeon, and if they want to drop Intel chips altogether, they have to make their own chips at least powerful enough for their own CI solution. (Or they could just use a huge number of Mac minis if they have to, but that would be much more expensive.)
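For a rough sense of what that CI workload actually is, here's a minimal sketch of the kind of job those racked Macs spend their lives running: check out a project, then drive xcodebuild to build and test it. The repo URL, project path, scheme and simulator destination below are made-up placeholders, and this doesn't reflect Xcode Cloud's or any vendor's real setup.

```swift
import Foundation

// Minimal sketch of a CI job on a macOS box: clone a repo, then build and
// test it with xcodebuild. Repo URL, project path and scheme are placeholders.
func run(_ tool: String, _ arguments: [String]) throws {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: tool)
    process.arguments = arguments
    try process.run()
    process.waitUntilExit()
    guard process.terminationStatus == 0 else {
        throw NSError(domain: "ci.step", code: Int(process.terminationStatus))
    }
}

do {
    try run("/usr/bin/git", ["clone", "https://example.com/acme/MyApp.git", "/tmp/MyApp"])
    try run("/usr/bin/xcodebuild",
            ["-project", "/tmp/MyApp/MyApp.xcodeproj",
             "-scheme", "MyApp",
             "-destination", "platform=iOS Simulator,name=iPhone 12",
             "test"])
    print("Build and tests passed")
} catch {
    print("CI step failed: \(error)")
    exit(1)
}
```

Real services wrap this in provisioning, signing and artifact handling, but the core of the demand is this sort of batch job, which is why raw chip throughput matters far more here than the Mac's UI or industrial design.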
 
Apple's own CI, Xcode Cloud, is currently running on Intel Xeon, and if they want to drop Intel chips altogether, they have to make their own chips at least powerful enough for their own CI solution.
Do they need to drop Intel chips for that? Are they even running it on Macs at the moment? Sure, if they were starting from scratch with an all-ARM ecosystem, it might make sense to use Apple Silicon, but all the infrastructure to cross-develop & test for iOS/WatchOS/tvOS/M1 MacOS using x86 is well established, and with them still selling Intel Macs as we speak they're committed to maintaining x86 support in MacOS for at least another 4-5 years - after which, I suspect, the days of processor-dependent application code will be drawing to a close.
 
Of those 42 million Americans, how many do you think will rent a powerful Apple Silicon Mac in the cloud?
None will, of course; it's not workable. Now, if they had gigabit internet, quite a few would, businesses especially. It's crazy to think the cloud will take over everything in the next few decades. It will eventually, I agree, but I deal with the here and now. I have unreliable Comcast, and that's all that's available to me except slow DSL. So no, I don't rely on the cloud for anything. Where I work only has 75 Mbps, so no cloud stuff there either.

The point is, Apple can launch the service and get demand today. Plenty of people have internet good enough. And then as more people get faster internet, the market grows larger.
Why though - cloud computing hasn't really been about speed of processing, but storage capability. If you need speed on the computing side, it would be idiotic to go off-premises; latency is king, and anything over the internet doesn't qualify. And besides, Apple really isn't ahead in speed when you compare them to other high-performance systems, only power efficiency, and if you need speed, power is a distant secondary concern.

Not to mention the cost to convert your applications to whatever OS you'd be running on that Apple silicon (which is the biggest hurdle to overcome for many businesses).
 
Why though - cloud computing hasn't really been about speed of processing, but storage capability. If you need speed on the computing side, it would be idiotic to go off-premises; latency is king, and anything over the internet doesn't qualify.

Latency only matters for real-time/interactive work - as soon as a computation takes more than a fraction of a second, how many CPU cores you can throw at it and how fast the storage access is become more significant than any delay in sending the result back to you. Compiling a large project, doing a complex database query, or any bout of heavy number crunching can be sped up considerably by throwing more CPU cores, RAM and faster storage at the problem, and none of those rely on low latency to/from the user interface - the sort of thing that cloud computing makes more affordable and scalable.
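A quick back-of-envelope sketch of that point, with purely illustrative numbers (the latency, job length and speedup are assumptions, not measurements): for a long batch job the round trip disappears into the noise, while for interactive work it dominates.

```swift
// Illustrative numbers only - the point is the shape of the result, not the values.
let roundTrip = 0.05             // 50 ms of network latency per interaction
let batchJob = 600.0             // a 10-minute compile/render/query on local hardware
let cloudSpeedup = 4.0           // assume the rented box has ~4x the cores/IO

let batchLocally = batchJob                               // 600 s
let batchInCloud = batchJob / cloudSpeedup + roundTrip    // ~150.05 s: latency is noise

let interactions = 1_000.0       // e.g. UI events in an editing session
let perEventWork = 0.001         // 1 ms of compute per event
let interactiveLocally = interactions * perEventWork                 // 1 s of waiting
let interactiveInCloud = interactions * (perEventWork + roundTrip)   // ~51 s of waiting

print("Batch job: \(batchLocally)s locally vs \(batchInCloud)s in the cloud")
print("Interactive: \(interactiveLocally)s locally vs \(interactiveInCloud)s in the cloud")
```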

If you want more processing speed in the cloud you rent more/faster CPU cores as and when you need them.
If your cloud server is mostly firing off database queries to other systems it probably doesn't need them - and your bog-standard cloud service is probably targeted at that sort of use - but supercomputing and GPU compute cloud services are available.

Meanwhile your server has its own super-fast pipe to the Internet for exchanging big chunks of data with other services, which can be far more significant than the relatively meagre traffic between your laptop and your server.

For something like development (a) it is getting to be a real pain without a decent internet connection for documentation & collaboration and (b) modern development involves an awful lot of fetching & updating code from online repositories, and there the relevant bandwidth is between the cloud server and the rest of the internet - that data doesn't have to come down your broadband link.

As I said in an earlier post, though, that might not be so true for the media production software that seems to be most Mac users' definition of "pro", c.f. all the business data munging on Windows.
 
Latency only matters for real-time/interactive work - as soon as a computation takes more than a fraction of a second, how many CPU cores you can throw at it and how fast the storage access is become more significant than any delay in sending the result back to you. Compiling a large project, doing a complex database query, or any bout of heavy number crunching can be sped up considerably by throwing more CPU cores, RAM and faster storage at the problem, and none of those rely on low latency to/from the user interface - the sort of thing that cloud computing makes more affordable and scalable.

As I said in an earlier post, though, that might not be so true for the media production software that seems to be most Mac users' definition of "pro", c.f. all the business data munging on Windows.
It's just not workable around here. We're not a media production company; we're a plain old business that happens to have database access needs, and since there's no decent internet, it's all on premises. User interaction is the most important thing anything here does.

It's like you're talking 30+ years down the road for me, not anything that makes sense any time soon.
 
Of those 42 million Americans, how many do you think will rent a powerful Apple Silicon Mac in the cloud?

The point is, Apple can launch the service and get demand today. Plenty of people have internet good enough. And then as more people get faster internet, the market grows larger.
Maybe you missed when I was specifying average consumers, but that's what I am talking about. I don't see this service entering mainstream consumer computing for a while. That being said, I don't think Apple is interested in this for companies either. They seem to want to focus on powerful, low-latency local computing.

Yes, eventually I do think cloud computing is the future, and this will all change decades from now, but I think we're far from our laptops and phones (as consumers) accessing a cloud service to offload our CPU/GPU processing. There are just too many latency problems, poor connections, and other issues people will run into to provide a stable service for consumers at large.

When this does become mainstream, I imagine it's due to Apple's desire to make the technology as small, thin, and transparent as possible. If you are wearing Apple Glass, for example, maybe one day your glasses will connect to a high-speed, low-latency network in the cloud to do all the demanding computational processing. This would allow Apple Glass to provide powerful features without compromising on the physical design.
 
This is the most interesting part. They can get any hardware system they want and make a custom macOS-ish OS for that use case.

Pragmatically, no. The market for providing Mac-based CI and cloud services already exists. It has been around for over a decade. MacStadium (MacColo), Amazon, and others have already established this market. And Apple has regulated the virtualization (and last year further clarified the "leasing services" rules:

https://9to5mac.com/2020/11/11/maco...ces-like-macstadium-out-of-a-legal-gray-area/

"... Apple software and hardware must be leased “in its entirety to and individual or organization” ... )

Apple would have to play by the same rules they impose on the already established players. Giving vendors in the established market "onerous" rules while Apple tap-dances around them in a move to take share from those vendors is a pretty straightforward anticompetitive-practices legal case.

Apple is already losing a couple of those kinds of cases. Creating yet another one that would rest on, at best, a super wishy-washy foundation would be a very bad move. Building a larger number of failed legal cases tends to bring more failed legal cases.

Apple goes to large efforts to license Mac hardware and macOS as a unified whole, a singular "thing". Apple going into the hackintosh business only opens up the rest of their non-cloud-services business to more hackintoshes with legal cover. Extremely likely, it isn't worth the money to Apple to provide "as cheap as possible" cloud services. Apple can just deploy whole, complete Macs just like the other folks do. Apple can pass along the costs to the folks doing the leasing just like the current vendors do. Across the board, that just makes Apple more money (Apple makes more, their partners providing services make more, retailers selling individual systems to users make more, etc.).

It is a huge mistake to look at this context and try to map what AWS, Google, Azure, and others in the generic cloud computation services do onto what Apple "has to" do. Or even what they want to do. De facto, Xcode Cloud is going to sell more Macs to folks who lease them long enough. And after that "long enough", they pay more (until Apple swaps in a new Mac "box" to be paid for). In large part, this is a way to sell more Macs to developers on an extended installment plan (there is flexibility to drop out too, so it's really getting a larger set of folks paying for it collectively; for developers with "rollercoaster" revenue inflows there is a lot of synergy there).

They don't for now, but not for the future.

More powerful future Minis could work just fine as replacements for the current, much larger Intel Mac Pro systems over time. However, if folks are leasing the Intel Mac Pros at profitable rates, there isn't a large reason to chuck those systems prematurely.

If Apple did a "half-sized" (chop the tower in half) M-series Mac Pro, then a racked version could probably be double-racked in the same space as a single Intel Mac Pro (a side-by-side rackmount attachment). Over time, Apple could double the density.

But they would also probably need to deploy Minis as a more affordable leasing option. That is a more affordable "whole" system buy, so it would be cheaper to lease. Also cheaper to operate (power/cooling) and to rack. (See the "story context background" picture from the 9to5Mac article linked above.)

[Image: Macstadium.jpg]


The current Mac Pro chassis is a bit bloated for deployment to "everybody possible" for Xcode Cloud. That would just be a "money grab" by Apple if it was the only option.
 
Apple would have to play by the same rules they impose on the already established players.
No, they don't. They can do whatever they want, without the limitations. They do this all the time on their own platforms, like iOS. Their own apps usually carry capabilities that third parties cannot have; some even violate their own App Store review guidelines. This is a sad way to make sure stuff "provided by Apple has to be the best". Having to lease a whole Mac Mini for CI/CD is not acceptable in most cases, if not all. The minimum leasing period of 24 hours introduced with the Big Sur EULA has been a huge problem for most CI/CD providers. Even GitHub, now owned by Microsoft, struggled to provide Big Sur-based CI until August 2021 - almost 10 months after Big Sur's release.
 
I think Apple is likely to move everything they do to a subscription model sooner or later. I have long thought this for their consumer hardware but it makes perfect sense for any server offering they may come up with too.
Could be particularly interesting as a Mac Pro offering. You buy/lease a tower that can do a certain amount of local work but comes along with cloud computing credits to do a whole bunch more. Instead of paying a ton more for 128 cores instead of 64, you can get as many as you want from the cloud.
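As a purely hypothetical illustration of that trade-off (the prices and hours below are invented, not anything Apple has announced or charges), the break-even arithmetic is simple:

```swift
// Hypothetical pricing, purely to show the break-even arithmetic - not Apple's numbers.
let upchargeForExtra64Cores = 10_000.0   // one-off cost to double the local core count
let cloudRateForSameBurst = 4.0          // assumed $/hour to rent an equivalent 64 cores
let heavyHoursPerMonth = 20.0            // hours per month the extra cores are really needed

let monthlyCloudSpend = cloudRateForSameBurst * heavyHoursPerMonth    // $80/month
let breakEvenMonths = upchargeForExtra64Cores / monthlyCloudSpend     // 125 months
print("Cloud burst: $\(monthlyCloudSpend)/month; break-even after \(breakEvenMonths) months")
```

Under assumptions like those, occasional burst capacity from the cloud would take many years to cost as much as buying the extra cores up front, which is the appeal of bundling credits with the hardware.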
 