
aljawad
Something has been nagging my brain since WWDC: the repeated mentions of Linux during the keynote. What if Apple was hinting at something major in the pipeline?
For years the company attempted to enter the server market - think of the generations of Xserve, from the G4 to Intel. Yet while MacOS is an exquisite consumer/workstation operating system, it has failed to make a sizable entry into the enterprise market, a market now dominated by various flavors of Linux. Simply put, the enterprise runs on, and demands, Linux.
Now assume the migration to Apple Silicon (AS) is complete and Apple has an AS competitor to Intel's Xeon that runs more efficiently and carries better margins than Xeon-based systems. Why stop with future workstations like the Mac Pro or iMac Pro? Why not re-introduce a server branch of the Macintosh that runs not only MacOS Server but also some Linux variant - natively, instead of within a virtual environment?
It wouldn't be the first time the company marketed a product running a non-native operating system - remember the Apple Network Server (1996-97), which ran IBM's AIX? Furthermore, why stop at offering such a product to future clients when it could herald Apple's mass entry into the enterprise cloud computing market?
Apple is the largest tech company around, yet it is currently missing from an enterprise market where the likes of Amazon AWS, Microsoft Azure, Google Cloud and others are already massive players. In fact, current Apple services like iCloud, the various app stores and Apple Pay already operate on non-native infrastructure (it's estimated Apple leases AWS capacity for roughly $30 million a month), even though services amount to some 22% of the company's revenue according to the last quarterly report. That figure is second only to iPhone sales and roughly three times iPad sales. Why should Apple limit its future to the iPhone/iPad cash cows, along with its various computers that account for roughly 7% of end computer users?
Once Apple starts to use its own infrastructure for services, what will prevent the company from extending those services beyond the current consumer market into an even more lucrative enterprise market? Amazon has demonstrated what can be done with its custom-built AWS Graviton processors (developed using ARM Neoverse 64-bit cores) for its own AWS services, which amount to ~14% of quarterly revenue.
Others working on high-end ARM-based processors include Fujitsu, maker of the A64FX, the processor that powers Fugaku, the world's fastest supercomputer. And Ampere recently announced the 128-core Altra Max to complement the 80-core Altra it launched earlier this year.
Apple could potentially develop a powerful competitor, using its own AS solutions to create scalable, massive systems that could be managed with existing tools such as Macs, iPads and even iPhones. Apple would finally be able to add an enterprise class of products and services to its existing consumer and professional lines.
In short, the way I see it, Apple didn't invest billions of dollars in developing AS just to enhance laptop and desktop performance for status quo users; among the company's longer-term goals is domination of the enterprise cloud computing market.
 
Apple said they won't support direct booting of alternate OSes. Apple is more of a consumer company as well. I doubt they'll go back to offering server products different from their rack-mount Pros.

I don't think they even use Macs for backend things on iTunes/iCloud, etc.
 
Not sure Apple wants to compete with Azure, AWS, and GCS in the enterprise market. Pretty tough market that requires huge amounts of cash to build out infrastructure and multi-decade commitment to customers. It would be interesting though.
 
Why not re-introduce a server branch of the Macintosh that runs not only MacOS Server but also some Linux variant - natively, instead of within a virtual environment?
This is the most unlikely thing I’ve ever read related to Apple this year.

And I don’t think it’s a market that Apple targets. Too many players in there already, and too well established. MacOS would need a headless version, which could be incredible with Darwin and UNIX foundation, but very unlikely too.
 
You will not see a new Apple-branded server.

They want to push you to the cloud; either that or use third party servers.

Apple are not interested in putting out a server OS platform.

The server market has standardised on Linux for the cloud now, and there are already plenty of vendors who will sell you VM time in the cloud. Even traditional windows in-house enterprise shops are moving their stuff to the cloud these days (including my employer) because you simply can't get the same scale, ease of billing, etc. that cloud offers.
 
Given how Apple have been winding down their server software, I don't see them interested in this kind of thing.
 
It's pretty unlikely that Apple would try to compete with their own service providers (mostly AWS & some GCP - MS Azure in the past) by selling their own Infrastructure-as-a-Service. Apparently Apple spend about $350m per year on AWS services, which itself is half what they spent in 2017.

What Apple are doing is ramping up their own data center expansion (Project McQueen), but only for their own services (to reduce their huge cloud service bills), and not to sell to 3rd parties.

@aljawad: BTW, while Amazon Web Services may only be ~14% of Amazon's revenue, it represents about 55-60% of Amazon's profits. Apple is (I think) AWS' largest customer, probably followed by Netflix. Apple don't advertise this, and I expect there are other large customers who prefer to remain anonymous (e.g. US DoD, NSA etc.)
 
Something has been nagging my brain since WWDC: the repeated mentions of Linux during the keynote. What if Apple was hinting at something major in the pipeline?

Linux is extremely useful for people doing web- and other server-side development - for which it is generally an advantage to have it as a virtual machine that can be given its own IP address, cloned, snapshotted, reverted etc. for testing. But you deploy to a "real" Linux machine (or, these days, a virtual Linux instance somewhere in cloud land). Actually, MacOS is darned close to Linux anyway if you're not developing for the Mac GUI, and most of the server-side software has been ported. So you can do 90% of the typical web development work on MacOS.

I really don't see any incentive, in this day and age, for using a Mac as a production server. The Mac's advantages are the nice GUI (irrelevant on a server), Unix (but so is Linux, practically if not technically*), the industrial design (it's a server - who is going to look at it?) and the ability to run things like Office and Adobe CS (...which may be relevant to development/web authoring but aren't needed on the server).

When XServe came out, Linux wasn't at its current stage of development and servers were running Windows Server, commercial Unix, Knobbled Knitware or something proprietary from IBM - which all had rather expensive per user licensing fees. XServe wasn't free, but it didn't require per-user licensing, which was a big deal. Also, at the time, Macs were still heavily reliant on nonstandard protocols for file-sharing and the like (SMB was MS proprietary at the time), so the XServe (...or a Mac Mini running OS X server) was about the only choice for Mac workgroup/departmental servers. The whole "web services" thing barely existed in its present form.

Now, that's changed - Linux is industry-standard, doesn't even have a per-installation license (...just as little or as much tech support as you want to pay for) and is even winning against Windows Server, leaving Mac OS as the awkward licensing case. It runs on cheap generic hardware that is available to fit all requirements and budgets. On top of that, servers are increasingly cloud-based and out-sourced, which is only going to grow in the long term. Macs, meanwhile, have dropped most of the proprietary protocols for things like file-sharing (SMB has become effectively free) and AFAIK even the Mac/iDevice commercial management tools from IBM and the like run on Linux.

Apple dropped the XServe because the market went away - and it hasn't come back.

...also, cloud providers won't be buying Apple Silicon to put in servers for the same reason that Apple won't be buying Amazon Graviton chips to put in Macs: laptop/workstation processors are designed for laptop/workstation loads and server processors are designed for server loads. That will be even more true with ARM, where the pick-and-mix licensing makes it easier to make specialist chips - but it's true of Intel too: the Xeon-W chips in the (i)Mac Pro are not necessarily the ones you'd put in a server. So, Apple could design an Apple Silicon for Servers (er... hang on, not a good name :)) if they wanted to randomly and inexplicably branch out - but it would be a strange decision.


(*Technically MacOS is Unix but Linux isn't Unix - however, that's only because 'Unix' these days is just a certification and trademark-licensing deal that isn't really compatible with the Linux open source licensing model).

Apple said they won't support direct booting of alternate OSes.

(aside)... it's increasingly common with PC servers that the only thing you "direct boot" is the host OS for your hypervisor, and everything else runs as a virtual machine. I wouldn't be surprised if, long term, we see bare-metal hypervisors as a replacement for standardised firmware, and every OS runs as a virtual machine.
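For what it's worth, macOS already ships the plumbing for that model: the Virtualization framework (macOS 11+) lets a host process boot a Linux guest without any "direct booting" of an alternate OS. A rough sketch of the idea, as top-level code in a small command-line tool - the kernel/initrd/disk paths are placeholders and the binary needs the com.apple.security.virtualization entitlement:

[CODE]
// Sketch: boot a Linux guest as a VM on a macOS host using the Virtualization
// framework (macOS 11+). Paths are placeholders; purely illustrative.
import Foundation
import Virtualization

let kernelURL = URL(fileURLWithPath: "/path/to/vmlinuz")     // placeholder
let initrdURL = URL(fileURLWithPath: "/path/to/initrd")      // placeholder
let diskURL   = URL(fileURLWithPath: "/path/to/rootfs.img")  // placeholder

let config = VZVirtualMachineConfiguration()
config.cpuCount = 4
config.memorySize = 4 * 1024 * 1024 * 1024   // 4 GiB

// Boot the guest kernel directly - the host never "direct boots" anything else.
let bootLoader = VZLinuxBootLoader(kernelURL: kernelURL)
bootLoader.initialRamdiskURL = initrdURL
bootLoader.commandLine = "console=hvc0 root=/dev/vda"
config.bootLoader = bootLoader

// Root disk as a virtio block device backed by a raw image.
let disk = try VZDiskImageStorageDeviceAttachment(url: diskURL, readOnly: false)
config.storageDevices = [VZVirtioBlockDeviceConfiguration(attachment: disk)]

// Guest console wired to stdin/stdout so you can watch it boot.
let console = VZVirtioConsoleDeviceSerialPortConfiguration()
console.attachment = VZFileHandleSerialPortAttachment(
    fileHandleForReading: FileHandle.standardInput,
    fileHandleForWriting: FileHandle.standardOutput)
config.serialPorts = [console]

// NAT networking for the guest.
let net = VZVirtioNetworkDeviceConfiguration()
net.attachment = VZNATNetworkDeviceAttachment()
config.networkDevices = [net]

try config.validate()

let vm = VZVirtualMachine(configuration: config)
vm.start { result in
    if case .failure(let error) = result {
        fatalError("Guest failed to start: \(error)")
    }
}
RunLoop.main.run()   // keep the host process alive while the guest runs
[/CODE]

A real setup would add more devices (entropy, memory balloon, extra NICs), but the point stands: the guest runs as a VM and the host stays in charge.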
 
I really don't see any incentive, in this day and age, for using a Mac as a production server. The Mac's advantages are the nice GUI (irrelevant on a server), Unix (but so is Linux, practically if not technically*), the industrial design (it's a server - who is going to look at it?) and the ability to run things like Office and Adobe CS (...which may be relevant to development/web authoring but aren't needed on the server).

When XServe came out, Linux wasn't at its current stage of development and servers were running Windows Server, commercial Unix, Knobbled Knitware or something proprietary from IBM - which all had rather expensive per user licensing fees. XServe wasn't free, but it didn't require per-user licensing, which was a big deal. Also, at the time, Macs were still heavily reliant on nonstandard protocols for file-sharing and the like (SMB was MS proprietary at the time), so the XServe (...or a Mac Mini running OS X server) was about the only choice for Mac workgroup/departmental servers. The whole "web services" thing barely existed in its present form.

Now, that's changed - Linux is industry-standard, doesn't even have a per-installation license (...just as little or as much tech support as you want to pay for) and is even winning against Windows Server, leaving Mac OS as the awkward licensing case. It runs on cheap generic hardware that is available to fit all requirements and budgets. On top of that, servers are increasingly cloud-based and out-sourced, which is only going to grow in the long term. Macs, meanwhile, have dropped most of the proprietary protocols for things like file-sharing (SMB has become effectively free) and AFAIK even the Mac/iDevice commercial management tools from IBM and the like run on Linux.

Well put. For a specialized server, the ability to customize the various aspects of the core OS is a huge plus, and Linux excels at it. I don't see much point in competing with it in this market. For user space, a nice UI and a well-put-together, opinionated base distribution that offers a consistent experience are more important.
 
IDK... the right tools for the job at hand?

As someone who has actually owned Xserves, NeXTs and a boatload of other *nix boxes used as servers: what worked back in the olden daze was Sun. It made The Internet Go! That stopped being the case a long time ago, and what just works now is Linux. What just works for load balancing/firewall assist is *BSD.

Just like Linux will take over the desktop! For sure, by 2040 this time, no really, we mean it! ... uhm, no. No it won't. Game over. Linux will "take over the desktop!" never.

Same is true of Apple producing enterprise servers. I don't think there is anybody at all out there that urgently needs this. Never say never, but this is never gonna happen.

There was a time, when dinosaurs roamed the Earth, when Apple did use Solaris (the next iteration of SunOS) and OS X to run servers, but even their own datacenters have run Linux for a very long time.
 
Check out the NuVia vs. Apple lawsuit. From what has come out of it, it sounds like Apple was ignoring server opportunities for Apple Silicon (at least for the short term - we'll see how it scales up to Mac Pros and Mac minis).
Interestingly, Anand Shimpi was possibly linked to that lawsuit.
 
It's pretty unlikely that Apple would try to compete with their own service providers (mostly AWS & some GCP - MS Azure in the past) by selling their own Infrastructure-as-a-Service. Apparently Apple spend about $350m per year on AWS services, which itself is half what they spent in 2017.

What Apple are doing is ramping up their own data center expansion (Project McQueen), but only for their own services (to reduce their huge cloud service bills), and not to sell to 3rd parties.

@aljawad: BTW, while Amazon Web Services may only be ~14% of Amazon's revenue, it represents about 55-60% of Amazon's profits. Apple is (I think) AWS' largest customer, probably followed by Netflix. Apple don't advertise this, and I expect there are other large customers who prefer to remain anonymous (e.g. US DoD, NSA etc.)

That exemplifies my point: Apple would be its own main client for an AS-based server, freeing itself from relying on 3rd parties for its services - an expanding branch of revenue that could eventually be developed into a full-fledged enterprise cloud service. Standalone products based on the technology could then trickle down to clients. For business growth there is no such thing as relying on the status quo; new avenues must be explored.
 
Apple said they won't support direct booting of alternate OSes.
They can change their mind. They may not need to, though: while they have let MacOS Server atrophy completely, they could look to make a hypervisor-type option like ESX instead.

I don't think they even use Macs for backend things on iTunes/iCloud, etc.
They don't, but they would like to.


Not sure Apple wants to compete with Azure, AWS, and GCS in the enterprise market. Pretty tough market that requires huge amounts of cash to build out infrastructure and multi-decade commitment to customers. It would be interesting though.

They won't be looking to compete with AWS or Azure, too high maintenance. They might look to sell to AWS and Azure though.

And I don’t think it’s a market that Apple targets. Too many players in there already, and too well established. MacOS would need a headless version, which could be incredible with Darwin and UNIX foundation, but very unlikely too.

It's not a market Apple targets. Apple targets markets it can disrupt.
If Apple has a big, high performance workstation class server chip in the works that gives amazing performance per watt, they might be in a position to disrupt this market. Don't forget they can also customise their CPUs in other ways and include features that more general purpose CPUs don't have. That was the whole point of Apple Silicon.


Given how Apple have been winding down their server software, I don't see them interested in this kind of thing.

They definitely won't be interested in Server software.

I really don't see any incentive, in this day and age, for using a Mac as a production server. The Mac's industrial design (it's a server - who is going to look at it?) isn't needed on a server.

Those Xserves were super well built though. There are a lot of them still running.

When XServe came out, Linux wasn't at its current stage of development and servers were running Windows Server, commercial Unix, Knobbled Knitware or something proprietary from IBM - which all had rather expensive per user licensing fees. XServe wasn't free, but it didn't require per-user licensing, which was a big deal.

Apple dropped the XServe because the market went away - and it hasn't come back.

The Xserve's licensing was how they tried to position it against Windows Server, but it also had an architecture advantage until it switched to Intel. The AltiVec units in the G4 and G5, plus 64-bit on the G5, meant they could do certain tasks a lot better than Intel CPUs back then. When they went to Xeon, they were offering the same hardware as Dell and HP, except it was usually 6-18 months out of date and cost twice as much. It was doomed at that point.
Apple Silicon potentially reintroduces the architectural advantage - if not with specialised logic, then with much lower power consumption, which is a massive, massive concern for any data centre.
If Amazon, Google or Microsoft can save 10% on their electricity and cooling bills, they will bite your hand off. At 20% they'll start handing over their firstborns with truckloads of cash.

Laptop/workstation processors are designed for laptop/workstation loads and server processors are designed for server loads. That will be even more true with ARM, where the pick-and-mix licensing makes it easier to make specialist chips - but it's true of Intel too: the Xeon-W chips in the (i)Mac Pro are not necessarily the ones you'd put in a server. So, Apple could design an Apple Silicon for Servers (er... hang on, not a good name :)) if they wanted to randomly and inexplicably branch out - but it would be a strange decision.

I'm not sure it would be. If everyone else is using generic hardware and you are offering bespoke(-ish) silicon that spanks it on performance at a fraction of the power consumption, there's a market right there. Of course, they could just develop the chips for in-house use and, if they turn out well, sell a few as well.
Someone mentioned 10-year maintenance agreements. That's on the software. Build a hypervisor and let the customer worry about the software. Let someone else provide the maintenance. The hardware in these places doesn't typically stick around anywhere near that long.
 
If Apple has a big, high performance workstation class server chip in the works that gives amazing performance per watt,

"workstation class" and "server class" aren't quite the same thing - even the Xeon range has different horses for different courses, and with ARM it's going to be moreso, because ARM's big advantage is that you can use the space/power freed up by having smaller cores to load up with GPUs and/or specialist accelerator hardware. Apple's priority for any high-end "Apple Silicon Pro" SoC has to be for pro video, graphics and audio work - they're just not in the server business.

If Amazon, Google or Microsoft can save 10% on their electricity and cooling bills, they will bite your hand off.

Too late. Plenty of other people are already riding that horse.
...etc.

Not sure what MS, Google, Facebook et al. are doing - but they're all big and ugly enough to "make their own" bespoke ARM server chips, or they could just buy from Amazon.

Apple's "chance to disrupt" is with ARM chips optimised for desktop/laptop use, which *is* currently a hole in the market, and something that Apple are uniquely able to do because they (a) unlike MS & Windows, they actually have the power to move the whole MacOS platform to ARM within a few years and (b) they're already well along that road with the iPad. In the server world, they'd be up against well established competition, and MacOS is not a selling point for a server.


Those Xserves were super well built though. There are a lot of them still running.

I've had plenty of generic PC hardware that ran for a decade, with uptimes measured in years, too: obviously, if you buy cheap you buy twice, but "generic" doesn't have to mean "the cheapest rubbish you can get". There's nothing magical about Apple hardware in the long-term reliability stakes.

In fact, most of these servers are going to be safely bolted into racks in climate-controlled rooms, so it doesn't matter if they are made from the cheapest Chineseium available as long as the soldering is good and the heatsink paste is properly applied etc. - the truth is that the "classic" Mac Pro and XServe were somewhat over-engineered, while the new Mac Pro takes the concept of "over-engineered" to magical and courageous new heights...
 
Too late. Plenty of other people are already riding that horse.
...etc.

That Ampere chip runs at 210 W, which is interesting because 20 A12 chips would run at about the same and also give you 80 cores - not counting the efficiency or GPU cores.
The Huawei one runs at 180 W, so given it has fewer cores it's roughly in line. Can't see western companies scrambling to buy those, though, since they will likely be banned from military or other sensitive use cases.
It'll be interesting to see where Apple's offering comes in.
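Just to make the back-of-envelope explicit (the 210 W, 80 cores and 20 chips are the figures quoted above; the per-chip and per-core numbers below are merely derived from them, not published specs):

[CODE]
// Back-of-envelope only: 210 W and "20 chips / 80 cores" are the figures
// quoted above; everything else is derived from them, not a published spec.
import Foundation

let ampereWatts  = 210.0                     // quoted package power, 80-core part
let ampereCores  = 80.0
let appleChips   = 20.0                      // hypothetical multi-chip build-out
let coresPerChip = ampereCores / appleChips  // 4 cores/chip implied by the claim

let ampereWattsPerCore = ampereWatts / ampereCores         // ~2.6 W per core
let appleWattsPerChip  = ampereWatts / appleChips          // ~10.5 W per chip for parity
let appleWattsPerCore  = appleWattsPerChip / coresPerChip  // ~2.6 W per core

print(String(format: "Ampere: %.2f W/core", ampereWattsPerCore))
print(String(format: "Apple (claimed parity): %.1f W/chip, %.2f W/core",
             appleWattsPerChip, appleWattsPerCore))
[/CODE]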


I've had plenty of generic PC hardware that ran for a decade, with uptimes measured in years, too: obviously, if you buy cheap you buy twice, but "generic" doesn't have to mean "the cheapest rubbish you can get". There's nothing magical about Apple hardware in the long-term reliability stakes.

Never said it was magic, just that it was well built.

In fact, most of these servers are going to be safely bolted into racks in climate-controlled rooms, so it doesn't matter if they are made from the cheapest Chineseium available as long as the soldering is good and the heatsink paste is properly applied etc. - the truth is that the "classic" Mac Pro and XServe were somewhat over-engineered, while the new Mac Pro takes the concept of "over-engineered" to magical and courageous new heights...

So now they are magic?
 
That Ampere chip runs at 210 W, which is interesting because 20 A12 chips would run at about the same and also give you 80 cores.

...without the speed advantage of all the cores being on the same chip - not to mention other issues such as i/o bandwidth, cache sizes, whether the various on-chip acceleration technology was appropriate to the application, not wanting 20 GPUs or 20 secure enclaves etc.

Anyhow, you're missing the point: it's not that Apple couldn't produce a server-optimised chip, it's that it would be a pointless distraction from their main priorities which are #1: make the best chip for iPhone, #2 make the best chip for the MacBook range and #3: once #1 and #2 are out of the way, maybe make a "pro" chip for video/audio production work.

So now they are magic?

"magic" is wasted when it is locked away in the highest room of the tallest tower a rack buried deep in a data centre. If you want a server then elegant minimalist cases robot-hewn from solid aluminium and pretty (discuss) graphical user interfaces are irrelevant. All you need in a server is good-quality generic hardware. Or, increasingly in this day and age, an account with AWS/Azure/whoever and an internet connection...
 
While a server chip is fundamentally different from a workstation CPU, they are at least similar in scale and complexity. That bodes well for the next Mac Pro.
 
Just write your "GUI-less" server apps using Swift, and run them on a Mac mini in a colo.
And then one day, if you make some money from whatever app you're producing, port the code to a native Linux distro using Swift on Linux - it's open source, and this is what Apple intended, at least IMHO.

You won't get the GUI, like I hinted at, but for server apps you don't need it. And you can build a Linux farm no problem using Swift on Linux. Then create a server-viewer app that runs on a Mac to check in on the Linux farm...
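That workflow pretty much exists today. A minimal sketch of such a "GUI-less" service - a TCP echo server on SwiftNIO that builds unchanged on macOS and Linux, assuming a SwiftPM executable target with the swift-nio package (github.com/apple/swift-nio) declared as a dependency; the port and handler are just illustrative:

[CODE]
// Minimal "GUI-less" Swift service: a TCP echo server using SwiftNIO.
// Builds and runs unchanged on macOS and Linux; assumes swift-nio is a
// package dependency of this executable target.
import NIOCore
import NIOPosix

final class EchoHandler: ChannelInboundHandler {
    typealias InboundIn = ByteBuffer
    typealias OutboundOut = ByteBuffer

    // Write every inbound buffer straight back to the client.
    func channelRead(context: ChannelHandlerContext, data: NIOAny) {
        context.write(data, promise: nil)
    }
    func channelReadComplete(context: ChannelHandlerContext) {
        context.flush()
    }
}

let group = MultiThreadedEventLoopGroup(numberOfThreads: System.coreCount)
defer { try! group.syncShutdownGracefully() }

let bootstrap = ServerBootstrap(group: group)
    .serverChannelOption(ChannelOptions.backlog, value: 256)
    .childChannelInitializer { channel in
        channel.pipeline.addHandler(EchoHandler())
    }

let channel = try bootstrap.bind(host: "0.0.0.0", port: 9999).wait()  // illustrative port
print("Echo server listening on \(channel.localAddress!)")
try channel.closeFuture.wait()   // block until the listening socket closes
[/CODE]

Develop and run it on the Mac mini, then build the same package on the Linux farm when it's time to deploy.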

easy peasy!

Night, and laters...
 
I expect that if Apple does decide to build servers for internal use, it will not be selling them commercially, much like Amazon. This will help improve the security of the servers. They could, in fact, build in detection for any hardware alterations, either in software or a hardware/software combination.
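One software-side reading of that "detection" idea is a measured-boot-style integrity check: hash what is actually installed and compare it with a known-good measurement recorded at provisioning time. A toy sketch with CryptoKit - the image path and expected digest are made-up placeholders, and this is not a claim about how Apple actually does it:

[CODE]
// Toy integrity check: hash an installed image and compare it with a
// known-good measurement. Placeholders throughout - purely illustrative.
import Foundation
import CryptoKit   // the swift-crypto package offers the same API on Linux (import Crypto)

let expectedDigestHex = "replace-with-known-good-sha256-hex"    // placeholder
let imageURL = URL(fileURLWithPath: "/path/to/firmware.img")    // placeholder

// Hash the blob currently on disk and render it as lowercase hex.
func currentMeasurement(of url: URL) throws -> String {
    let blob = try Data(contentsOf: url)
    return SHA256.hash(data: blob).map { String(format: "%02x", $0) }.joined()
}

do {
    let measured = try currentMeasurement(of: imageURL)
    if measured == expectedDigestHex {
        print("Measurement matches - no alteration detected.")
    } else {
        print("ALERT: measurement changed - possible tampering.")
    }
} catch {
    print("Could not read the image: \(error)")
}
[/CODE]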
 
Something has been nagging my brain since WWDC: the repeated mentions of Linux during the keynote. What if Apple was hinting at something major in the pipeline?
For years the company attempted to enter the server market - think of the generations of Xserve, from the G4 to Intel. Yet while MacOS is an exquisite consumer/workstation operating system, it has failed to make a sizable entry into the enterprise market, a market now dominated by various flavors of Linux. Simply put, the enterprise runs on, and demands, Linux.
Now assume the migration to Apple Silicon (AS) is complete and Apple has an AS competitor to Intel's Xeon that runs more efficiently and carries better margins than Xeon-based systems. Why stop with future workstations like the Mac Pro or iMac Pro? Why not re-introduce a server branch of the Macintosh that runs not only MacOS Server but also some Linux variant - natively, instead of within a virtual environment?
It wouldn't be the first time the company marketed a product running a non-native operating system - remember the Apple Network Server (1996-97), which ran IBM's AIX? Furthermore, why stop at offering such a product to future clients when it could herald Apple's mass entry into the enterprise cloud computing market?
Apple is the largest tech company around, yet it is currently missing from an enterprise market where the likes of Amazon AWS, Microsoft Azure, Google Cloud and others are already massive players. In fact, current Apple services like iCloud, the various app stores and Apple Pay already operate on non-native infrastructure (it's estimated Apple leases AWS capacity for roughly $30 million a month), even though services amount to some 22% of the company's revenue according to the last quarterly report. That figure is second only to iPhone sales and roughly three times iPad sales. Why should Apple limit its future to the iPhone/iPad cash cows, along with its various computers that account for roughly 7% of end computer users?
Once Apple starts to use its own infrastructure for services, what will prevent the company from extending those services beyond the current consumer market into an even more lucrative enterprise market? Amazon has demonstrated what can be done with its custom-built AWS Graviton processors (developed using ARM Neoverse 64-bit cores) for its own AWS services, which amount to ~14% of quarterly revenue.
Others working on high-end ARM-based processors include Fujitsu, maker of the A64FX, the processor that powers Fugaku, the world's fastest supercomputer. And Ampere recently announced the 128-core Altra Max to complement the 80-core Altra it launched earlier this year.
Apple could potentially develop a powerful competitor, using its own AS solutions to create scalable, massive systems that could be managed with existing tools such as Macs, iPads and even iPhones. Apple would finally be able to add an enterprise class of products and services to its existing consumer and professional lines.
In short, the way I see it, Apple didn't invest billions of dollars in developing AS just to enhance laptop and desktop performance for status quo users; among the company's longer-term goals is domination of the enterprise cloud computing market.
I love the concept and see a different path for Apple. Apple's advantage with the M-series is performance per watt. This will improve dramatically over time, especially with TSMC, and may include new chemistry and physics going beyond new silicon processes and multi-chip fabrication. We may see a rebirth of the server, especially for AI applications where the cloud may not be such a great fit. Smart developers will develop tools and applications for high-end Macs. No other chip maker/foundry partnership will be able to leverage the volume that Apple has.
 
Read this thread started by me: https://forums.macrumors.com/thread...t-a-40-core-soc-for-mac-pro-now-what.2306486/

It's quite clear that more and more tasks that used to be done locally will move to the cloud. Apple can't escape this trend. I think it's not a matter of if, but when Apple will put Apple Silicon in the cloud and rent it out.

Now, what would it look like?

I think in the beginning, Apple will try to integrate the cloud into local devices. For example, you're on your MacBook Air, you do one swipe on your trackpad, and you're now on the desktop of your 128-CPU-core, 256-GPU-core Apple Silicon SoC running in the cloud. Knowing Apple, they will try to integrate the local and cloud experience into something seamless.

I can then see Apple moving into something headless, a la AWS/Google Cloud, eventually. But to start, I think Apple will only offer Apple Silicon Cloud to Apple users.

PS. I don't think Apple will bother supporting server software on local hardware. But I do think Apple will put Apple Silicon in the cloud.
 
Read this thread started by me: https://forums.macrumors.com/thread...t-a-40-core-soc-for-mac-pro-now-what.2306486/

It's quite clear that more and more tasks that used to be done locally will move to the cloud. Apple can't escape this trend. I think it's not a matter of if, but when Apple will put Apple Silicon in the cloud and rent it out.

Now, what would it look like?

I think in the beginning, Apple will try to integrate the cloud into local devices. For example, you're on your MacBook Air, you do one swipe on your trackpad, and you're now on the desktop of your 128-CPU-core, 256-GPU-core Apple Silicon SoC running in the cloud. Knowing Apple, they will try to integrate the local and cloud experience into something seamless.

I can then see Apple moving into something headless, a la AWS/Google Cloud, eventually. But to start, I think Apple will only offer Apple Silicon Cloud to Apple users.
For the average consumer, wouldn't internet speeds be an issue? Of course, if you have gigabit fiber I could see this being decent, but at least in the US most people have terrible internet connection options. I don't see cloud computing becoming mainstream for consumers until we have widespread, extremely fast internet.
 
For the average consumer, wouldn't internet speeds be an issue? Of course, if you have gigabit fiber I could see this being decent, but at least in the US most people have terrible internet connection options. I don't see cloud computing becoming mainstream for consumers until we have widespread, extremely fast internet.
Internet connections get faster over time. Anyone who needs to rent Apple Silicon in the cloud would have good internet.

Today, you can stream AAA games from the cloud, use Microsoft Windows hosted in the cloud, and even accelerate Google Chrome from the cloud. Just imagine in 5 years.

Apple isn't going to ignore all the advantages of the cloud. And when they want to offer more cloud services like the above, they aren't going to use Intel/AMD chips. They're going to use Apple Silicon. Hence, it's not a matter of if, but when Apple Silicon gets put in the cloud.
 
I guess the cloud makes sense if most of the data I need to process are stored in the cloud and it's not extremely time sensitive.

The tech landscape tends to go in cycles, from centralised to de-centralised and back again. It appears to be a function of advances in computational power and networking technology, and lately battery technology.

What I'm seeing now is that the cloud is the in thing, but it appears the shift is heading back toward the de-centralised model, at least from what I see Apple doing.
 