The point is, Apple can launch the service and get demand today.
A more important point is: what is the demand for "Mac in the cloud"? Obviously there is some demand - several firms run racks of Mac Minis for hire - but the question is whether the demand is big enough to get Apple out of bed.
The big selling points of Macs are their UI and sleek-looking hardware, both of which are irrelevant to a cloud "backend" server.
Going forward, the de-facto standard seems to be a backend running on Linux (which even Microsoft are now embracing) - often using open-source tools - and a client-side frontend running in a browser or a platform-specific app. There's no particular appeal to making your backend Mac-specific (or Windows-specific) when a Linux version lets you shop around all of the cloud providers for the best hosting deal (...and, since MacOS is Unix, the same code would probably run on a Mac anyway, with minor tweaks).
If we're talking about "virtual desktop PCs" running a GUI and accessed via something like remote desktop, there's a particular demand for that in the Windows world, given Windows' dependence on its ability to run 20-year-old legacy software and its huge presence in the corporate sector (...essentially, custom database frontends where the data is already in the cloud). I can see that becoming immensely popular for home working (it avoids employees having company data stored in their homes, and centralises all of the regulatory compliance box-ticking), and being the best long-term solution for Mac users who need to run Windows because of that one weird app they need for work.
Not sure that the case for a "Virtual Mac" - versus proper client/server applications - is so strong:
(a) The Mac isn't big on running legacy code, with mass software extinctions every 5-10 years (Classic, PPC, Carbon, x86-32, to be followed by x86-64 in a few years, plus lots of smaller extinctions due to security model changes etc.).
(b) "Pro" Mac users are biased towards media production, which means frequent access to huge local files (fancy uploading all your .raw files to the cloud before you start editing?) and dedicated local hardware. Sure, there are some (collaborative) scenarios where media-production-in-the-cloud makes sense, but that's not necessarily served by running FCP on a VM via Remote Desktop.
(c) I can just picture Apple letting people run MacOS-in-the-cloud via an RDP client on their Windows PC, Android tablet, Chromebook etc. Not. Apple could restrict it to clients running on Apple devices but, really, that defeats much of the object of having Mac-in-the-cloud.
Also, so far we just have the M1, which is a system-on-a-chip designed for ultraportables and tablets. Sure, in the short term it thrashes Intel's mobile offerings and is tempting "pro" Mac users, but it is not really a pro workstation or server chip. There's a reason why server hardware runs on Intel Xeon or AMD Epyc rather than laptop/desktop chips. There are ARM-based server-optimised chips out there - and you can get ARM servers in the cloud - but they're not Apple Silicon and won't run MacOS. The M1X still has to prove itself, and still isn't really a server chip.
So, as I said, it's not just whether there is interest - it's whether there is enough interest to make it worth Apple's while developing server-grade Apple Silicon chips.