Apple is running macOS on servers in datacenters. They have a lot of things that rely on macOS, such as QuickTime video/audio encoding and iOS/Mac App Store services. They're not running the regular macOS GUI and all that, but they do run a stripped-down version of macOS on Supermicro hardware, IIRC.
Do you have any links? QuickTime streaming hasn't depended on macOS for a while, and Apple moved from QTSS to HTTP Live Streaming (H.264), which is available on many platforms. iOS and Mac App Store services don't really require a Mac either. Serving downloads doesn't need a Mac server, and even code signing on developer upload probably doesn't require a Mac host.
Apple still "collect" metadata, but they "anonimize" it before sending it to Apple (or China).
That's the issue.
Anonymized metadata is fine for training the on-device models, but it's not useful for training live models based on an individual user's behavior.
So Apple doesn't need giant cloud farms for machine learning because they're not doing very much in real time.
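For what it's worth, the kind of anonymization Apple has described publicly is local differential privacy. Here's a generic randomized-response sketch (not Apple's actual pipeline, and `p_flip` is just an illustrative parameter) showing why the aggregate stays useful while any single user's reports don't:

```python
import random

def randomized_response(truth: bool, p_flip: float = 0.25) -> bool:
    """Report the true bit most of the time, a coin flip otherwise.

    Any individual report is deniable, which is why this works for
    population-level statistics but not for modelling one specific user.
    """
    if random.random() < p_flip:
        return random.random() < 0.5  # noise: answer at random
    return truth

def estimate_true_rate(reports, p_flip=0.25):
    """Server side: de-bias the noisy aggregate.

    observed = (1 - p_flip) * true_rate + p_flip * 0.5
    """
    observed = sum(reports) / len(reports)
    return (observed - p_flip * 0.5) / (1 - p_flip)
```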
No. Siri was the first victim of the Nvidia exit: they spent a year trying to move to ROCm (AMD) and even Xeon Phi (bare OpenCL). Then Apple ousted their AI CTO and poached Google's, who reinstated the CUDA training workflow and got a macOS/iOS version of TensorFlow Lite that can run CUDA-trained models.
I don't think there is any debate that Apple allows importing outside models. Core ML supports converting TensorFlow models.
But again, that's very different than Apple having any need for massive server farms for machine learning.
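To be concrete about what importing outside models means here: it's a converter, not a training stack. coremltools takes a model trained elsewhere and emits a Core ML model for on-device inference. A minimal sketch, assuming coremltools 4+ and a placeholder Keras model:

```python
import coremltools as ct
import tensorflow as tf

# Placeholder model -- in practice this would be something trained
# elsewhere (e.g. on CUDA GPUs in someone else's cloud).
tf_model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="softmax", input_shape=(16,)),
])

# coremltools converts the TF graph into a Core ML model that runs
# through Apple's on-device inference stack; no training happens here.
mlmodel = ct.convert(tf_model)
mlmodel.save("Classifier.mlmodel")
```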
On-device training is almost a joke unless you're training very trivial models.
Maybe, but it's what Apple is doing.
The sort of stuff Apple wants to do is figure out where you are going to drive your car next, or what app you're going to open next. Or what your cat's face looks like. It's not exactly complicated stuff.
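To put a number on "not exactly complicated": a toy version of a next-app predictor is just a frequency table you bump as the user switches apps. This is only an illustration of the scale involved, not Apple's implementation:

```python
from collections import Counter, defaultdict

class NextAppPredictor:
    """Toy first-order Markov model: which app tends to follow which."""

    def __init__(self):
        self.transitions = defaultdict(Counter)

    def observe(self, previous_app, next_app):
        # "Training" on-device is just incrementing a counter.
        self.transitions[previous_app][next_app] += 1

    def predict(self, current_app):
        counts = self.transitions.get(current_app)
        if not counts:
            return None
        return counts.most_common(1)[0][0]

predictor = NextAppPredictor()
predictor.observe("Mail", "Calendar")
predictor.observe("Mail", "Calendar")
predictor.observe("Mail", "Safari")
print(predictor.predict("Mail"))  # -> Calendar
```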
Apple spent a lot of money on Create ML and Core ML to give macOS/iOS something to toy with, but they're useless for serious creative or research work. Google is doing the same thing for Android, and they got there first on the NPU/TPU side: the original Pixel included one, as a discrete chip, before Apple did.
Apple isn't doing research oriented tasks with machine learning (as in, they're not building products for the research industry.)
The Mac Pro might be a machine aimed at researchers, but Apple is going to leave the software stack to other people. We could talk through the whole Apple and CUDA thing again but I think we'd just be going in circles about something that's already been talked about to death...
But Apple is just fine hanging out waiting to see if TensorFlow will add a Metal backend. And I think they might do more around multiple Afterburner cards and these sorts of tasks. It's really hard to see Apple becoming more CUDA-centric with all their hardware moving in directions away from CUDA.
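For what it's worth, that's roughly how it later played out: Apple shipped a Metal PluggableDevice plugin for TensorFlow. Assuming a Mac with the tensorflow-macos and tensorflow-metal packages installed, ordinary TF code just sees a GPU, with no CUDA anywhere in the stack:

```python
# pip install tensorflow-macos tensorflow-metal  (Apple's Metal plugin)
import tensorflow as tf

# With the plugin present, the Mac GPU is registered as a PluggableDevice
# and standard Keras/TF code runs on it unchanged.
print(tf.config.list_physical_devices("GPU"))
```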