
deconstruct60

macrumors G5
Mar 10, 2009
12,311
3,902
For a multi-user rack-mount server running VMs, yes.

If you are primarily running VMs, you can log into the hypervisor and do "under the instance" management about as effectively as IPMI can for instances running as 'raw iron' on some other system.
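As a rough sketch of what that looks like in practice (assuming a libvirt/QEMU hypervisor and the libvirt-python bindings; the guest name here is made up), resetting a hung VM from the host plays much the same role that an IPMI power cycle plays for a bare-metal box:

```python
# Sketch: hard-reset a guest from the hypervisor, the way IPMI would
# hard-reset a bare-metal server. Assumes a libvirt/QEMU host with the
# libvirt-python bindings installed; "guest01" is a hypothetical VM name.
import libvirt

conn = libvirt.open("qemu:///system")   # connect to the local hypervisor
dom = conn.lookupByName("guest01")      # find the guest "instance"

if dom.isActive():
    dom.reset(0)    # hard reset, independent of the guest OS state
else:
    dom.create()    # "power on" the guest if it was off

conn.close()
```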
 

Joe The Dragon

macrumors 65816
Jul 26, 2006
1,027
475
If you are primarily running VMs, you can log into the hypervisor and do "under the instance" management about as effectively as IPMI can for instances running as 'raw iron' on some other system.
Saying that Apple's hardware sucks for any real server use:
NO IPMI
NO storage hot-swap bays
A base storage change needs a second system to reload it, and you're forced to wipe
Low max RAM
-------
Apple does not have its own hypervisor OS
 

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,571
5,325

Pu provided the analysis based on supply chain checks in a new note to investors seen by MacRumors. Foxconn is said to currently be assembling Apple AI servers that contain the M2 Ultra, with plans to assemble AI servers powered by the M4 chip in late 2025. Last month, a reputable source on Weibo said that Apple was working on processors for its own AI servers made with TSMC's 3nm process, targeting mass production by the second half of 2025, which may line up with this report about M4-powered AI servers.
Here we go. I think this is how Apple can make Ultra/Extreme chips worth it.

I'm guessing the M2 Ultra is just Apple testing. The M4 mention likely refers to an M4 Ultra or Extreme; I doubt the base M4 will be used.

My current prediction is that Apple will eventually unveil a premium Siri LLM service that costs $10-$20/month. They might want the free version to run only locally, while the premium version would be more powerful and require an Ultra/Extreme SoC in the cloud. Both versions would be optimized for Apple Silicon.

This way, Apple can avoid Nvidia prices, optimize for its own chips, and actually be able to acquire enough AI chips.
 

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,571
5,325
On second thought, this "AI" server chip by Apple might not be an SoC. It could be a chip that contains only the Neural Engine but is significantly larger than any NPU on Apple Silicon, i.e., instead of a 16-core ANE, it's 256 cores or something similar.

It'd be similar to Google's TPU, Microsoft's Maia 100, or Amazon's Inferentia. Basically all huge NPUs.
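As a rough back-of-envelope sketch of what that core count could mean (assuming per-core throughput simply scales linearly from the M2's 16-core Neural Engine, which Apple rates at 15.8 TOPS; the 256-core count is just the hypothetical above):

```python
# Back-of-envelope: scale a 16-core ANE's rated throughput up to a
# hypothetical 256-core server NPU. Assumes per-core throughput stays flat,
# which ignores memory bandwidth, interconnect, and clock differences.
M2_ANE_CORES = 16
M2_ANE_TOPS = 15.8          # Apple's rated figure for the M2 Neural Engine

def scaled_tops(cores: int) -> float:
    """Naive linear scaling from the M2 ANE's per-core throughput."""
    return M2_ANE_TOPS / M2_ANE_CORES * cores

print(f"Hypothetical 256-core NPU: ~{scaled_tops(256):.0f} TOPS")  # ~253 TOPS
```

Real designs obviously don't scale linearly once memory bandwidth and interconnect come into play, but it gives a feel for the order of magnitude such a chip would be aiming at.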

It's possible that Apple could do both: using full SoCs in their data centers and using a custom ANE chip.
 