So they are fully customizable chips - with “apis” to the cores for faster calculations on specific data?
Seems like a very cool thing to do!
Of course. The API must be public, though. Apple's Accelerate framework is a good example of how to use vector computation on both Intel and ARM chips; it has been recompiled and re-optimized for ARM since its inception on iPhone chips. Same for Metal, which gives you access to the GPU resources.
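To make the Accelerate point concrete, here's a minimal sketch of its Swift vDSP interface: the same source compiles on Intel and Apple Silicon, and the framework dispatches to the appropriate SIMD hardware underneath. The specific values are just an illustration.

```swift
import Accelerate

// Two small input vectors (illustrative data only).
let a: [Double] = [1.0, 2.0, 3.0, 4.0]
let b: [Double] = [10.0, 20.0, 30.0, 40.0]

// Element-wise addition; vectorized under the hood (SSE/AVX on Intel,
// NEON on Apple Silicon) without any architecture-specific code here.
let sum = vDSP.add(a, b)

// Horizontal reduction, also handled by Accelerate.
let total = vDSP.sum(a)

print(sum)
print(total)
```

The portability is the point of the earlier post: code written against the public API didn't need to change when Apple moved the Mac to ARM.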
Exactly. You get rid of the GUI. You can't do that on macOS, so you always have to deal with its "very" memory-hungry GUI, even when using it in server mode. I'm not aware of any means of booting macOS without the UI.
PC desktops and PC servers are what you're thinking of; cmaier is thinking of all CPUs, and there are a lot of embedded CPUs out there!

Quad-core Intel i7... lol! J/k. Non-ARM processors, I guess...
Looks like you can CTO/BTO 10GbE on the M1 Minis. Kind of curious to see how they'd run in something like Sonnet's rack-mount PCIe enclosures. Add in some more networking & storage. Probably won't win any prizes in performance, but still a kind of cool idea.

M1 Mini for a server? Maybe in a year, after our organization vets the OS and apps, and if Apple comes out with a 10GbE or fiber port?
Currently I work at an organization where we have 20 of the Intel Mac Mini 2018, CTO/BTO with the optional 32GB RAM and 10GbE. These 20 Minis are tied to each location's MDF for Apple's caching service (Apple app & OS updates) as well as 12TB of local user storage for iPad & iPhone users (authenticated VLAN users only).