It seems Microsoft is now requiring a minimum of 16GB RAM to run their LLM locally.
I suspect within a year, possibly already at WWDC, Apple will announce their own locally run LLM.
If they're going to compete with Microsoft and others they will most likely need just as much RAM.
So... good luck to all those who claim 8GB is enough, your laptops will be outdated very soon. 🤷‍♂️
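To put rough numbers on the RAM claim, here's a back-of-the-envelope sketch. The model size (7B parameters), quantization levels, and 20% overhead figure are my assumptions for illustration, not anything Microsoft or Apple has stated:

```python
# Rough RAM estimate for running an LLM locally.
# Assumptions (illustrative only): weights dominate memory use,
# plus ~20% overhead for the KV cache and runtime buffers.

def model_ram_gb(params_billions: float, bits_per_weight: int,
                 overhead: float = 0.2) -> float:
    """Approximate GB of RAM needed to hold the model in memory."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 1e9

# A hypothetical 7B-parameter model at common quantization levels:
for bits in (16, 8, 4):
    print(f"7B model at {bits}-bit: ~{model_ram_gb(7, bits):.1f} GB")
```

Under these assumptions a 7B model at 16-bit barely squeezes into 16GB with nothing left for the OS, and even a 4-bit quantization (~4.2 GB) leaves an 8GB machine with little headroom.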
PS. It seems the M4 is capable of 38 TOPS, so in theory pretty much good enough for an LLM.
arstechnica.com

Microsoft’s “Copilot+” AI PC requirements are embarrassing for Intel and AMD
Microsoft demands an NPU capable of at least 40 trillion operations per second.
