
It seems Microsoft is now requiring a minimum of 16GB of RAM to run its LLM locally.
I suspect that within a year, possibly as soon as WWDC, Apple will announce its own locally run LLM.
If they're going to compete with Microsoft and others, they'll most likely need just as much RAM.

So... good luck to all those who claim 8GB is enough; your laptops will be outdated very, very soon. 🤷‍♂️

PS. It seems the M4 is capable of 38 TOPS, so in theory it should be good enough for an LLM.
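
Back-of-envelope on why 16GB keeps coming up: the weights alone take parameter count × bytes per parameter. A quick Swift sketch (the model sizes and quantization levels are illustrative assumptions, not anything Microsoft or Apple has confirmed):

```swift
import Foundation

// Rough weight-only memory footprint: parameters * bytes per parameter.
// Model sizes and quantization levels here are illustrative assumptions.
func weightGiB(params: Double, bitsPerParam: Double) -> Double {
    params * (bitsPerParam / 8.0) / 1_073_741_824.0
}

for (name, params) in [("3B", 3e9), ("7B", 7e9), ("13B", 13e9)] {
    let fp16 = String(format: "%.1f", weightGiB(params: params, bitsPerParam: 16))
    let q4 = String(format: "%.1f", weightGiB(params: params, bitsPerParam: 4))
    print("\(name): fp16 ≈ \(fp16) GiB, 4-bit ≈ \(q4) GiB")
}
// 7B: fp16 ≈ 13.0 GiB, 4-bit ≈ 3.3 GiB -- and that's before the KV cache,
// the OS, and every other app you have open.
```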

 
Hate to sound like a broken record but... locally run LLMs are only important to people for whom locally run LLMs are important. Buy enough RAM for YOUR purposes, let other people buy enough RAM for THEIR purposes.
 
According to a MacRumors article from last December, Apple would rather wear out your device's flash memory to run LLMs than give people a higher base RAM. ☹️

 
Apple actually published a paper earlier this year on running LLMs with limited RAM by utilizing the SSD as a slower tier of memory.
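
(That's the "LLM in a flash" paper.) The baseline version of "SSD as a slower tier" is plain memory-mapping; to be clear, that is not the paper's actual technique (as I understand it, the paper adds smarter, sparsity-aware loading on top), but it shows the basic idea. A sketch with a made-up weights file:

```swift
import Darwin

// Simplest form of "SSD as a slower memory tier": memory-map a read-only
// weights file so pages are faulted in from flash on first touch.
// "weights.bin" is a hypothetical file of packed model parameters.
let fd = open("weights.bin", O_RDONLY)
precondition(fd >= 0, "couldn't open weights file")

var info = stat()
fstat(fd, &info)
let size = Int(info.st_size)

// PROT_READ + MAP_PRIVATE: strictly read-only, so nothing is ever written
// back to flash and no write endurance is consumed.
guard let base = mmap(nil, size, PROT_READ, MAP_PRIVATE, fd, 0),
      base != MAP_FAILED else {
    fatalError("mmap failed")
}

// Touching the mapping faults in one page at a time; under memory pressure
// the kernel can simply drop clean pages and re-read them from flash later,
// rather than writing anything out to swap.
let weights = base.assumingMemoryBound(to: UInt8.self)
print("first byte of weights:", weights[0])

munmap(base, size)
close(fd)
```

The catch is that mmap still goes through the kernel's page cache, which is part of the overhead the paper tries to avoid, as discussed below.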

 
> It seems Microsoft is now requiring a minimum of 16GB of RAM to run its LLM locally.
> I suspect that within a year, possibly as soon as WWDC, Apple will announce its own locally run LLM.
> If they're going to compete with Microsoft and others, they'll most likely need just as much RAM.
>
> So... good luck to all those who claim 8GB is enough; your laptops will be outdated very, very soon. 🤷‍♂️
>
> PS. It seems the M4 is capable of 38 TOPS, so in theory it should be good enough for an LLM.

One thing I would add: while AI alone likely won't force Apple to update their stingy base specs, market competition most certainly will. Prior to the just-introduced Copilot+ PCs, most of the Surface line also started at 8GB/256GB. Apple doesn't spec and price their devices in a vacuum; they look at what the competition is offering. If a large number of prominent premium PC laptop vendors raise the RAM floor to 16GB, then Apple will eventually have to do the same (or meet them halfway at 12GB).
 
> According to a MacRumors article from last December, Apple would rather wear out your device's flash memory to run LLMs than give people a higher base RAM. ☹️

“Wear out”

When was the last time you saw an SSD on a laptop actually die from excessive use? Come on.
 
> According to a MacRumors article from last December, Apple would rather wear out your device's flash memory to run LLMs than give people a higher base RAM. ☹️

Do you know how much writing to flash actually occurs in Apple's improved algorithm?

I glanced at the paper and it doesn't mention repeated writing to flash, only reading the model parameters from flash. There would apparently be an initial write of the model data so it's in the structure the algorithm needs, but after that it seems to be read-only.

I don't know enough about how LLMs operate internally to say whether writing model parameters is frequent or not. From what I can follow of the paper, it's mostly a more efficient storage and RAM representation optimized for reading and computation. For example, it mentions that normal OS disk caching of reads both competes with other processes using the cache and requires access to file-system structures in flash; both can be avoided by reading sequential flash blocks directly into the LLM's working RAM space.
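
On macOS, the closest primitive for that kind of cache-bypassing sequential read is fcntl's F_NOCACHE flag. A rough sketch of the general technique (not what the paper itself implements; the file name and chunk size are invented):

```swift
import Darwin

// Sketch of reading model data sequentially while bypassing the unified
// buffer cache, so the LLM's reads don't evict other processes' cached
// data. F_NOCACHE is the macOS knob for this (roughly O_DIRECT elsewhere).
// "weights.bin" and the chunk size are made up for illustration.
let fd = open("weights.bin", O_RDONLY)
precondition(fd >= 0, "couldn't open weights file")

// Ask the kernel not to cache pages for this file descriptor.
_ = fcntl(fd, F_NOCACHE, 1)

let chunkSize = 4 << 20  // 4 MiB per sequential read; arbitrary choice
var buffer = [UInt8](repeating: 0, count: chunkSize)
var total = 0
while true {
    let n = read(fd, &buffer, chunkSize)
    if n <= 0 { break }  // 0 = EOF, -1 = error
    total += n
    // ... hand buffer[0..<n] to the inference code here ...
}
print("streamed \(total) bytes without touching the page cache")
close(fd)
```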
 
> “Wear out”
>
> When was the last time you saw an SSD on a laptop actually die from excessive use? Come on.
The SSD in my 4GB MBA died after four years. It was swapping like crazy.
 
> The SSD in my 4GB MBA died after four years. It was swapping like crazy.
And you know it died because of excessive writes... how, exactly? Modern SSDs have TBW ratings that are ridiculously high, and they often keep functioning well beyond the rating.
 
> And you know it died because of excessive writes... how, exactly? Modern SSDs have TBW ratings that are ridiculously high, and they often keep functioning well beyond the rating.
It's the opposite, actually. Modern SSDs are usually TLC or QLC.
Old SSDs were SLC and could take A LOT more writes.

/storage engineer
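
Back-of-envelope on how far apart those are (the P/E-cycle figures are commonly cited ballparks, not datasheet values, and real drives vary a lot with controller and over-provisioning):

```swift
// Back-of-envelope endurance: TBW ≈ capacity × P/E cycles ÷ write amplification.
// The P/E figures are commonly cited ballparks, not datasheet values.
func approxTBW(capacityGB: Double, peCycles: Double, writeAmp: Double) -> Double {
    capacityGB * peCycles / writeAmp / 1000.0  // GB written -> TB written
}

for (cell, pe) in [("SLC", 50_000.0), ("MLC", 3_000.0), ("TLC", 1_000.0), ("QLC", 300.0)] {
    let tbw = approxTBW(capacityGB: 512, peCycles: pe, writeAmp: 2)
    print("\(cell) 512GB: ~\(Int(tbw)) TBW")
}
// SLC: ~12800 TBW vs. QLC: ~76 TBW -- two orders of magnitude apart.
```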
 
Using the SSD for the LLM is better than using RAM, as the model is for the most part read-only, and SSDs are in most cases fast enough. If you use all your RAM for the LLM, you'll have to swap everything else out to the SSD, and that is what would really be detrimental to the TBW.
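
To put rough numbers on "detrimental to the TBW" (the rating and the daily write volumes are illustrative guesses, not measurements):

```swift
// Drive life under a given write load: years ≈ TBW / TB written per year.
// The TBW rating and daily swap volumes are illustrative guesses.
func yearsOfLife(tbw: Double, gbPerDay: Double) -> Double {
    tbw * 1000.0 / (gbPerDay * 365.0)
}

let tbw = 150.0  // assumed rating for a small consumer TLC drive
print("light use,  20 GB/day: ~\(Int(yearsOfLife(tbw: tbw, gbPerDay: 20))) years")
print("heavy swap, 200 GB/day: ~\(Int(yearsOfLife(tbw: tbw, gbPerDay: 200))) years")
// Reads are free as far as endurance goes: TBW only counts writes, so a
// read-only model sitting on the SSD costs nothing.
```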
 
> Using the SSD for the LLM is better than using RAM, as the model is for the most part read-only, and SSDs are in most cases fast enough. If you use all your RAM for the LLM, you'll have to swap everything else out to the SSD, and that is what would really be detrimental to the TBW.
Those poor SSDs are already getting hammered with swap from simple web browsing, and you know this. Oh, the humanity!
 
You know that joke meme, "men would rather ____ than go to therapy"? That's Apple with upping the base RAM.
 
> The M4 iPad still has 8GB. I'm sure Apple will somehow find a way to justify keeping 8GB as the base in Macs.
Assuming Apple designs their products with an end goal in mind, my guess is that they'll find a way for whatever new AI features they introduce to still run well on 8GB of RAM.
 
Apple has been using AI in their products for a long time. As for large language models, I think they're just a flash in the pan: this generation's Bitcoin.
 
> It's the opposite, actually. Modern SSDs are usually TLC or QLC.
> Old SSDs were SLC and could take A LOT more writes.
>
> /storage engineer
Again I ask: how did you know your SSD died from excessive use? Unless, of course, as a storage engineer it was just a hobby of yours to watch it die?

Even with TLC or QLC, how often are we seeing SSDs die in consumer laptops from excessive use? And why do you think Apple hasn't considered this in their approach to their AI models?
 