The 256GB SSD on the base model is more egregious than the RAM, I think, although the upgrade pricing for memory is a joke.

If you look at the Dell XPS 13 Plus as an equivalent machine on the PC side, you have to upgrade to the i7 chip to get 16GB of RAM, though the drive is 512GB even on the base model.
I agree. The SSDs cost even less than the RAM for Apple to source! It makes the SSD upgrade prices even worse than the RAM upgrade prices. My relatives and I, even the elderly ones, could all benefit from 1TB, and shouldn't have to pay £400 for it. 😞
 
You're wrong about LLMs. People want privacy. LLM and AI usage will explode in the next couple of years, and privacy concerns will dictate that a lot of it runs locally.
LLMs or AI will not be run on RAM! That's not how RAM works, or how data processing works!

Google it, mate. You or I could place an order today for $19–$20 a unit for 8GB in bulk. The exact same stuff Apple uses. Don't you suspect they get a bigger discount than that? Or were you under the illusion their RAM was somehow unique?


They could socket it if they wished. Again, Google is your friend. The speed and energy differences are tiny. A year ago this wasn't an option, I'll give you that. They won't socket it again, though, because it wouldn't benefit them: soldering saves them a couple of dollars, and their business model revolves around upselling RAM and storage. Storage, for example, does not benefit from being soldered. That's simply to lock us in, and save Apple approximately $1 on build costs.
News flash. You can’t compare swappable RAM prices with integrated, 'attached to chip' RAM because no one else does it. So again, you’re BS'ing about this. Asking for the price was rhetorical because I knew you couldn’t do it. Yet you managed to come up with an incomparable BS price. Good on you. There is no point continuing to talk with you when you’re just going to make stuff up for a reason I fail to see. Ciao.
 
I don’t notice the weight increase much at all. But the increase in screen size is very noticeable and welcomed.

Nothing wrong with the M2 13”. Just wanted a bigger screen :)
Why didn't you get the 16/512?
Any difference in speed between the 13" and 15" 8GB?
 
Can you show us the price of the RAM that’s physically integrated into the SoC please? Ta.
The RAM chips are not integrated into the Apple SoC. They are connected via a PCB. The RAM chips are just off-the-shelf LPDDR5 SDRAM chips from SK Hynix and other vendors.

From the picture below, the top part is the SoC and the bottom are the two RAM chips. They are soldered on a single PCB.


1709937064624.png
 
The RAM chips are not integrated into the Apple SoC. They are connected via a PCB. The RAM chips are just off-the-shelf LPDDR5 SDRAM chips from SK Hynix and other vendors.

From the picture below, the top part is the SoC and the bottom are the two RAM chips. They are soldered on a single PCB.


View attachment 2357065
That’s why I said they were 'attached' to the SoC in other posts. Probably 'integrated with' is a better description. Notwithstanding, they are not soldered onto the PCB as in previous versions. They are attached to reduce bus latency and increase bandwidth to work with the GPU etc. Something soldered to the board, or replaceable RAM, cannot do.
 
LLMs or AI will not be run on RAM! That's not how RAM works, or how data processing works!
What a thing to write! Hilarious. They don't "run" on RAM but you seem under the impression RAM isn't essential for them to run at any acceptable speed. 🤣🤣
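To put some rough numbers behind this: a model's weights have to sit somewhere (in RAM, or paged in from elsewhere) while it generates tokens, and a back-of-envelope calculation shows how quickly that adds up. This is a generic illustration, not tied to any specific model or machine; the figures ignore working buffers and KV cache, which need additional memory on top.

```python
# Rough memory footprint of an LLM's weights: parameters x bytes per weight.
# Illustrative only; real runtimes also need activation buffers and KV cache.

def weights_gb(n_params_billions: float, bytes_per_weight: float) -> float:
    """Approximate size of model weights in gigabytes (GB = 1e9 bytes)."""
    return n_params_billions * 1e9 * bytes_per_weight / 1e9

# A 7B-parameter model at common precisions:
print(weights_gb(7, 2))    # 16-bit weights: 14.0 GB -- exceeds an 8GB machine
print(weights_gb(7, 0.5))  # 4-bit quantized: 3.5 GB -- fits, with room to spare
```

The arithmetic is why quantization is so central to running models locally: halving the bytes per weight halves the RAM the weights occupy.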
 
The RAM chips are not integrated into the Apple SoC. They are connected via a PCB. The RAM chips are just off-the-shelf LPDDR5 SDRAM chips from SK Hynix and other vendors.

From the picture below, the top part is the SoC and the bottom are the two RAM chips. They are soldered on a single PCB.


View attachment 2357065
I suspect the person you are replying to isn't being serious and is just trying to rile people up. :(
 
What a thing to write! Hilarious. They don't "run" on RAM but you seem under the impression RAM isn't essential for them to run at any acceptable speed. 🤣🤣
Yes. And you seem to know all about something that is not actually a thing… yet. You do not run LLMs directly on a MacBook Air, and it’s not because of the RAM. But whatever.

I suspect the person you are replying to isn't being serious and is just trying to rile people up. :(
You clearly aren’t understanding the conversation. And you clearly don’t know how RAM is used. You think it will be the deciding factor of whether a 2024 MacBook Air will run LLMs and AI 🤦🏻‍♂️
 
What a thing to write! Hilarious. They don't "run" on RAM but you seem under the impression RAM isn't essential for them to run at any acceptable speed. 🤣🤣
There was a recent article about running LLMs using the flash storage on your iPhone, which I assume could also apply to Macs and iPads.


It sounds like one way of getting around the limited RAM on a device, and it tracks with Apple's record of using software to optimise hardware performance, allowing them to get by with lower specs and protect their hardware margins.
 
There was a recent article about running LLMs using the flash storage on your iPhone, which I assume could also apply to Macs and iPads.


It sounds like one way of getting around the limited ram on a device, and tracks with Apple's record of using software to optimise hardware performance, thereby allowing them to get by with using less specs and save on hardware margins.
Interesting. Thanks
 