
betabeta

macrumors 6502a
Jun 28, 2013
916
210
Or maybe the iPhone 16 Pro will have 24GB of RAM, why not?
Well, they could, and they could also charge an insane extra amount. Would people pay $1,500 for a new Ultra model that offers fully on-device, real-time, no-delay AI responses? I would, but I'm an outlier.

What I don't like is that they are very secretive about memory. It should be listed up front, with the ability to choose more memory on all versions, not just the Pro.
 
  • Like
Reactions: PeLaNo

betabeta

macrumors 6502a
Jun 28, 2013
916
210
What's interesting, in MKBHD's new video, he asked about the integration with OpenAI, and how much is done on device, etc.

He was told that almost everything is done on device (which is clearly why RAM is a factor), using Apple-built models, and that gives fast responses (whereas Android devices pushing most of it to the cloud will have slower responses, made even worse by a poor signal). So to me it's clear they're aiming for the best user experience with this. They say that anything too complex to run on device, or outside the on-device models' expertise, can be handed off to Apple's Private Cloud Compute.

If it thinks it's something ChatGPT can do better, it will ask whether you want to use ChatGPT - and it will ask every single time. It's not a 'say yes once and it will use ChatGPT whenever'; you confirm each individual request, which is nice to know: if you never want to use ChatGPT, you just say no. Also, OpenAI is never allowed to store any of the requests, and your IP address will be obscured, so OpenAI cannot connect multiple requests to profile you.
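That per-request flow, as described, could be sketched roughly like this (a hypothetical Python sketch - the field names and the complexity threshold are illustrative, not Apple's actual logic):

```python
# Hypothetical sketch of the routing described above; field names and the
# complexity threshold are illustrative, not Apple's actual implementation.

def handle_request(request, consent_to_chatgpt):
    """On-device first, Private Cloud Compute for harder tasks,
    ChatGPT only with fresh user consent every single time."""
    if request["complexity"] <= 0.5:
        return "on-device"
    if request["within_apple_models_expertise"]:
        return "private-cloud-compute"
    # Outside Apple's models: the user is asked on every request, not once.
    if consent_to_chatgpt(request):
        return "chatgpt"
    return "declined"

# A user who always says no never has a request reach ChatGPT.
print(handle_request(
    {"complexity": 0.9, "within_apple_models_expertise": False},
    lambda req: False,
))  # declined
```

The key point the sketch captures is that the consent callback runs per request, so there is no persistent opt-in to ChatGPT.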
I think this remains to be seen, because they also said you can use your Plus account and all those features... well, those features have to be stored on OpenAI's servers and linked to your account and your credit card or payment options. So if you upload an image and have an OpenAI Plus account, I'm 100% sure it will be linked to your OpenAI account. Now, if Apple or OpenAI offer memory features stored on device and uploaded encrypted to OpenAI's servers, OpenAI will get 50 million-plus Plus account users.

So I think a lot is still unknown, but it looks very promising. Without Apple, privacy and security would not be a thing at other companies; they only try to compete with Apple. That is my #1 reason I like Apple: privacy, and choice over what I allow apps to use.
 

Ansath

Cancelled
Jun 9, 2018
4,791
5,249
I think this remains to be seen, because they also said you can use your Plus account and all those features... well, those features have to be stored on OpenAI's servers and linked to your account and your credit card or payment options. So if you upload an image and have an OpenAI Plus account, I'm 100% sure it will be linked to your OpenAI account. Now, if Apple or OpenAI offer memory features stored on device and uploaded encrypted to OpenAI's servers, OpenAI will get 50 million-plus Plus account users.

So I think a lot is still unknown, but it looks very promising. Without Apple, privacy and security would not be a thing at other companies; they only try to compete with Apple. That is my #1 reason I like Apple: privacy, and choice over what I allow apps to use.

Most consumers on iOS probably won’t have a plus account, so your number is abnormally high.

Also, most actions will be done on device or using Private Cloud Compute, and nothing will be stored. Apple have been very clear about what goes to ChatGPT after it asks you, each time, if you want to use it. Watch the keynote again, or the Platforms State of the Union, to learn more.

Those Plus users will obviously have less of the privacy protection, but if you're signed up to paid ChatGPT, then you're already willing to give them your info, and you're an outlier compared to those not paying.
 

TonyC28

macrumors 68030
Aug 15, 2009
2,880
7,243
USA
What does everyone think will be the thing that is discovered in betas or announced with the new iPhones that will be in iOS 18 but only work on the new phones? Something AI perhaps? Maybe AI and camera-related to go along with the new cameras?
 

Eriamjh1138@DAN

macrumors 6502a
Sep 16, 2007
931
1,018
BFE, MI
The 8GB explanation would mean that all iPhone 16 models will have 8GB. I find that hard to believe.

If Apple “Intelligence” doesn’t run on all iPhone 16s, it would be a travesty. So something is amiss.
 

Paddle1

macrumors 603
May 1, 2013
5,140
3,572
The 8GB explanation would mean that all iPhone 16 models will have 8GB. I find that hard to believe.

If Apple “Intelligence” doesn’t run on all iPhone 16s, it would be a travesty. So something is amiss.
Why is that hard to believe? The iPhone 11 series all had 4GB, and the iPhone 14 series all had 6GB.
 

Verander

macrumors newbie
Jun 11, 2024
18
6
As a base iPhone 15 user, I would rather have the AI go to the servers and actually be useful rather than spend a lot of money so I can get away from an idiotic version of Siri. The RAM shortage is just bad foresight on Apple's part in my own personal opinion.
 

primarycolors

macrumors 6502
Oct 17, 2015
327
527
CA
The promise of a better Siri on new devices does not fix the unusable and unacceptable current state of Siri. I certainly hope the current Siri will see some improvement for the (vast majority of) devices that will not get the new one.

My Siri on CarPlay doesn't even understand how to pause music...
It's insane to me that my 3-year-old M1 iMac can run Apple Intelligence but not my 6-month-old iPhone 15.
 
  • Like
  • Sad
Reactions: fl010 and bmac4

bmac4

Suspended
Feb 14, 2013
4,885
1,877
Atlanta Ga
Everything I am reading says an LLM takes up anywhere from 2-4GB of RAM normally. That seems like a pretty hefty load on the device.

So, a couple of theories I have. First, Apple is doing some kind of magic with AI, and the resources aren't nearly that high, especially on mobile. The other theory is that Apple's claim that "most" requests are on device is based on Mac numbers, not legacy iPads and 15 Pros. This would make a lot of sense.

That said, under either of these theories the 15 and 15 Plus would be able to run AI. Also included would be at least the 14s and 14 Pros, and there's no reason to believe the 13 series wouldn't be able to as well.

That leads me back to what's been said around the forums. AI will be in beta all the way up to, or even beyond, the release of the 16 series. I have to believe it will launch right before the 16 event, or the same day the 16s launch. There is a possibility that Apple has tested it enough to know that these other devices can run AI, and that they at least release a version able to run on those devices. It's not unheard of for Apple.
 

Wizec

macrumors 6502a
Jun 30, 2019
675
741
Since I have no interest at all in “Apple Intelligence” features which will reduce memory available for other uses and eat up battery, this almost becomes an argument to purchase a non-Pro device.
 

Ansath

Cancelled
Jun 9, 2018
4,791
5,249
Everything I am reading says an LLM takes up anywhere from 2-4GB of RAM normally. That seems like a pretty hefty load on the device.

So, a couple of theories I have. First, Apple is doing some kind of magic with AI, and the resources aren't nearly that high, especially on mobile. The other theory is that Apple's claim that "most" requests are on device is based on Mac numbers, not legacy iPads and 15 Pros. This would make a lot of sense.

That said, under either of these theories the 15 and 15 Plus would be able to run AI. Also included would be at least the 14s and 14 Pros, and there's no reason to believe the 13 series wouldn't be able to as well.

That leads me back to what's been said around the forums. AI will be in beta all the way up to, or even beyond, the release of the 16 series. I have to believe it will launch right before the 16 event, or the same day the 16s launch. There is a possibility that Apple has tested it enough to know that these other devices can run AI, and that they at least release a version able to run on those devices. It's not unheard of for Apple.

Have you ever actually dealt with LLMs? From your post, I'm guessing not. You're off base. The complexity of an LLM needed for what is going to run on device means 8GB minimum.

Extremely simple LLMs can run on less, so that’s probably where you’re seeing 2-4GB RAM.
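The gap between those figures is largely precision: weight size scales with bits per parameter, so the same model shrinks to a quarter of its fp16 size at 4-bit. A back-of-the-envelope sketch (the 3B parameter count is an illustrative guess, not Apple's actual model size):

```python
# Back-of-the-envelope weight-size arithmetic; the 3B parameter count is an
# illustrative guess, not Apple's actual on-device model size.

def model_ram_gb(params_billions, bits_per_param):
    """RAM to hold just the weights (ignores KV cache and activations)."""
    total_bytes = params_billions * 1e9 * bits_per_param / 8
    return total_bytes / 1024**3

print(round(model_ram_gb(3, 16), 1))  # fp16: ~5.6 GB
print(round(model_ram_gb(3, 4), 1))   # 4-bit quantized: ~1.4 GB
```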

Also, Apple have said, and shown, that they are doing almost everything on device. The LLM on device works out when it needs to hand off to the cloud, which is itself complex, so it needs that power.

That 'almost everything is on device' applies to the phones too, so saying you 'think' that phrase is only for Macs and not iPhone 15 Pros or legacy iPads just shows you've not watched anything from WWDC, like the SOTU session, for instance.

Also, from what I know, all the Android devices that run AI on device have 8GB of RAM too, so limiting it to devices with 8GB or more isn't just an Apple thing.
 
  • Like
Reactions: TimFL1

bmac4

Suspended
Feb 14, 2013
4,885
1,877
Atlanta Ga
Have you ever actually dealt with LLMs? From your post, I'm guessing not. You're off base. The complexity of an LLM needed for what is going to run on device means 8GB minimum.
I guess you just proved my point. 8GB is the minimum, and that is all the 15 Pros have. So doing everything on the device seems pretty taxing. 😏

And no, I am pretty new to LLM tech, but I am reading credible articles and authors on the subject.

Edit: And yes, I did watch the entire keynote. I don't just take everything Apple claims in a marketing presentation as what we will experience in the real world. There's still a lot of beta testing to go. So you may have some knowledge of LLMs, but that doesn't mean it all holds true for Apple's version. For all we know, Apple just needed an iPhone to run AI and beta test on, and the only option was the 15 Pros. It's not like Apple has been working on this particular AI for years; this was recent.
 
Last edited by a moderator:

Ansath

Cancelled
Jun 9, 2018
4,791
5,249
I guess you just proved my point. 8GB is the minimum, and that is all the 15 Pros have. So doing everything on the device seems pretty taxing. 😏

And no, I am pretty new to LLM tech, but I am reading credible articles and authors on the subject.
Requiring 8GB of RAM doesn’t mean it’s using all of it all the time...

Watch the WWDC sessions, there’s a lot of information on how it’s actually working and what it’s doing, etc.
 
Last edited by a moderator:

bmac4

Suspended
Feb 14, 2013
4,885
1,877
Atlanta Ga
Requiring 8GB of RAM doesn’t mean it’s using all of it all the time...

Watch the WWDC sessions, there’s a lot of information on how it’s actually working and what it’s doing, etc.
I never said it was using the full 8GB of RAM. Honestly, I would hope it never does, because that would slow the phone to a crawl. I am saying I fear that to run this all the time on device, when tasks really start to ramp up, 8GB isn't going to be enough. That's my entire point, and why I don't believe the 15 Pros are Apple's answer to AI. This was just the beginning, to get them running.

If I have time I will check out some of it.
 

Verander

macrumors newbie
Jun 11, 2024
18
6
The promise of a better Siri on new devices does not fix the unusable and unacceptable current state of Siri. I certainly hope the current Siri will see some improvement for the (vast majority of) devices that will not get the new one.

My Siri on CarPlay doesn't even understand how to pause music...
It's insane to me that my 3-year-old M1 iMac can run Apple Intelligence but not my 6-month-old iPhone 15.
I agree. Just because the other iPhones don't have enough RAM to run it doesn't fix the unacceptable state of the current Siri. I am new to tech, but my question is: couldn't the non-15 Pros just send everything to either the Private Cloud Compute servers or ChatGPT?
 

kiranmk2

macrumors 68000
Oct 4, 2008
1,658
2,272
Yea, exactly. I was surprised the non-Pros only had 6GB of RAM last year; that was poor planning/design by Apple.

Surely Apple knew this was coming. Why limit the non-pro 15 to 6GB if Apple Intelligence requires at least 8GB? I think there will be backlash when this stuff goes live and people with 2 year old phones can’t use much (or any) of it.
Or was it a cunning plan to get more people to upgrade...?
I suspect that maybe they thought they'd be able to get it to run on 6GB by the time they were ready to roll it out - bearing in mind the iPhone 15 hardware was finalised well over a year ago.

It seems like a big miscalculation on Apple’s part to build their phones with the most advanced chips around and then hamstring them with insufficient RAM.

Surely both models of the iPhone 16 will have 12GB of RAM or more?? Since AI is here to stay, surely they won’t put the bare minimum 8GB in these new phones??
I think it's more likely that the iPhone 15/Plus's 6GB of RAM was finalised before Apple realised that genAI was going to be a big thing and they had to catch up quickly.

In terms of what happens next, Apple historically hasn't deviated from these kinds of decisions. However, the one time they did was in recent history, with the Stage Manager feature, which they initially limited to M1 iPad Pros due to "user experience" and then backtracked to add to the A12X/Z iPad Pros - so I wouldn't be surprised if a server-based AI came to more devices.
 
  • Like
Reactions: PeLaNo

Ansath

Cancelled
Jun 9, 2018
4,791
5,249
I agree. Just because the other iPhones don't have enough RAM to run it, it doesn't fix the unacceptable state of the current Siri. I am new to tech, but my question is couldn't the non-15 Pros just send everything to either the Private Cloud Compute servers or ask Chat-GPT?
It’s not as simple as that really, not with how it’s all been built.

It’s pants that basic Siri is staying the way it is, from what we know. Hopefully it gets some improvements, but I suspect they just won’t dedicate the resources to it, as the priority will be the Apple Intelligence version for the long term.
 
  • Like
Reactions: Tagbert

Verander

macrumors newbie
Jun 11, 2024
18
6
It’s not as simple as that really, not with how it’s all been built.

It’s pants that basic Siri is staying the way it is, from what we know. Hopefully it gets some improvements, but I suspect they just won’t dedicate the resources to it, as the priority will be the Apple Intelligence version for the long term.
That sucks. But thank you! One more question: would you recommend an M3 MacBook Air for gaming? Especially since the MacBooks all have the AI now, which could be useful for coding and so on.
 

UliBaer

macrumors 6502
Feb 10, 2024
303
575
Germany
Interestingly, everyone says 8GB is enough to run AI, but what about all the other apps usually running on your phone? If you want to use your phone the way you're used to, you'll need an *additional* 8GB just for AI, or the phone will constantly be swapping memory like the 8GB Macs do when short of memory.
 
  • Like
Reactions: redbeard331

eoblaed

macrumors 68040
Apr 21, 2010
3,087
3,202
Apple has made the artificial decision to limit (at least for now) their AI features to devices with certain hardware—not because it's needed to run things (since they admitted in the keynote that many AI features will do cloud processing), but because they can.

This is not an artificial decision. The NPU isn't the only factor at work; there's the RAM, battery size, thermal envelope, and more. I know everyone wants to engage in a lot of hand-wringing, but as I pointed out in another post, Apple was basically backed into a corner with this AI release and had to get something out the door *quickly*. Undoubtedly this is why there aren't a ton of new features in iOS 18 other than Apple Intelligence; they almost certainly cut a bunch of features and diverted all attention to Intelligence. They obviously would've preferred to take a couple of years to really polish this feature set and release it alongside an iOS release (instead of months later) -- that would've also built up a buffer of a few older models that could run Apple Intelligence without bringing the device, and its battery, to its knees.

This is an uncharacteristic move by Apple and one that's not an attempt to screw their user base (a company doesn't get to the size and reputation of Apple by doing that), but rather motivated by an unprecedented, rapidly moving paradigm shift happening across the industry.

As @Ansath pointed out, the common factor between all AI-supporting devices is 8 GB (or more) of RAM. However, that doesn't change the fact that Apple said AI processing will often happen off-device in many cases

Apple didn't say that. They explicitly said, multiple times, that most of the Intelligence stuff is done on device - both for privacy and latency reasons. The things they have to farm out to Private Cloud Compute are likely going to sit side by side with the on-device modalities. As someone who's built out systems like this before, I'd say it's unlikely the features are split cleanly between on and off device; more likely the entire system is a blend of both, making it impractical to offer just the 'off device' features to the older hardware.
 
  • Like
Reactions: PeLaNo and Tagbert

redheeler

macrumors G3
Oct 17, 2014
8,583
9,180
Colorado, USA
This is also pretty common knowledge in the AI field. An LLM is stored as a model - you can go browse ollama and look at the sizes of other publicly available models that you can run locally on your computer. The smallest practically useful models are in the neighborhood of 4GB or so, and the whole model gets loaded into RAM when it runs. Even if Apple developed a super-efficient LLM, it likely takes up at least 1.5-2GB as a reasonable guess. Factor in the OS and everything else your phone is doing, and you can start to see that 6GB is not enough.
If this is true, then it's going to be a nightmare for people with the 8GB RAM Macs who want to use these features. 8GB is bad enough right now without a quarter of it used up by the new features. Apple's stinginess with RAM has needlessly shortened the life of a lot of devices.
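A rough sketch of that budget (the OS-plus-apps figure here is a guess for illustration, not a measurement):

```python
# Rough RAM-budget sketch: model weights resident in memory plus everything
# else the device runs. The 4.5 GB OS-and-apps figure is an illustrative guess.

def fits(total_ram_gb, model_gb, os_and_apps_gb=4.5):
    """Return whether the model fits and how much headroom is left."""
    headroom = total_ram_gb - model_gb - os_and_apps_gb
    return headroom >= 0, headroom

print(fits(6, 2.0))  # (False, -0.5): 6 GB comes up short
print(fits(8, 2.0))  # (True, 1.5): 8 GB fits, with little to spare
```

Under these assumed numbers, anything below the headroom line gets covered by swapping, which is exactly the behavior people complain about on 8GB Macs.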
 
  • Like
Reactions: PeLaNo

redheeler

macrumors G3
Oct 17, 2014
8,583
9,180
Colorado, USA
The most frustrating part of this is that we’ve been told for years how RAM isn’t that important, and that Apple doesn’t need higher amounts of RAM on its devices because of how optimized iOS is. And now Apple has come across a technology that needs power that only the premium version of a 9-month-old device can handle - and barely at that, from what it sounds like.

It seems like a big miscalculation on Apple’s part to build their phones with the most advanced chips around and then hamstring them with insufficient RAM.

Surely both models of the iPhone 16 will have 12GB of RAM or more?? Since AI is here to stay, surely they won’t put the bare minimum 8GB in these new phones??
Apple has always told us that RAM isn't important, but it has undercut on it to the point that devices lost years of usable life; this isn't a new thing. I remember what happened with the iPhone 6 having the same amount of RAM as the iPhone 5, at a time when iOS and apps were rapidly becoming more demanding; when we finally got a RAM increase in the 6s, it turned out to make a big difference right away. Apps reloading less often was a difference even a casual user could feel.
 

SmugMaverick

macrumors 6502a
Aug 31, 2017
904
2,580
UK
Sorry to ruin all your research, but it's all about RAM.

Google got a HUGE backlash for announcing Gemini Nano for the Pixel 8 Pro only, because the Pixel 8 and 8a also had the 8GB of RAM needed for on-device AI.

They quickly changed course, and now it's available on all three phones; the Pixel 9 series will have 16GB of RAM.

Apple did not plan this well at all, and the lack of AI on the 15 and 15 Plus is proof they were never going to release AI on this year's phones; they were definitely pushed into announcing it earlier than they wanted.

It's been known for a while that you need 8GB of RAM minimum, and even the 16 series is still on 8GB according to leaks; expect the 17 series to have 12GB.

Basically, Apple have been disgustingly greedy with RAM for years, and now it's a slap in the face for 15 owners. They deserve far more media scrutiny over this, but the tech media is scared of not getting invites, T-shirts, pins, and selfies with Tim.
 