How's the "iPhone users only care about experience, not pointless specs" working out for ya?
The vast majority don’t?

Expecting every device to work with the bleeding edge of AI is kind of a silly thing. Can’t exactly go back and change a chip from 4 years ago to support something tomorrow. Nor do you want to inhibit what you can do today.
 
The vast majority don’t?

Expecting every device to work with the bleeding edge of AI is kind of a silly thing. Can’t exactly go back and change a chip from 4 years ago to support something tomorrow. Nor do you want to inhibit what you can do today.
The topic is RAM. Apple was still shipping only 6GB in their flagship in 2022. Samsung was shipping 8-12GB in theirs back in 2020.

working out great for Apple and their shareholders, though!
 
Integrating ChatGPT into Siri is a terrible idea and opens a massive privacy risk: millions of people's queries, potentially containing sensitive info, will be sent to OpenAI's servers, and I'm pretty sure OpenAI stores all chat logs for further training.

I used ChatGPT maybe a couple of times and found it useless (maybe only useful for cheating on schoolwork). This is adding a ton of useless bloat no one needs or asked for.
 
Wonder if the HomePod will benefit at all from the improved Siri? I can choose not to use Siri on my Macs, iPad, and iPhone to keep my blood pressure down, but it's not as easy to opt out when using the HomePod.
 
Apple doesn’t buy servers, switches or anything from anyone. At their size, it’s all in-house.
Are you saying they manufacture their own switches, as opposed to contracting others to build them?
 
The topic is RAM. Apple was still shipping only 6GB in their flagship in 2022. Samsung was shipping 8-12GB in theirs back in 2020.

working out great for Apple and their shareholders, though!

Apples to oranges. Different RAM, different architecture, different languages. Everything running on iOS is written in compiled languages like Objective-C and Swift, so the memory footprint is exceedingly efficient and fast. Contrast that with how much of the Android world runs in the JVM. It's like saying an F-350 is faster than a Mustang because it has more power.

And to your point, consumers don’t care about this. It has no bearing on anything. It’s easier to say AI works with iPhone 15 vs you need this CPU and this RAM.
 
Are you saying they manufacture their own switches, as opposed to contracting others to build them?
They use an ODM. They use the same supply chain that makes their laptops to make servers and switches. The OS, design, and firmware are in-house. Even the network cards.

None of the companies like Dell, Cisco, or HP make their own stuff. It’s all Quanta, Wistron, Huawei, Foxconn, etc.
 
Integrating ChatGPT into Siri is a terrible idea and opens a massive privacy risk: millions of people's queries, potentially containing sensitive info, will be sent to OpenAI's servers, and I'm pretty sure OpenAI stores all chat logs for further training.

I used ChatGPT maybe a couple of times and found it useless (maybe only useful for cheating on schoolwork). This is adding a ton of useless bloat no one needs or asked for.
It’s not that. You are using Apple Intelligence. ChatGPT is for things that you would Google. The internal AI deals with privacy-related queries. If you ask for, say, a recipe or how to make your concrete set faster, you can choose to send that to ChatGPT with a confirmation dialog.
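A minimal sketch of the routing described above — this is my own guess at the flow, with made-up category hints and function names, not Apple's actual implementation. Personal-context queries stay with the on-device model; world-knowledge queries are offered to ChatGPT only after the user confirms:

```python
# Hypothetical keywords that mark a query as personal-context (illustrative only).
PRIVATE_HINTS = ("my calendar", "my messages", "my photos", "remind me")

def route_query(query, user_confirms_cloud):
    """Decide whether a query is handled on device or offered to ChatGPT."""
    if any(hint in query.lower() for hint in PRIVATE_HINTS):
        return "on-device model"      # personal-context queries never leave the phone
    if user_confirms_cloud(query):    # world-knowledge query: ask before sending
        return "ChatGPT"
    return "on-device model"          # user declined; fall back to the local model

# Usage: the second argument stands in for the confirmation dialog.
print(route_query("what's on my calendar tomorrow", lambda q: True))
print(route_query("recipe for carbonara", lambda q: True))
```

The key design point is that the cloud hand-off is opt-in per query, not a default.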
 
And to your point, consumers don’t care about this. It has no bearing on anything. It’s easier to say AI works with iPhone 15 vs you need this CPU and this RAM.

Has no bearing on anything? You have one iPhone model that supports AI because... reasons.

Yeah, it's much easier not to explain how Apple being a cheapskate screwed over their customers. Much less discomforting that way. Let's pretend it's just an unfortunate turn of events and not decisions made by Apple. And we'll never get any clarification or explanation from them, either, leaving people to argue in circles instead.
 
It's literally a carbon copy of what Google and Microsoft have had for years now. Wow. The good times at Apple are over for good. They have no more ideas.
I don't think that any company has transcribed user data into a vector database on the local device, then set up a simple LLM to selectively feed just the right vectors to a much larger LLM running on a cloud. That is not easy. That is not even close to easy. I have played with that myself. Doing so on a cellphone without having it melt or burst into flames is a significant challenge. Just converting the data into vectors could quite likely drain the battery in less time than it takes to sneeze.
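The "small local model feeds just the right vectors to a big cloud model" idea above can be sketched in a few lines. This is a toy, assuming a bag-of-words embedding and cosine similarity (a real pipeline would use a learned embedding model); the point is only that ranking happens locally and just the top matches leave the device:

```python
import math
from collections import Counter

# Toy stopword list; a real tokenizer would do far more.
STOPWORDS = {"is", "my", "the", "a", "to", "in", "when", "for"}

def embed(text):
    """Toy bag-of-words 'embedding'; stands in for a learned vector model."""
    return Counter(t for t in text.lower().split() if t not in STOPWORDS)

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def top_k(query, documents, k=2):
    """Rank locally stored snippets; only the best k would go to the cloud LLM."""
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

notes = [
    "flight to Denver departs Friday 9am",
    "mom's birthday is in June",
    "dentist appointment moved to Tuesday",
]
print(top_k("when is my flight", notes, k=1))
```

Even in this toy form you can see why it is battery-hungry: every stored snippet has to be embedded and scored, which on a phone pushes you toward precomputing and caching the vectors.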
 
RAM is everything in an LLM. Context. The more you can load into memory, the more useful the AI. RAM is why support is limited to the newest phones, not compute.
I suspect that context memory will all be in the cloud. The local device will take local data, probably convert it into vectors, then feed the cloud hosted GPT instance the data that it needs. The remote host will run a Langchain tool that requests the data from the small LLM on the local device. That data will go into context in one way or another.

You need context memory to be on the device doing the actual computation.
 
I wonder if the mapping data is held under proprietary/copyright protection? Apple would have to pay a lot of money for access to potential competitors' data. Apple Maps and Google Maps are separate for driving directions...
There are open-source topo maps and licensed ones. Anyway, I think Apple Maps already encompasses many layers of proprietary data (routes, traffic data, 3D buildings, etc.), so... if even third-party apps in the App Store provide topo maps without breaking a sweat, I really don't understand why Apple can't afford to reach a deal to embed topo maps for the whole Earth.
 
For those who have been trying to say that Apple is using ChatGPT for all of Apple Intelligence: MKBHD has officially confirmed with Apple that everything on device is happening via their own models, and anything that goes beyond Apple’s models will go to ChatGPT.
 
The 15 Pro series is the only iPhone series, to date, with at least 8GB RAM.
And it is Apple themselves who decided not to give the 15 the 8GB, so I don’t get the point of this. The Galaxy S10 had 8GB of RAM five years ago, at a lower price point, and it was a beautiful device for its time with no compromises in internals, camera, screen, and so on. Apple makes the decisions for both the product and the software it runs, so why skimp on the RAM when it's going to be an issue?
 
I can see why more RAM will be needed on future iPhones if this is the start of what they are doing. I'm sure they will bring some new things for the 16 Pros as well relating to AI.

Hopefully it can help the image processing for the 16 series as well.
 
Just finished reading; I am impressed by the very little I understood.
Technologically, Apple Intelligence seems to be mostly the same thing all the other companies have been doing with large language models.

The biggest takeaways for me are:

1. There is a small model that will run on device for specific simpler tasks, which is good in my opinion.
2. Apple's "innovation" in this space is mostly about integrating it strategically into the OS.

One of the biggest problems with large models like GPT-4 is that it is very hard for a novice to discover what they are capable of. It's mostly just a text box, and as a user you are supposed to guess what to do with it.
 