As usual, investors don't understand what they heard today. They'll probably have to attend some technical session or ask a consultant to dissect it for them.
Everyone is talking about ChatGPT and how great it is (I use it almost every day). It is great, no question.
But ChatGPT has no idea about you, your friends, your family, your habits, your photos, what you like and dislike, etc. It has no idea about your friends' birthdays, their tastes, or what gift they might like.

Apple Intelligence will, and that will save people a lot of time. We are bombarded with so much information that we are becoming overloaded, which is why we now need an "assistant" (AI) in our daily lives. We don't have time to read an article; we need it summarized.
We don't have time to go through thousands of photos to create a story, because our memory is overloaded with images and we can't say exactly when a photo was taken. Maybe all you remember is the t-shirt your friend was wearing.

The fact that Apple Intelligence will have context about you, your emails, your photos, your friends, what they text you, and how you respond to each of them is going to make things easier. It's going to streamline our busy lives and maybe, for once, let us relax.

Apple did not ignore ChatGPT. They're incorporating it as an "option" to help answer questions about the outside world, not questions in your world. And ChatGPT has competition: Google is breathing down their neck and Mistral is not far behind.

Now, I want to see how Microsoft integrates OpenAI into their OS and their mobile phone. Oh, sorry, they don't have one. My bad. Let's see how it works on their laptops and what kind of GPU they'll need to run these models.

Apple made the right move. They looked "inside" before looking outside: how can we use AI to help our customers with their devices? How can AI help them, assist them?

What I saw today is just the beginning. And with Vision Pro, or a cheaper "Vision," they almost have "Jarvis."
 
As I was watching all these AI features about an hour ago, I simply did not think my 13 Pro Max would be compatible. I was thinking 14 and newer.

Wow... 15 Pro (Pro Max) and, I guess, the upcoming 16 Pro / Pro Max models?

Eh, I don't even own the 14 Pro, and I kinda think leaving out the 14 models is pretty lame.
 
Some cool use cases, but I can't help noticing that for a few of them it still takes longer to tell or type what you want than to just do it yourself.

We'll see how this works when people start using it. And we'll see how reliable it is, as AI is known to be… quirky, lol.
I think proofreading your emails will save a lot of time and resources. I have a bad habit of sounding condescending or rough in writing, when that's actually not my intention. When I'm writing for work I have to re-read every email I send to make sure I don't sound off.

Same with searching for photos and emails. But again, I don't think this needs an iPhone 15 Pro or an M1 or better to be done. But I digress…
 
Am I the only one surprised that App Intents are still a thing? If Siri is going to process all (or at least most) of my data on my device, why not just give it access to all my installed apps? Then I wouldn't have to wait for a developer to add a specific item before it can see my calendar, web browser history, Instagram messages, etc., and it could actually find all my information, not just what's in apps whose developers bothered to support it.
I just think this could end up like Netflix on the Apple TV, which won't participate in Up Next and the TV app, so it can't fully integrate and become a hub for my entertainment. I'd want the option to grant or revoke full permission, or allow specific App Intents where the developer has programmed them, so Siri is actually useful even when I'm not bound to Apple's first-party apps.
 
Can I just have it automatically make all of my Slack messages more friendly? I’m just tired of dealing with people who don’t know how to do their job and follow process. It’s the only negative feedback I get on performance reviews.
 
In the Platforms State of the Union video, they said their AI cloud compute servers are running a variant of iOS. I think this is finally the explanation for that mysterious “ComputeModule” device that was spotted as a target for running iOS. (The one that some Mac Pro fans were wishing would be an expansion module for Mac Pros.)

“Private Cloud Compute is designed specifically for processing AI, privately. It runs on a new OS using a hardened subset of the foundations of iOS, based on our industry leading operating system security work.”
 
Apple "Intelligence" my butt. Apple is trying to pretend they did the hard, intellectual work when this is just a repackaged ChatGPT program. It's the same shenanigans they pulled with Apple Silicon, where they pretended their "designs" are the reason their chips perform well, when chip design is insignificant and about as intellectually hard as ordering pizza.

Stop taking credit for other companies' inventions!

Perhaps before making statements like that you should watch the video or read up. Apple Intelligence is not powered by ChatGPT and has nothing to do with ChatGPT. Differential privacy with AI is an incredibly hard problem to solve.

The ChatGPT bit is an opt-in add-on, which makes apps and app devs' lives easier by giving them a uniform framework for Apple Intelligence, ChatGPT, and others.

This is 100% Apple in house. It’s the only way to do all the personal context and entity relationships on-device.
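For readers curious what "differential privacy" means in practice, here is a minimal sketch of the core idea (the Laplace mechanism): add calibrated noise so aggregate answers stay useful while any single person's contribution is masked. This is a generic textbook illustration, not Apple's implementation; the query, data, and epsilon value are all invented.

```python
import random

def dp_count(values, predicate, epsilon=1.0):
    """Count matching items, adding Laplace noise calibrated to
    sensitivity 1 (one person changes the count by at most 1)."""
    true_count = sum(1 for v in values if predicate(v))
    scale = 1.0 / epsilon  # noise scale = sensitivity / epsilon
    # A Laplace sample is the difference of two exponential samples.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_count + noise

# Smaller epsilon = more noise = stronger privacy, less accuracy.
ages = [31, 45, 28, 52, 39, 61, 24]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5)
# noisy hovers around the true count (3) but masks any individual
```

The hard part Apple is claiming to have solved is doing this kind of privacy-preserving computation at the scale of a personal assistant, not the mechanism itself.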
 
What you're describing is more akin to what fabs do. Apple does the easier, high-level stuff like core count, core size, etc. It's not hard to list a spec sheet.

In our analogy, Apple would be the customer picking the size and the toppings with barely any technical know-how (which anyone can do), while the fab works out the harder, intricate details of the pizza.

Oh, the irony of someone named High IQ who has repeatedly demonstrated a complete lack of knowledge.

The chip designs and layouts are done at Apple. Everything from the GPU to the NPU to the shared-memory architecture is Apple's work. If we're making analogies, they send the blueprint over to TSMC to "print" the wafers. TSMC is not a chip designer. Apple has some of the best silicon designers in the country and is among the only ones working at the 2nm process node and beyond.

It's not "here are some cores and here are some clock rates." That's like saying all an architect does is say "build me 3 bedrooms." Every facet of the silicon design, where the circuits go, how many they can pack in, the three-dimensional relationship of things, etc., is designed by Apple. How many cores a chip gets isn't designed; that's a result of binning. They design the chips, TSMC prints them, and when a die prints near the edge of a wafer with 8 working cores instead of 10, they sell it as a new model instead of throwing it away. There's virtually no difference between Intel's i5, i7, and i9 except how good the silicon yield is. No one "designs" for cores. Cores are a result of physics and yield.
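The binning point can be sketched with a toy simulation: each core on a die has an independent defect chance, and the die is sold as the tier matching its working-core count. All numbers here (defect rate, tier cutoffs) are made up for illustration; real processes are far messier.

```python
import random

def bin_chips(n_dies=100, cores_per_die=10, defect_rate=0.05, seed=42):
    """Simulate binning: dies with all cores working sell as the top
    tier, partially defective dies sell as a lower tier, and dies
    below the cutoff are scrapped."""
    rng = random.Random(seed)
    bins = {}
    for _ in range(n_dies):
        working = sum(1 for _ in range(cores_per_die)
                      if rng.random() > defect_rate)
        if working == cores_per_die:
            tier = "10-core"
        elif working >= 8:
            tier = "8-core"  # sold as a cheaper SKU, not thrown away
        else:
            tier = "scrap"
        bins[tier] = bins.get(tier, 0) + 1
    return bins
```

Even a modest per-core defect rate means a large fraction of dies fall short of the top bin, which is exactly why lower-tier SKUs exist.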

You may be High IQ, but you are poorly informed. Fabs are not designers; they are manufacturers. Intel is going to make ARM chips for Nvidia, for example. Nvidia designed the chip and licensed the rights to the ARM instruction set; Intel is just "3D printing" the silicon.

And let's not forget, Apple basically bankrolls TSMC's process-node advancement, because hardly anyone else tries to be on the bleeding edge at 2nm and 3nm. The yields get so ****** the smaller you go that you are throwing away 40-50% of the silicon, and Apple is the only company that can commit that kind of money and throw half the product away. Everyone else waits for Apple/TSMC to perfect 3nm and then moves onto it while Apple is at 2nm or below.
 
I installed iOS 18 on an iPhone 15 Pro Max, iPadOS 18 on a 13" M4 iPad, and watchOS 11 on an AW Ultra 2.

Right now almost none of the functionality shown is available.

On the phone and iPad you can change the icon colors, and the Settings app has a slightly different layout, but that's about it. The dark-mode icon colors look off to me. Also, the icon reshade only allows a dark icon background color, so it looks like the reshade is only available in dark mode.

You can hide the app icon text, which makes the icons slightly larger, but aesthetically it doesn't look right; the icon size relative to the screen size looks crude.

Also, the battery widget on the home and lock screens is broken: it will not allow the selection of different devices (AW, AP, etc.).

If you are sad that you can't test the iOS 18 beta, don't be. None of the AI features shown are available. Siri is the same old ignoramus Siri. Pretty much all you can do is give the icons a dark background or reshade them, and neither looks very refined.

I think the most useful things are the watchOS 11 "Vitals" and "Trends" updates, which are available. Also, the AW Modular 2 watch face has a new "Vitals" bezel option that looks like it has nodes; no idea what for.
 
Aren't Apple servers just stacks of Mac Minis?
No. Apple makes internal-use servers that power iCloud and their tens of millions of square feet of data center space. Apple doesn't buy servers, switches, or anything else from anyone. At their size, it's all in-house.
 
I love how Craig says these features are coming later; we've been around long enough to know that means iOS 19. o_O
Wrong. These are scheduled as part of iOS 18. Some parts will ship this fall when the OS updates ship, and then they do feature updates every couple of months as features become fully ready. The App Intents integration in Siri will come in one of those updates.
 
Yet the M1 with 11 TOPS, and the M2 with the same 15.8 TOPS as the A15, are somehow just fine? The 2 GB less RAM would make very little difference, since the Neural Engine does the vast majority of the processing.
RAM is everything in an LLM: context. The more you can load into memory, the more useful the AI. The RAM is why only the newest phones qualify, not compute.
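Some rough arithmetic shows why RAM is the constraint. Assuming a hypothetical 3B-parameter model quantized to 4 bits plus a KV cache (every figure here is invented for illustration, not Apple's actual model):

```python
def model_memory_gb(params_b=3.0, bits_per_weight=4, ctx_tokens=4096,
                    n_layers=32, d_model=2048, kv_bytes=2):
    """Rough on-device RAM footprint: quantized weights + KV cache.
    All parameters are illustrative placeholders."""
    weights = params_b * 1e9 * bits_per_weight / 8        # bytes for weights
    # K and V tensors cached per layer, per context token, in fp16.
    kv_cache = 2 * n_layers * ctx_tokens * d_model * kv_bytes
    return (weights + kv_cache) / 1e9

# Under these assumptions: ~1.5 GB for weights plus ~1.1 GB of KV
# cache, before the OS and every running app take their share.
# Tight on an 8 GB phone, hopeless on a 4 GB one.
```

Whatever the real numbers are, the shape of the argument holds: weights and context both live in RAM, so the memory floor, not TOPS, decides which devices qualify.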
 
Am I the only one surprised that App Intents are still a thing? If Siri is going to process all (or at least most) of my data on my device, why not just give it access to all my installed apps? Then I wouldn't have to wait for a developer to add a specific item before it can see my calendar, web browser history, Instagram messages, etc., and it could actually find all my information, not just what's in apps whose developers bothered to support it.
I just think this could end up like Netflix on the Apple TV, which won't participate in Up Next and the TV app, so it can't fully integrate and become a hub for my entertainment. I'd want the option to grant or revoke full permission, or allow specific App Intents where the developer has programmed them, so Siri is actually useful even when I'm not bound to Apple's first-party apps.
Without some kind of communication protocol, Siri would have to open the app and do everything the way you would: looking at the screen and tapping. Doable, but tricky. Say the thing you want is not a big button when you open the app but buried deep in a menu somewhere, or requires several other steps first: how is it going to figure that out? And how does it know it's actually doing it correctly and advancing toward the desired result? DeepMind can play complex strategy games by watching and interacting with the screen using reinforcement learning, but that basically takes trial and error. Do you want to sit there and watch Siri flail around?

Conceivably Apple could train Siri on all the major functions of popular apps and keep the model constantly updated, but that's still not ideal. Say you want Siri to notify you when something happens in an app: is it going to open the app, keep it open, and keep refreshing the screen? The better way is for Siri to have a way to just ask the app. In your Netflix example, Siri could watch you and remember your watch history. It could even put the right thumbnail in the hub. But it would not be an actual link to the movie, because it doesn't know the link. You get to watch it scroll around trying to find it. Or search for it, and maybe get a different movie with the same name. Or not know the movie has been pulled from the service.
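The "ask the app" idea can be sketched as a toy intent registry: apps declare callable actions up front, so the assistant invokes them directly instead of driving the UI. The names here (`register_intent`, `assistant_handle`, the deep link) are invented for illustration and are not Apple's actual App Intents API.

```python
# Toy intent registry: apps expose typed actions so an assistant can
# call them directly rather than scraping and tapping the screen.
INTENTS = {}

def register_intent(name):
    """Decorator an app would use to publish an action."""
    def wrap(fn):
        INTENTS[name] = fn
        return fn
    return wrap

@register_intent("play_title")
def play_title(title: str) -> str:
    # The app resolves the title in its own catalog, so the assistant
    # never guesses from thumbnails or stale screen state.
    catalog = {"The Crown": "deep-link://watch/42"}
    return catalog.get(title, "not-found")

def assistant_handle(intent: str, **params):
    if intent not in INTENTS:
        # The "Netflix problem": an app that opts out is invisible.
        return "app did not expose this action"
    return INTENTS[intent](**params)

assistant_handle("play_title", title="The Crown")  # → "deep-link://watch/42"
```

The trade-off in the thread is exactly this: a registry is reliable but only covers what developers publish, while screen-driving covers everything in principle but fails in all the ways described above.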
 
Oh, the irony of someone named High IQ who has repeatedly demonstrated a complete lack of knowledge.

The chip designs and layouts are done at Apple. Everything from the GPU to the NPU to the shared-memory architecture is Apple's work. If we're making analogies, they send the blueprint over to TSMC to "print" the wafers. TSMC is not a chip designer. Apple has some of the best silicon designers in the country and is among the only ones working at the 2nm process node and beyond.

It's not "here are some cores and here are some clock rates." That's like saying all an architect does is say "build me 3 bedrooms." Every facet of the silicon design, where the circuits go, how many they can pack in, the three-dimensional relationship of things, etc., is designed by Apple. How many cores a chip gets isn't designed; that's a result of binning. They design the chips, TSMC prints them, and when a die prints near the edge of a wafer with 8 working cores instead of 10, they sell it as a new model instead of throwing it away. There's virtually no difference between Intel's i5, i7, and i9 except how good the silicon yield is. No one "designs" for cores. Cores are a result of physics and yield.

You may be High IQ, but you are poorly informed. Fabs are not designers; they are manufacturers. Intel is going to make ARM chips for Nvidia, for example. Nvidia designed the chip and licensed the rights to the ARM instruction set; Intel is just "3D printing" the silicon.

And let's not forget, Apple basically bankrolls TSMC's process-node advancement, because hardly anyone else tries to be on the bleeding edge at 2nm and 3nm. The yields get so ****** the smaller you go that you are throwing away 40-50% of the silicon, and Apple is the only company that can commit that kind of money and throw half the product away. Everyone else waits for Apple/TSMC to perfect 3nm and then moves onto it while Apple is at 2nm or below.
It's probably best to just put that person on the ignore list. They repeatedly post the most ludicrous nonsense, acting like they know so much, while it is painfully obvious to those of us reading that they haven't got a clue. They never learn or come back with any cogent response. The name is the chef's kiss of irony.
 
It's Recall. They are indexing files on the system. How else are they able to pull your driver's license information from a photo you took to fill a form on a web page in Safari?
11:12 am: Apple Intelligence is grounded in your personal information, accessing data from around your apps and what's on your screen. Suppose a meeting is being rescheduled for the afternoon, and I'm wondering if it's going to prevent me from getting to my daughter's performance on time. It can understand who my daughter is, the details she emailed several days ago, the schedule of my meeting, and the traffic between my office and the theater.

11:22 am: Includes a semantic index of photos, calendar events, and files. Includes things like concert tickets, links your friends have shared, and more.

Recall is meant to be encrypted and run on-device, but the feature is similar. How safe it is on release is a different conversation, once security researchers get their hands on it. I wish people would stop being fanatics and see things as they are.

The Android ecosystem has been doing just that for some time now. Look at Google Pixel, for example, with the Gemini Nano AI models that run on Android devices.



Windows Copilot+ PC is a collaboration between Microsoft and silicon makers to create a tightly integrated experience for running AI models on device. You want disparate models that are good at different tasks if you want to do on-device processing. It is no different from what Apple is doing.

Indexing for search is not remotely the same as indexing for AI. Search is largely just looking things up in an index. When indexing for AI, you are filling attributes across billions of parameters: understanding that "Mom" = this person, and that answering "when does Mom's flight land?" means connecting what "Mom" is, who Mom is, where Mom sent her flight info (text? email? Facebook?), understanding that it's a flight, parsing out which flight it is, looking up flight data, and connecting it all back to you. Apple has published papers on their work shrinking 7B+ parameter models to run on device. They are doing H100-class semantic and contextual modeling and understanding on your phone.
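The contrast can be sketched with a toy example: a keyword index returns every message containing "flight," spam included, while an entity-linked index first resolves "Mom" to a contact and only then scopes the search. All data and names here are invented for illustration.

```python
# Toy contrast: keyword search vs. entity-linked lookup.
messages = [
    {"from": "carol@example.com",
     "text": "My flight UA 212 lands 6:40 pm Friday"},
    {"from": "deals@example.com",
     "text": "Flight sale! Book now"},
]
contacts = {"carol@example.com": {"name": "Carol", "relation": "Mom"}}

def keyword_search(word):
    """Plain index: every message containing the word, relevant or not."""
    return [m for m in messages if word.lower() in m["text"].lower()]

def moms_flight():
    """Entity-linked lookup: resolve 'Mom' to a contact first, then
    look for flight info only in messages from that person."""
    mom = next(addr for addr, c in contacts.items()
               if c["relation"] == "Mom")
    for m in messages:
        if m["from"] == mom and "flight" in m["text"].lower():
            return m["text"]
```

The first function is what a search index does; the second is the kind of entity resolution being described above, scaled down to a dozen lines.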
 
Nvidia is a good salesman. The more popular you are, the more you can sell and the more priority you get with capacity and quotes.

TSMC is the real genius. Nvidia isn’t doing anything except being a good salesman.

Apple, and I guess Nvidia now, are good at booking TSMC capacity and blocking competitors from getting access to the latest nodes. There isn't any engineering secret to what Nvidia or Apple is doing.

What Apple should be concerned about is whether this AI hype is real and Nvidia can start driving higher volumes than Apple. TSMC would start prioritizing Nvidia over Apple, meaning Nvidia could start making better chips than Apple by getting access to the best nodes first. An Nvidia CPU/SoC might be a real possibility in the future.

Wow, how can you be so dumb? Intel is making the CPUs for Nvidia, not TSMC. And Apple is a much, much larger customer than Nvidia could ever be. Do you have any fricking clue how many chips Apple makes in a year? Also, Apple is one of Nvidia's bigger customers; no one is cutting in line in front of them.
 
What a horrible review. You basically just rewrote their press release. I say this as an Apple employee: it's awful. Most employees think it's awful. Apple will NEVER do AI right given our emphasis on family-friendliness and privacy. The actual product is way worse than this. It's Siri 2.0.
You mean you and other Apple employees have already tested this feature behind the scenes? What were the awful parts in the tests you did? And why are family-friendliness and privacy a roadblock to Apple doing AI right?
 
I did not see anything about live transcription of audio in FaceTime calls. The demo says record and transcribe. Copilot has this, and it would have helped with video players providing subtitles/CC for any video that is playing.
 
No. Apple makes internal-use servers that power iCloud and their tens of millions of square feet of data center space. Apple doesn't buy servers, switches, or anything else from anyone. At their size, it's all in-house.
No, Apple's services and the iCloud system run on AWS and Azure. We have heard that they will be running Apple Private Cloud Compute on their own hardware with M2 chips and server software based on iOS. It is an interesting solution. I like to imagine them running on stacks of iPads. :)

edit: I guess my information was out of date. As little2v pointed out in another post, they have been setting up some of their own data centers in the last few years, so they may be trying to be less dependent on hosting services like AWS.
 
No, Apple's services and the iCloud system run on AWS and Azure. We have heard that they will be running Apple Private Cloud Compute on their own hardware with M2 chips and server software based on iOS. It is an interesting solution. I like to imagine them running on stacks of iPads. :)
Negative. Only one iCloud stamp, not even publicly accessible, is in AWS. No Azure anywhere at all. Some GCP. But 99.9% of iCloud is Apple data centers and always has been. Hundreds of exabytes and millions of servers.

AWS/GCP at Apple is for experimentation, play, GPUs, extra capacity, object storage, etc. iCloud data and your personal data are not there.

The Apple Silicon stuff is nothing new. They use it for Xcode Cloud, simulating Apple devices like iPads and iPhones for CI/CD and builds. Think 336 SoCs (not retail M2s or M4s) in a single rack.
 
No, Apple's services and the iCloud system run on AWS and Azure. We have heard that they will be running Apple Private Cloud Compute on their own hardware with M2 chips and server software based on iOS. It is an interesting solution. I like to imagine them running on stacks of iPads. :)

If you read Apple's environmental report, it's in there. Cliff-notes version: https://www.datacenterdynamics.com/...ed-more-than-23bn-kwh-of-electricity-in-2023/

Apple uses over a gigawatt of power across its seven campus facilities of a million-plus square feet each, plus dozens and dozens of colocations, and is among GCP's and AWS's largest customers, spending in the billions. They have a **** load of infra, beyond what anyone could imagine. They gave a talk at a Cassandra event saying they had over 300,000 servers just for Cassandra and over 36,000 servers just for FoundationDB. It's millions of servers, at the level of Google, AWS, and Facebook.
 
No, Apple's services and the iCloud system run on AWS and Azure. We have heard that they will be running Apple Private Cloud Compute on their own hardware with M2 chips and server software based on iOS. It is an interesting solution. I like to imagine them running on stacks of iPads. :)

If you're the nerdy type, they have let media into a few of them.




 