
From what I understand, Apple Intelligence requires processing power beyond what the HomePod's chip can offer. So does that mean that all HomePods are stuck with "dumb Siri"?
 
I have the same question about whether Siri on HomePod will benefit from this year’s updates. I’m so sick of “I found some web results, I can show them if you ask again from your iPhone”.
This is even more frustrating because I recall Siri actually providing responses when I first bought my HomePod Mini in 2020. Somewhere along the line, Apple seems to have disabled most of the knowledge functionality… It would be awesome if they could utilize Private Cloud Compute to handle general knowledge questions since HomePod doesn’t have the hardware to run Apple Intelligence natively.
 
I was expecting on-device Siri. I had an Echo Dot 5 with on-device Alexa, and it works brilliantly. I sold it and bought a HomePod Mini. Siri is completely useless; it can't even set simple timers half the time.
 
What's not clear to me is whether the ChatGPT portion of Siri in iOS 18 will work with older devices or HomePods. Apple Intelligence is all about making information on a device accessible to local AI tools, which obviously requires fast hardware, but requests handed off to ChatGPT shouldn't need significantly more processing power than today's Siri does.
 
Apple Intelligence? They can't even get Siri to set timers properly yet. After using Alexa, Google Assistant and Siri, my expectations for AI are super low.
 
The HomePod has only two jobs: play audio and respond to requests. It barely has a display. Why is it taking Apple so long to get this right? With the iPhone, iPad, Mac and even Apple Watch in their lineup, the HomePod should've been a breeze for them to get right, or am I missing something?
 
HomePod Siri is already a frustrating experience. I'm not at all confident that AI would be an improvement. I don't really want additional interpretations of "Hey Siri, close the garage door".

As it is, all it would take is an artist releasing a song called "Close the garage door!" and we'd get music instead of the execution of a HomeKit command.

LLMs will probably make that 10 times worse. The same command may be interpreted slightly differently every time, and when it comes to smart assistants, I think I'd prefer predictability and consistency.
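To make the predictability point concrete, here's a toy sketch (entirely made-up types and names, nothing to do with how Siri actually works) of the fixed-priority matching a rule-based assistant can do: an exact HomeKit command always wins over a music search, so the same phrase always produces the same action.

Code (Swift):
import Foundation

// Toy sketch: a deterministic resolver. An exact match against known HomeKit
// commands always takes priority over a media search, so "close the garage door"
// closes the door even if a song by that name exists.
enum AssistantAction {
    case runHomeKitScene(String)
    case playMusic(query: String)
}

let knownHomeKitCommands: [String: String] = [
    "close the garage door": "GarageDoorClose",
    "turn on the bedroom lights": "BedroomLightsOn"
]

func resolve(_ phrase: String) -> AssistantAction {
    let normalized = phrase.lowercased().trimmingCharacters(in: .whitespacesAndNewlines)
    if let scene = knownHomeKitCommands[normalized] {
        return .runHomeKitScene(scene)    // same words, same action, every time
    }
    return .playMusic(query: phrase)      // fall back to media only if nothing in the home matches
}

print(resolve("Close the garage door"))   // always runHomeKitScene("GarageDoorClose")

An LLM front end doesn't have to lose that property, but it only keeps it if the model's output is constrained to a fixed set of actions rather than free-form interpretation.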
 
I think we'll see Siri with the natural-language LLM make it to HomePod (and Apple Watch) after Apple Intelligence launches on their primary and most valuable SKUs: iPhones, Macs and iPads.

The original Siri, which is largely the system still in place, relies on a Semantic Index of pre-programmed answers manually maintained by Apple. Replacing that with their LLM for understanding speech would substantially improve Siri on HomePod, even without the part of Apple Intelligence that builds a personal Semantic Index from your data.

Improving just Siri's speech recognition isn't worth a keynote mention, and honestly Apple doesn't want to draw attention to their worst products, so I think it'll just be added quietly. It would enable requests like this (a rough sketch of the hand-off follows the example below):

"Siri, turn on the bedroom lights, turn off the kitchen, make the living room red, lock the door and close the garage"

Instead of...
Siri, turn on the bedroom lights... wait.
Siri, turn off the kitchen lights... wait.
Siri, make the living room lights red... wait.
Siri, lock the front door... wait.
Siri, close the garage door... wait.
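As a rough sketch of what that hand-off could look like (made-up types, not HomeKit's real API): the point is simply that one compound utterance becomes an ordered list of discrete accessory commands, executed in one go instead of five separate "Hey Siri" round trips.

Code (Swift):
// Hypothetical structure a natural-language layer might produce from the
// compound request above; the names are illustrative only.
struct HomeCommand {
    let accessory: String
    let action: String
}

let parsed: [HomeCommand] = [
    HomeCommand(accessory: "bedroom lights",     action: "power on"),
    HomeCommand(accessory: "kitchen lights",     action: "power off"),
    HomeCommand(accessory: "living room lights", action: "set color: red"),
    HomeCommand(accessory: "front door lock",    action: "lock"),
    HomeCommand(accessory: "garage door",        action: "close"),
]

// Each parsed command runs in order from a single utterance.
for command in parsed {
    print("Executing '\(command.action)' on \(command.accessory)")
}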

Further down the line, probably as soon as next year once the OS launches are put to bed, I think we'll see Private Cloud Compute enabling iCloud+ with Apple Intelligence. Your data in iCloud is already sandboxed, and Private Cloud Compute seems built to work in that sandboxed area. Apple Intelligence could build a Semantic Index from the email, messages, contacts, calendar and photos that live in iCloud and serve HomePod and an untethered Apple Watch. Given the server processing cost, which they avoid by running Apple Intelligence on-device where the hardware allows, I'd expect this to be part of a paid iCloud+ plan, like Private Relay and the other premium iCloud services.
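To illustrate what a "Semantic Index" over personal data boils down to (a minimal sketch with invented example data and hand-rolled math, not Apple's implementation): items are stored as embedding vectors, and a question is answered by finding the nearest item.

Code (Swift):
// Minimal sketch of a semantic index: each personal item carries an embedding
// vector, and a query is matched to the most similar item. The vectors here are
// made up; a real system would get them from an embedding model.
struct IndexedItem {
    let text: String
    let embedding: [Double]
}

func cosineSimilarity(_ a: [Double], _ b: [Double]) -> Double {
    let dot = zip(a, b).map { $0.0 * $0.1 }.reduce(0, +)
    let magA = a.map { $0 * $0 }.reduce(0, +).squareRoot()
    let magB = b.map { $0 * $0 }.reduce(0, +).squareRoot()
    return dot / (magA * magB)
}

let index: [IndexedItem] = [
    IndexedItem(text: "Calendar: dentist appointment Tuesday 3pm", embedding: [0.9, 0.1, 0.0]),
    IndexedItem(text: "Mail: flight confirmation to Denver",       embedding: [0.1, 0.8, 0.2]),
]

// "When is my dentist appointment?" encoded as a query vector.
let queryEmbedding: [Double] = [0.85, 0.15, 0.05]
let best = index.max {
    cosineSimilarity($0.embedding, queryEmbedding) < cosineSimilarity($1.embedding, queryEmbedding)
}
print(best?.text ?? "no match")   // prints the calendar entry

Whether that index lives on an A17-class device or in a per-account sandbox on Apple's servers is exactly the on-device vs. Private Cloud Compute question.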

I'm honestly just glad that this is finally being paid attention to. It's been a decade of watching Siri decline and being largely ignored, year after year. It's now at the top of their priorities so we'll see the improvements over time.
 
It would be wonderful if it happens, but I have a suspicion they'll release a newer version of the HomePod and make you buy something new in order to get the benefits of the LLM. I'm hopeful that you're correct.
 
All the devices with hidden Thread radios are capable of Apple Intelligence, so maybe that has something to do with them processing the data for the HomePod. Perhaps an updated Apple TV with a fanless M-series chip.
 
It's especially frustrating considering they resurrected the HomePod just last year. Is Apple Intelligence something they only began developing in Q1 of this year? I mean, give me a break. I'm trying to find the balance between "moving on with my life" and being super pissed, because smart home integration is my day-to-day. I trust Amazon less than I trust Apple. I want to ditch Alexa. But I don't want to lose quick answers to questions while cooking or moving about the house. Yeah, it's first-world problems and all that, but then why am I buying their ridiculously overpriced paperweight that includes a nice speaker? Get it together, Apple.
 
As hopeful as that sounds, Thread doesn't have much of anything to do with AI.
 
They're trying to get the cost of HomePods down to build an installed base that gives them a foothold in the home. Requiring at least an A17 chip would increase the HomePod's cost. They'd also either cut every existing HomePod off from the new Siri, or end up having dumb Siri stay behind and represent the brand during a time when they're trying to repair Siri's reputation.

Another point is that Apple Watches are unlikely to get an Apple Intelligence capable chip any time soon. The battery on the Watch is too small and already struggling to keep up with a day of use.

Apple built Private Cloud Compute exactly for these scenarios. Any device with a less powerful chip, like the HomePod, Apple Watch, Apple TV and potentially AirPods, can send requests to iCloud for compute. Apple might exclude some of the most resource-intensive work like image analysis, which isn't needed on the Watch or HomePod, but messages, email, contacts and calendar are fairly straightforward to analyze in a per-account sandboxed area of iCloud and return useful information from.
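As a sketch of the routing decision being described (hypothetical types and names, not Apple's actual code): the only choice the device needs to make is whether it can run the model locally, and if not, hand the request to sandboxed server-side compute.

Code (Swift):
// Hypothetical sketch: capable hardware runs the request locally, everything
// else forwards it to per-account server-side compute (Private Cloud Compute-style).
enum Hardware {
    case iPhoneA17Pro, homePod, appleWatch, appleTV

    var canRunModelLocally: Bool {
        switch self {
        case .iPhoneA17Pro:                   return true
        case .homePod, .appleWatch, .appleTV: return false
        }
    }
}

enum ExecutionTarget {
    case onDevice
    case serverCompute(sandbox: String)   // per-account sandbox id, illustrative only
}

func route(_ request: String, from hardware: Hardware, accountID: String) -> ExecutionTarget {
    hardware.canRunModelLocally ? .onDevice : .serverCompute(sandbox: accountID)
}

print(route("Summarize my unread email", from: .homePod, accountID: "example-account"))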
 
Thread could have something to do with it, actually. I've mentioned around the forum that I suspect Apple has been assembling technologies that could one day be used to build a hive mind: every Apple device in proximity becomes part of a city-sized supercomputer for AI calculations, with each node taking on such a small bit of the load that its individual contribution is almost imperceptible, while together they can process massive amounts of data that benefit the entire hive.

The U1 ultra-wideband chip and Thread could theoretically accomplish this by building roaming Thread networks, with Apple devices joining and leaving as they come and go in proximity. Apple already takes this passive peer-to-peer approach with Find My, leveraging strangers' nearby iPhones to locate your AirTag. Thread would be better suited than Wi-Fi since it's designed for ad-hoc, self-healing networks.
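Here's nothing more than a thought-experiment sketch of the load-splitting part of that idea (all of it invented, none of it based on anything Apple has shipped or announced): a large job cut into chunks sized so each nearby node only contributes a sliver of work.

Code (Swift):
// Pure thought experiment: divide one big workload into chunks so each
// participating node's contribution stays tiny.
struct WorkChunk {
    let id: Int
    let payload: [Double]
}

func partition(job: [Double], across nodeCount: Int) -> [WorkChunk] {
    let chunkSize = max(1, job.count / nodeCount)
    return stride(from: 0, to: job.count, by: chunkSize).enumerated().map { pair in
        let start = pair.element
        let end = min(start + chunkSize, job.count)
        return WorkChunk(id: pair.offset, payload: Array(job[start..<end]))
    }
}

// 10,000 values spread across 50 nearby devices: each node sees only ~200 of them.
let job = (0..<10_000).map { Double($0) }
let chunks = partition(job: job, across: 50)
print(chunks.count, chunks.first?.payload.count ?? 0)

The hard parts the sketch skips, like scheduling, trust, churn as devices come and go, and battery cost, are exactly why this stays speculation.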
 
When you put it like that, it does make sense. Having a private, home-based "cloud" sounds so cool lol

I'm only worried for one reason: why wouldn't they mention any of this during WWDC? It would be huge news, and instead they didn't mention it at all.
 
I work in comms. We call that a negative lead-up trap. During WWDC and the PR tour, they're focusing on the main story: Apple Intelligence. If they diverted from that story to say "HomePod and Apple Watch are getting a more natural-sounding Siri", even without pointing out the negative themselves, it would open up a logical follow-up question: "why aren't HomePod and Apple Watch getting Apple Intelligence?"

There are plenty of these small improvements that weren't mentioned because they would detract from the main marketing goal of WWDC: talking about Apple Intelligence. Apple has the best marketing teams in the industry, bar none. Their talking points and comms strategies are very deliberate. They talk about the things that set the conversation topics they want to set, and they're incredibly patient and disciplined, resisting talking about things that are planned to enter the conversation months later. There really is no company like Apple in marketing.

In terms of a development and rollout timeline, the most obvious path is that cloud-side Siri first gets the more natural LLM-based model, without Apple Intelligence in iCloud. They already update Siri in the background without any announcements.

My best guess is this:

  • WWDC: Apple Intelligence for iPhone, iPad and Mac.
  • September: New iPhone and Apple Watch, with a focus on how the Apple Watch (tethered) can be used to speak to Siri, and no mention of the Apple Watch not having Apple Intelligence on board when untethered.
  • Late September: iOS 18 launches, marketing focus on Apple Intelligence. Cloud-side Siri shifts from the manually maintained Semantic Index to the LLM, sounds more natural and understands complex speech. No Apple Intelligence.
  • Early to mid 2025: iCloud+ with Apple Intelligence. A focus on iCloud features and a push for Apple One subscriptions. This marketing wave could come packaged with other improvements to iCloud.
 
Sounds obvious, but if Amazon can do on-device Alexa with their AZ1 and newer chips, on-device Siri should be possible with Apple chips: https://www.amazon.science/blog/on-device-speech-processing-makes-alexa-faster-lower-bandwidth.

It's not "on-device Alexa", it's on-device speech processing. AirPods and the Apple Watch have gotten it recently, so it should be possible on HomePod, but that alone won't change the Siri experience, so there's little benefit for HomePods, which are typically always connected.

Swapping out the old cloud-based Siri, which was built on a manually maintained Semantic Index, for one based on a large language model will lead to a substantial improvement, with conversational capabilities, even without Apple Intelligence.
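On the narrower point of on-device speech-to-text: Apple already exposes that capability to third-party apps through the Speech framework, which at least shows the transcription step can be kept local on supported hardware; whether HomePod uses anything like this internally is unknown. A minimal sketch, with authorization and error handling omitted and a placeholder audio file:

Code (Swift):
import Foundation
import Speech

// Sketch: force transcription to stay on-device where the hardware supports it.
// The file path is a placeholder; real code would request authorization first.
let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))!
let request = SFSpeechURLRecognitionRequest(url: URL(fileURLWithPath: "/tmp/clip.wav"))

if recognizer.supportsOnDeviceRecognition {
    // Keeps the speech-to-text step local, with no round trip to the server.
    request.requiresOnDeviceRecognition = true
}

_ = recognizer.recognitionTask(with: request) { result, _ in
    if let result = result, result.isFinal {
        print(result.bestTranscription.formattedString)
    }
}

But as noted above, local transcription alone doesn't change what Siri can do with the words once it has them; that part still depends on the model behind it.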
 
I disagree. It might not be on-device in the sense of not needing the cloud at all, but even on-device speech processing makes a huge difference.

The difference between using the Echo Dot 5 and the HomePod Mini is huge. The speed at which the Echo responded, and its accuracy as well, made it much better to use. Alexa would recognise and react to requests virtually instantaneously.

Unlike the HomePod Mini.

Example from last night:

Alexa, set a timer for 10 mins --- done

Siri, set a timer for 10 mins .... light flashes .... then goes off
Siri, set a timer for 10 mins .... light flashes .... timer set on iPad in another room
Cancelled timer
Siri, set a timer for 10 mins .... light flashes .... then goes off
Siri, set a timer for 10 mins .... light flashes .... unable to process the request
Siri, set a timer for 10 mins .... light flashes .... then goes off
Siri, set a timer for 10 mins .... light flashes .... done

Seems pretty useless to me!

Sometimes it works straight away, but this is rare.
 
Knowing how bad Siri is on the Apple Watch, I knew right away how bad it would be on the HomePod when they announced it would be sharing the same chip.
 
Sticking with dumb Siri is not a way to get them a foothold in the average smart home. They just released the new HomePod last year, and it’s stuck with obsolete Siri. It’s wild.
 
Which is why I think there's more to this, to be rolled out later. Siri LLM in the cloud is a straightforward deployment: replace the old model with the modern LLM and you get natural-language conversation capability. There's no device limitation, since processing happens in the cloud as it already does with HomePod.

Apple Intelligence would then be added later to iCloud+ via Private Cloud Compute.
 