Siri is still built on the same ML-model-based system, one capable of learning, but from what this article says, the original engineers (pre-Apple?) created a poor design for the software and database, and Apple has had a hard time maintaining it.

There are two traditional models for speech recognition: command/control and dictation.

Command and control has a limited set of words and expected sentence structures, but as a result it has a much better shot at 'guessing' what you might want. It is like the old Palm Pilot: don't try to recognize cursive, get the user to jot things the way you want (Graffiti).

Dictation is freeform, but it has no clue what words mean or how sentences are structured. More flexible, but much more error-prone.

Siri is a hybrid of the two: a fixed dictionary, but a per-user one, with things like contacts influencing the acceptable commands, plus the ability to draw on giant databases such as every album title in Apple Music. It allows more flexibility in sentence structure and can switch to dictation mid-command, e.g. "text Steven that I'm on my way".
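Purely as a toy sketch of that hybrid idea (nothing like Apple's actual implementation; every name below is made up): a few fixed command templates whose slots are widened by per-user data, with a free-form dictation tail that takes over mid-command.

```swift
import Foundation

// Toy sketch only: fixed command templates, slots filled from per-user data
// (contacts) or a large catalogue (album titles), plus a dictation tail.
struct ParsedCommand {
    let intent: String
    let slot: String
    let dictation: String?
}

struct HybridParser {
    let contacts: [String]   // per-user vocabulary
    let albums: [String]     // big-database vocabulary

    func parse(_ utterance: String) -> ParsedCommand? {
        let lower = utterance.lowercased()

        // Command/control: "text <contact> that <free-form dictation>"
        for contact in contacts {
            let prefix = "text \(contact.lowercased()) that "
            if lower.hasPrefix(prefix) {
                let body = String(utterance.dropFirst(prefix.count))
                return ParsedCommand(intent: "sendMessage", slot: contact, dictation: body)
            }
        }

        // Command/control: "play <album>" matched against the catalogue
        if lower.hasPrefix("play ") {
            let rest = String(lower.dropFirst("play ".count))
            if let album = albums.first(where: { $0.lowercased() == rest }) {
                return ParsedCommand(intent: "playAlbum", slot: album, dictation: nil)
            }
        }

        return nil   // nothing in the fixed grammar matched
    }
}

let parser = HybridParser(contacts: ["Steven"], albums: ["Kind of Blue"])
if let cmd = parser.parse("Text Steven that I'm on my way") {
    print(cmd.intent, cmd.slot, cmd.dictation ?? "")   // sendMessage Steven I'm on my way
}
```

The real thing obviously does fuzzy matching over audio rather than exact string prefixes, but the shape is the same: a narrow grammar that knows what to expect, widened by your own data.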

It was pretty innovative at the time (the original iPhone-era technology demo from the startup that Apple acqui-hired).

Most of Siri's trouble is understanding speech (first) and command intent (second).

A chatbot interface doesn't help at all if the speech-to-text is busted.

An AI model for command/control speech-to-text would need actual training; the patterns Siri understands today would have to be relearned within that model by example, just to get back to what we have now. It is more useful for dictation, but it would need tons of initial training plus local training. How much would someone be willing to speak to their phone just to teach it how to understand them?
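To make "by example" concrete, here's a hypothetical illustration (not Apple's data or format) of the kind of labeled pairs such a model would have to be fed just to recover today's behavior:

```swift
// Hypothetical illustration only: intents learned from labeled utterances
// instead of being spelled out in a fixed grammar.
let trainingExamples: [(utterance: String, intent: String)] = [
    ("turn the lights on",                 "lights.on"),
    ("switch on the living room lamp",     "lights.on"),
    ("text steven that i'm running late",  "message.send"),
    ("play some miles davis",              "music.play"),
    // ...thousands more per intent, per language, plus on-device examples
    // to adapt to an individual speaker.
]
```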

The databases should not be that hard to update, but I do imagine Apple has hurdles for production changes, such as supporting a new feature in 20+ languages. The former-Apple-employee quote here seems a bit dated, though, since SiriKit lets third-party developers support incremental changes and the Siri recognition model can now be installed locally.
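On the locally installed model: if I'm remembering the API right, Apple's Speech framework has exposed an on-device option since iOS 13. A rough sketch (the function name and file URL are placeholders, and you'd also need to request speech-recognition authorization first):

```swift
import Speech

// Sketch: ask for a transcription that uses the locally installed model only.
func transcribeLocally(fileURL: URL) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else {
        print("No on-device model available for this locale")
        return
    }

    let request = SFSpeechURLRecognitionRequest(url: fileURL)
    request.requiresOnDeviceRecognition = true   // audio never leaves the device

    recognizer.recognitionTask(with: request) { result, error in
        if let result = result, result.isFinal {
            print(result.bestTranscription.formattedString)
        } else if let error = error {
            print("Recognition failed: \(error)")
        }
    }
}
```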
 
I read the command exactly as you wrote it and Siri responded by playing Miles Davis tracks that didn’t have vocals.
Well, HEY! It got smarter. I got so fed up that I gave up asking that kind of question a year ago. Thanks! I got about five songs in this time before hearing vocals. They've gotten better at figuring out which playlist to hook into for a request like this.
 
Me: "Hey Siri, turn the lights on"
Siri: "Sorry, you don't have that on your Apple Music playlist"
Oh come on. I have 13 smart bulbs or lamps plugged into smart outlets in my house and I use Siri to control them every day. Your example has NEVER happened to me.
 
Makes sense. I hope they also rename it - calling it Siri is borderline offensive to the people, most commonly in Scandinavia, who are actually named Siri. Using a relatively common real-world name as an activation phrase is unethical and disruptive to those people.
 
Apple should add way more languages to Siri first...

God I hope you were being sarcastic with that comment, because it's like saying that instead of fixing the faulty foundation, Apple should add more floors on top first 😆
 
At some point, you can't keep updating Mac OS 8 (later known as the classic Mac OS). You need a group of engineers to evaluate acquiring the AI equivalent of BeOS or NeXTSTEP and simply start over...

...except this time, Apple isn't on the brink of bankruptcy.

Yes...I'm that old...I remember.


...in other words, rebuild from the ground up and run current Siri in emulation mode, if necessary
True... rebuilding the internals to make an existing function better is normal. It is good to try newer approaches for voice-response software, and ChatGPT is a next step. I'm amazed how far along we already are; it took something like 30-40 years just to improve voice-to-text processing. I remember the first voice-response systems back in the '80s: they worked, but you had to over-articulate everything and use a good microphone.
Today's systems have improved so much that they are actively used for quick transcription/translation purposes. And Siri... well, it may not understand everything, but I still find it amazing that it can understand mumbling "turn off the light".


They should honestly just scrap her and license Cortana from Microsoft. Even back in the Windows Phone days, she was far more advanced and reliable than Siri is now.
Not really... Cortana may fit you best, but in general it's not better than the others. At this moment it still lacks support for many other languages. None of them even support multi-language use.


I actually wish we would get back to a point where big tech companies don’t all have to create their own software and hardware inside their walled gardens, but rather integrate technologies from other specialized companies. Since when is it a law of nature that Apple has to be good at everything?
...
Making amazing hardware and some really nice apps. It’s annoying that these tech behemoths just think about how to outdo one another with their proprietary tech while investing billions in stupid patent fights.
Well... they do. Some stuff is developed internally, some is bought/licensed from elsewhere. You may want to look at the legal pages inside an iPhone's General settings, for example.

The patent fights are a serious problem with the current patent laws (US, EU and other territories). That won't go away quickly though, since it is a huge money pit, especially in the US.
 
It is really sad and disappointing to see how Apple has squandered their lead in the market with Siri, which seems only to have gotten (much) worse with time! It's also embarrassing for them that a near-trillion-dollar global company cannot manage more than a few large/important projects at one time. AI has been coming for a long, long time. This is something they should not just now be reconsidering.

But maybe they have been working on this (AI) and have simply kept the work well hidden?
Indeed, Apple seemed to be so far ahead back then and then messed it up. My feeling is that the development of some technologies like AI just doesn’t work well with Apple’s culture of secrecy and isolation. I remember a few years ago there were reports that AI researchers just did not consider Apple a good place to work because they could not publish papers and have open exchanges with other researchers like they could elsewhere. Maybe as AI tech and practices mature, this will become less of a problem. I doubt, however, that they currently have a hidden AI-based product as capable as ChatGPT.
 
Employees were apparently briefed on Apple's large language model and other AI tools at the company's annual AI summit last month. Apple engineers, including members of the Siri team, have reportedly been testing language-generation concepts "every week" in response to the rise of chatbots like ChatGPT.
AI being used to improve something in my device? Now I'm listening...
 
Apple needs to fix Siri first before moving toward ChatGPT.

It can't be fixed. It needs to be rebuilt. As The New York Times article points out, the program wasn't really built to be intelligent, but rather around a giant, bloated database of potential words and terms that it searches through to determine results. This might have been a useful start when Siri was first imagined, but Apple (as well as Google and Amazon) sat on this method and piled more onto it rather than looking beyond what it was to what it needed to be.

This is a classic Cook way of looking at Apple product development...
 
I've heard from Apple engineers that they've found a true successor to Siri

Even more annoying and arguably even more useless

 
Dragon (by Nuance) finally stopped working for me, so I switched my workflow to Siri dictation (which only works live...I really miss being able to load an audio file for transcription).

Anyway, Siri is almost as good as Dragon used to be. But then I watched the live transcription of a Teams meeting with 15 people and I was blown away by the accuracy - different voices, different audio quality...and holy s**t!

Then I found out that Microsoft bought Nuance...so it all makes sense. Nuance wasn't worth all those billions to Apple, so they didn't acquire them. But with Teams integration, it was worth it for Microsoft.

I hope there's another acquisition Apple can make that can give the Siri ecosystem a boost.
 
As a tech person myself, I just cringe at the words ..."cumbersome design"...
As much as I love AAPL, ChatGPT is a tech disruptor giving Apple et al. a good spanking
 
ChatGPT’s own response is exactly what I was saying: ChatGPT on its own is not an assistant; it would need to be developed into one. So the poster I was responding to was barking up the wrong tree.

Noam is THE guy that made modern linguistics what it is. ChatGPT AS A LANGUAGE MODEL wouldn’t exist without his work. It’s not an AI, it is a language model with a user interface on the front end.
All ChatGPT needs to become an assistant are the hooks into the environment. At its core, it's everything our current assistants are and so much more.
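To be concrete about "hooks into the environment", here is a purely hypothetical sketch (every type and action name below is invented; no real ChatGPT or Apple API is used): the model is only asked to emit a structured action, and a thin dispatcher is what actually touches the device.

```swift
import Foundation

// Hypothetical sketch: the model's reply is treated as data describing an
// action, and this dispatcher maps it onto device "hooks".
struct AssistantAction: Codable {
    let action: String       // e.g. "play_music", "set_lights"
    let argument: String     // e.g. "Kind of Blue", "off"
}

protocol DeviceHooks {
    func playMusic(_ query: String)
    func setLights(_ state: String)
}

struct ConsoleHooks: DeviceHooks {
    func playMusic(_ query: String) { print("playing \(query)") }
    func setLights(_ state: String) { print("lights \(state)") }
}

func dispatch(modelReply: Data, hooks: DeviceHooks) {
    guard let action = try? JSONDecoder().decode(AssistantAction.self, from: modelReply) else {
        print("Could not parse the model's reply")
        return
    }
    switch action.action {
    case "play_music": hooks.playMusic(action.argument)
    case "set_lights": hooks.setLights(action.argument)
    default:           print("Unknown action: \(action.action)")
    }
}

// Example: pretend the model answered with this JSON.
let reply = #"{"action": "set_lights", "argument": "off"}"#.data(using: .utf8)!
dispatch(modelReply: reply, hooks: ConsoleHooks())
```

The interesting part isn't the plumbing; it's that the language model supplies the understanding and the hooks supply the ability to act, which is exactly what the current assistants already bolt together, just with a far weaker language layer.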
 
The bold part was my point. ChatGPT can be used in nearly limitless scenarios…if developed into them.

A user asking Siri to play a song today does it (in the best case scenario).

ChatGPT asked to do that today won’t…because it’s not integrated into the systems that would do it.

So when a poster says “I bet ChatGPT can handle this request for music better than Siri”, they’re just wrong, because right now ChatGPT isn’t incorporated into any music player, aside from some experiments people have done on their own. ChatGPT, today, is not a voice assistant. That’s a plain fact, not a comment on what it will inevitably do, but on what it does today.
You're really dealing in semantics now. Before Siri was hooked up to anything, it was just a rule-based AI with the same limitations.
 
My entire point in a single sentence.

People are acting like its output is correct because it’s phrased in a confident way.

Anyone with access to it: can we get an output for the prompt “explain to me why the earth is flat”? I’m genuinely curious about its response when you demand that it “prove” a falsehood.
You have created this straw man that people are acting a certain way. It's not much different from those who oppose Wikipedia because it can be edited. No source of information should be considered definitive for any purpose. But ignoring a repository that can lead you there is a waste of time.

What I see happening is that people assume ChatGPT has some crazy level of inaccuracy. It will only have that if you put it into that position...and most are basing this on Microsoft's Bing bot, which has its own unique behaviors.
 
The phrase "working on" in corporate double speak can have various meanings depending on the context, but it generally implies that the company is actively engaged in a project or initiative. However, it may also suggest that the project is in its early stages, and there are no concrete plans or timelines in place yet.

In some cases, "working on" can be a vague and non-committal way for a company to acknowledge a problem or issue without making any promises or guarantees about a solution. It can also indicate that the company is exploring different options or considering various approaches before making a decision.

Overall, the phrase "working on" can be interpreted as a way for a company to signal that it is taking action, without committing to any specific outcomes or timelines. It is important to note that this phrase should be evaluated in the context of the specific company and situation, as it can have different connotations depending on the context in which it is used.

From your AI overlord, ChatGPT.
The phrase "working on" is a positive spin on what was originally a negative evaluation by the NYTimes: How Siri, Alexa and Google Assistant Lost the A.I. Race.

From a human who can understand intent.
 