If you look at a lot of products out there that are basically what people want Siri to be (a truly integrated AI assistant), they either fall super flat (see the Rabbit R1) or are prohibitively expensive (see Motion AI).
Additionally, another thing I think people forget is that Apple has positioned itself, through its marketing, as the last bastion of privacy (I'm emphasizing marketing, not reality here), which is contradicted by AI's current public perception. There's also the issue of choosing whether to keep things in the cloud or on device.
One final major thing that I think Apple is considering: how could AI integration inadvertently cannibalize their app store profits? If they're able to offer AI integration that other calendar, todo, etc. apps can't, how will that affect a user's purchasing decision? If doing this would potentially eat into app store profits, then the next logical step is to create a system that allows developers to essentially have an API into the user's trained data, which is likely no small feat, especially if Apple is trying to obscure the data exposed to the app developer.
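To make that last point concrete: one could imagine (purely hypothetically, this is not any real Apple API) an intent-style interface where a third-party app declares which fields it needs and the system resolves them from the user's data on device, so the app sees answers rather than the underlying calendar or mail. A rough sketch of the shape of such an interface, with every name invented for illustration:

    # Purely hypothetical sketch of an "obscured" personal-data API: the app
    # declares which fields it needs, the system resolves them on device, and
    # the app only ever sees the resolved values, never the raw records.
    from dataclasses import dataclass

    @dataclass
    class ResolvedSlot:
        name: str
        value: str
        source: str  # e.g. "calendar" -- a label, not the record itself

    def resolve_intent(intent: str, slots: list[str]) -> list[ResolvedSlot]:
        # Stand-in for a system service; a real one would query private
        # on-device data. This fake store exists only for the sketch.
        fake_store = {"next_flight_departure": ("7:00 AM", "calendar")}
        return [
            ResolvedSlot(name=s, value=fake_store[s][0], source=fake_store[s][1])
            for s in slots
            if s in fake_store
        ]

    # A third-party to-do app asks when to schedule a "leave for the airport"
    # reminder without ever reading the user's calendar directly.
    for slot in resolve_intent("schedule_reminder", ["next_flight_departure"]):
        print(slot.name, slot.value, slot.source)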
It came out in the Epic Trial that 90% of App Store profit comes from in-app purchases in games.
Apple could care less about yet another TODO app.
Couldn't care less, I assume
https://www.merriam-webster.com/grammar/could-couldnt-care-l...
> Correct Usage: Either
Well, let's just say I'm not a descriptivist
Do you regularly use ChatGPT as a voice assistant?
Like, I kind of wonder what else I might want Siri to do that isn't hard. I would like it to occasionally summarize facts, like who was the president of France in 1975, and I might use it for that, but I wonder… does Apple make any money if they add that feature? Do they _want_ people using Siri? Yes, it is actually a valuable feature for me to be able to get directions while driving or play a song while cooking. Everything else has a weird cost-benefit ratio for Apple, I imagine. The things that would make you think the iPhone is substantively more valuable because Siri can do them are the hard ones.
It probably comes down to the fact that Apple likes to make money, and OpenAI isn’t quite as concerned.
Related:
Google shares rise on report of Apple using Gemini for Siri
https://www.cnbc.com/2025/08/22/google-shares-rise-on-report... (https://news.ycombinator.com/item?id=44994585)
It sounds like low-hanging fruit, but it's far from it.
You can make LLMs work _anecdotally_, but the brutal truth is they don't scale. Money-wise, but more importantly for Apple, accuracy- and reproducibility-wise.
Siri is dumb, but it's very predictably dumb, and it does those few primitive things it does reliably well, at zero cost to Apple.
90% of what I need Siri to do it does with fairly good accuracy.
Send a message, read messages, call X.
If you think Siri is bad... you haven't tried Amazon's Echo. It's as dumb as a box of rocks.
Voice interface is a paradigm shift. They are serious UX people, so they are most definitely thinking in terms of new UX. Slapping an LLM onto Siri is a high-school-level speech-to-text, text-to-LLM, LLM-text-to-speech pipeline (this is a good AI FizzBuzz). That's not what's holding them back; instead, it's most likely a brand new paradigm they're working on.
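For what it's worth, here's roughly what that FizzBuzz-level pipeline looks like wired up. This is a minimal sketch assuming the OpenAI Python SDK and its hosted transcription, chat, and TTS endpoints; the model names, voice, and file paths are placeholder assumptions, not anything Apple ships:

    # Minimal speech -> LLM -> speech turn, sketched with the OpenAI Python SDK.
    # Model names, voice, and file paths here are placeholder assumptions.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def voice_turn(audio_path: str, out_path: str = "reply.mp3") -> str:
        # 1. Speech to text: transcribe the user's recorded prompt.
        with open(audio_path, "rb") as f:
            transcript = client.audio.transcriptions.create(model="whisper-1", file=f)

        # 2. Text to LLM: get an answer to the transcribed question.
        completion = client.chat.completions.create(
            model="gpt-4o-mini",  # assumed model; any chat model fits the sketch
            messages=[{"role": "user", "content": transcript.text}],
        )
        reply = completion.choices[0].message.content

        # 3. LLM text to speech: synthesize the answer as audio.
        with client.audio.speech.with_streaming_response.create(
            model="tts-1", voice="alloy", input=reply
        ) as speech:
            speech.stream_to_file(out_path)
        return reply

Wiring those three calls together really is the easy part; everything around it (latency, interruption handling, and deciding what the assistant is actually allowed to do with the answer) is where the work is.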
People expect Siri to have deep integration with and knowledge of their data, not just a generic LLM. People will also hold Apple and Siri to a higher standard of actually being correct, which is another thing LLMs don't care about.
They could have rushed something out, but it would fall short in these areas. I'd rather they take their time and get it right. If people want a generic LLM that hallucinates a lot, there are plenty of apps for that.
If Siri were just a little better at bridging to what's already available through Shortcuts and AppleScript, it would be a nice improvement. That feature would give Siri a little upgrade and strengthen the app ecosystem's motivation to provide good scripting hooks.
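As a rough illustration of how small that bridge could be: macOS already ships the shortcuts command-line tool and osascript, so a hypothetical glue layer could route a parsed request onto an existing Shortcut or AppleScript hook. The Shortcut name and the toy intent matching below are assumptions made up for the sketch:

    # Hypothetical glue: route a parsed voice request to existing automation
    # via the stock macOS shortcuts and osascript command-line tools.
    # The Shortcut name and the toy intent matching are assumptions.
    import subprocess

    def run_shortcut(name: str) -> None:
        # "shortcuts run <name>" executes a user-defined Shortcut by name.
        subprocess.run(["shortcuts", "run", name], check=True)

    def run_applescript(script: str) -> str:
        # "osascript -e <script>" evaluates an AppleScript snippet.
        result = subprocess.run(
            ["osascript", "-e", script], check=True, capture_output=True, text=True
        )
        return result.stdout.strip()

    def handle(request: str) -> None:
        # A real bridge would let the assistant choose from declared hooks;
        # this toy matcher only shows the shape of the idea.
        if "focus" in request:
            run_shortcut("Start Work Focus")  # hypothetical Shortcut name
        elif "volume" in request:
            run_applescript("set volume output volume 30")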
Is Siri even in the same product category…?
Siri has access to all your personal data and has to be right. If you're going to ask it when you need to be at the airport for your flight and it says 7pm when it's really 7am, you are really screwed. If you're asking Copilot for an opinion about your not-so-hot take on Curtis Yarvin (https://news.ycombinator.com/item?id=44980305), it doesn't really matter if it makes a mistake of that magnitude.
Because they are losers now. Apple users are losers.