One of my favorite episodes:
I set 2 timers for the same thing somehow. I then tried to cancel one of them.
>“Siri, cancel the second timer”
“You have 2 timers running, would you like me to cancel one of them?”
>“Yes”
“Yes is an English rock band from the 70s…”
>“Siri, please cancel the timer with 2 minutes and 10 seconds on it”
“Would you like me to cancel the timer with 2 minutes and 8 seconds on it?”
>“Yes”
“Yes is an English rock band from the 70s…”
Eventually they both rang and she listened when I said stop.
My favorite is when I ask Siri to set a timer and get back "there are no timers running."
My other favorite is when I ask Siri to set a timer on my watch and it does a web search.
My favourite is when I ask Siri to stop the alarm (the one currently going off) and it decides to disable my morning wake-up alarm but keep the current alarm ringing.
“Siri stop”
“There’s nothing to stop”
> me, suddenly aware of how the AI takeover will happen
At that point I would be very impressed if you could remember what the timers are for.
> "Stop" is a song by English girl group the Spice Girls from their second studio album, Spiceworld (1997).
Helping my kid get ready for a shower, I had this exchange:
Me: "Text Jane Would you mind dropping down the robe and underpants"
Siri: Sends Jane "Would you mind dropping down"
Me: rolls eyes "Text Jane robe and underpants"
Siri: "I don't see a Jane Robe in your contacts."
Me: wishes I could drown Siri in the bathtub
It's wild to me that Apple pretty much solved the actual speech-to-text part more than half a decade ago, yet in 2026 still struggles to turn streams of very simple, correctly transcribed text into intents in ways that even a local model can figure out. Siri is good STT plus a bunch of serviceable APIs that can control lots of stuff, with the digital equivalent of a brain-damaged cat sitting at the center of it, guaranteeing the worst possible experience.
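To illustrate the point: even a crude rule-based parser (my own toy sketch, nothing to do with Apple's actual pipeline) can map the correctly transcribed utterances from this thread onto the right intents, which is exactly the step Siri keeps fumbling.

```python
# Toy intent parser: a few hand-written rules mapping already-transcribed
# text to intents. All names here are illustrative, not any real API.
import re
from dataclasses import dataclass, field

@dataclass
class Intent:
    name: str
    slots: dict = field(default_factory=dict)

def parse(utterance: str) -> Intent:
    text = utterance.lower().strip()
    # "cancel the timer with 2 minutes and 10 seconds on it"
    if m := re.search(r"cancel the timer with (\d+) minutes? and (\d+) seconds?", text):
        return Intent("cancel_timer", {"remaining_s": int(m[1]) * 60 + int(m[2])})
    if "cancel" in text and "timer" in text:
        return Intent("cancel_timer")
    if "set a timer" in text:
        return Intent("set_timer")
    # A bare "yes" is a confirmation, not a question about a rock band.
    if text in {"yes", "yeah"}:
        return Intent("confirm")
    if text in {"stop", "siri stop", "siri, stop"}:
        return Intent("stop_alert")
    # Only fall back to web search when nothing else matched.
    return Intent("web_search", {"query": utterance})

print(parse("Siri, please cancel the timer with 2 minutes and 10 seconds on it"))
```

Real assistants face genuinely hard problems (dialogue state, slot disambiguation), but the failures quoted above all live in the easy, unambiguous cases this sketch covers.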