I agree - this isn’t complicated. It seems to me that the issue arises from Apple’s “what-if-ism” - what if you get divorced, what if one of the kids grows up and stops speaking to you, what if the dog dies etc etc, a million different versions of which will get them bad press: “Apple told me to go and pick up my dead child’s cancer medication!” Hence it falls into the Steve Jobs “we say NO to a thousand good ideas before we say yes to one”.

And also, living without it doesn’t really affect Apple’s bottom line. But yeah, I wish I had an AI assistant on my iPhone which would text my parents back with what I’m doing today and reply to the needless updates I’ve been getting since buying them smartphones.

Siri in general seems to be, for me at least, superfluous. The answer to most questions I ask is “I don’t know” or “I didn’t catch that” or “I can’t”. AI in general still raises major question marks for me, especially when it comes to the current stock market valuations.

This morning I was watching Bloomberg at the European open and noticed one of my stocks wasn’t really moving as usual, and the presenter then announced that the Nordic markets were closed today because of the Ascension Day public holiday. So I googled “is the Danish stock market open today?” and naturally Google’s AI was the top link, proudly announcing “Yes! The Danish market is open today, here are the hours yadda yadda”. I scrolled down and found the actual link to the market, and of course it showed the market is closed; it’s Ascension Day. So I asked the Google AI, “are you sure about that?”, and it thought again and found that “no, the Danish stock market is closed today. I apologise for telling you it was open without checking”. Honest to god, this is the tech that’s putting Nvidia at a $5.5 trillion valuation and keeping the market at all-time highs right now? A technology that makes even Google worse?

> a million different versions of which will get them bad press: “Apple told me to go and pick up my dead child’s cancer medication!”

This is a very tricky one.

>> Know that my son has a test on Thursday and hasn’t opened the revision material since Monday. A gentle nudge, not a surveillance report

This feels like a surveillance report to me. The extent to which adults should surveil their children's devices is hotly disputed. There's one faction that thinks total surveillance should be mandatory (as a solution to the age verification problem or otherwise), and another that believes children can and should have privacy (are you absolutely sure you should be monitoring your seventeen-year-old's conversation with their girlfriend?).

Not to mention that it's tracking a family member's interaction with a third party. We can pre-emptively assume the school knows about and approves of this one, at least.

> Track our medication schedule and ping people (or me, if someone misses a window) without turning into a clinical monitoring tool.

This feels like the sort of thing where you have weeks of meetings trying to work out whether HIPAA applies or not. It would definitely be valuable. It's also a problem if it's wrong, even if that's entirely down to user error. So people make do with the ad hoc version: general-purpose calendar entries.

(not to mention the period tracker use case: you want to be careful with technologies which provide an evidence trail that the government have announced they want to use against you)

> I agree - this isn’t complicated.

I disagree; it's complicated enough that even humans using paper calendars and talking to each other over dinner sometimes get this stuff wrong. It seems like it would be a nightmare to actually implement, using AI or not, and given the high error rates people are reporting in other threads here, I would have no faith in Apple's or Google's ability to actually do this.

AI is really bad at current events and the concept of "now." I have a Claude project for questions about my daughter (<1yo), and I have her date of birth in it, with the intention that new chats will be able to infer her age.

It will get it right most of the time, but sometimes it puts her as wildly younger than she is, and once it even said she wasn't born yet, so I should prepare for xyz.

The AI is one thing. But it's also about conversational assistants generally being largely superfluous for a lot of people including myself. I use the Alexa in my bedroom as an alarm clock but I'll have a backup for anything early and critical. I started using the Alexa in my kitchen as a timer mostly because I find the timer on my new range a bit clunky to operate. I'd actually rather use something with a visible countdown.

Siri is occasionally useful in the car if I'm by myself, but mostly in the better-than-nothing sense.

I think it's fair to say that a technology Amazon was, at one point, going to fill a building in downtown Boston to further develop has been extremely underwhelming.

> It seems to me that the issue arises from Apple’s “what-if-ism”... [divorce, estrangement, grief].

I don't think it is these PR issues that cause Apple such consternation. Partly because, even as someone who lives a personal life filled with such corner cases, I just don't think those are complex issues to solve; but mostly because Apple never seems to put much thought into corner cases like this anywhere else in its business, even when they don't butt up against the skewed demographics of software developers (such as how Cydia had much better handling of independent developers and joint projects than Apple's App Store still does 15 years later, and the what-ifs were often fascinating to handle).

In reality, the "what ifs" that Apple gets stuck on are lower level, and can even sometimes be spun in a sympathetic light: "what if a domestic abuser manages to automate so much of your software that they essentially have persistent spyware on your device" or "what if a user scripts something to the point where their phone doesn't work quite right and constantly needs tech support" or "what if people share so much of their content with someone else that they share private information without realizing"...

...but -- as is the case with their App Store restrictions that sometimes are reasonable but almost always are not -- the truth is their implied concerns are selfish: "what if a family only buys one iPad for their two or even three kids and we lose 10% of our hardware revenue" or "what if some college roommates declare themselves a family and start sharing purchases of movies and books" or "what if kids in high school (aka, 13+) can still agree to screen time limits they can't change and then don't spend as much time engaged with their phone".

It isn't just that Apple has merely not implemented some of the stuff in this article or doesn't understand what people want. Instead, as their business model (like almost all big tech business models) is inherently extractive and even a bit exploitative, their need to optimize for profit is a tradeoff against what people want. So they go out of their way (in ways that are sometimes ridiculous, such as how payments work for family sharing) to make some of these use cases so broken that it forces and/or misleads their customers into spending more (and sometimes a lot more) money to work around the otherwise-arbitrary limitations.