How do you ensure 'safety' for kids talking to an LLM?

For starters, people who try to jailbreak the device get put on the naughty list.

And they get to deal with talking to Siri instead.

With 60 minutes of talk time included, I kind of get the impression this isn't designed so that you can hand it to your kid and let them spend the day talking to Santa. I'm assuming the idea is that they do this in lieu of writing to Santa, and you would supervise the experience.

Also, if your eight year old is trying to jailbreak Santa, you might have bigger issues to worry about.

> Also, if your eight year old is trying to jailbreak Santa

Yea, nah

The problem will be random, unsafe responses to the unpredictable things little children will say to Santa.

It says you can purchase additional minutes, so there is an edge case where kids use this too much.

I mean, if my kid were trying to jailbreak Santa at least half of me would be proud.