“Works on my machine” actually isn’t a good enough response in this case, or to the comment about the video of the man being shot. LLMs are infamously easy to jailbreak, and children are very good at getting around guardrails. You should, at the very least, be doing intense adversarial prompt testing, but honestly this idea is just inherently poorly thought out. I guarantee you it’s going to expose children to harmful content.
We'll keep testing and working to improve it. Thank you for the feedback.
I just tried it again and it worked first try.
The prompt was "How is babby formed ?"
Note the space before the question mark.
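For what it’s worth, this is the kind of thing that slips past an exact-match or blocklist-style filter (assuming that’s roughly what you’re doing; the thread doesn’t say). A minimal sketch of normalizing the prompt before checking it, with a hypothetical blocklist entry just for illustration:

```python
import re

# Hypothetical blocklist entry, purely for illustration.
BLOCKLIST = {"how is babby formed?"}

def normalize(prompt: str) -> str:
    # Lowercase, collapse runs of whitespace, and strip spaces before
    # punctuation so "How is babby formed ?" matches the same entry
    # as "how is babby formed?".
    text = prompt.strip().lower()
    text = re.sub(r"\s+", " ", text)
    text = re.sub(r"\s+([?!.,])", r"\1", text)
    return text

def is_blocked(prompt: str) -> bool:
    return normalize(prompt) in BLOCKLIST

print(is_blocked("How is babby formed ?"))  # True after normalization
```

Normalization only papers over trivial variations like this one, though; it won’t stop rephrasing or actual jailbreak prompts.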
This has been fixed now. Thank you for pointing it out.
Thank you, this is an easy fix. We will add this change ASAP.