With humans we don’t need guarantees, because we have accountability and reputation. We also understand a lot about how and why humans make errors, so human errors make sense to us.
But LLMs routinely make errors that, if made by a human, would lead us to believe that person was utterly incompetent, acting in bad faith, or dangerously delusional. So we should never just shrug and say “nobody’s perfect.” I have to be responsible for what my product does.
Thanks for the link!