> Since LLMs are unpredictable and vulnerable to injection attacks, customer service machines must also have limited power

Haha yes. I interacted with a bank's version of one. It was basically "press 5 for mortgages" but with a text-to-speech front end.

At the end of the day the LLM can be tricked into doing anything.
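That "limited power" idea can be made concrete: treat the model's output as an untrusted request and enforce authorization outside the LLM entirely. A minimal sketch (the tool names, `dispatch` function, and placeholder backends here are all hypothetical, not any real bank's API):

```python
# Sketch: the model's tool call is untrusted input. The allowlist and
# per-user scoping live outside the LLM, so a prompt-injected model
# still can't exceed the permissions of the session it runs in.

ALLOWED_TOOLS = {"check_balance", "list_branches"}  # read-only actions only

def dispatch(tool_call: dict, session_user: str) -> str:
    name = tool_call.get("name")
    if name not in ALLOWED_TOOLS:
        return f"refused: '{name}' is not an allowed tool"
    # Scope every call to the authenticated user, never to an
    # account id the model produced itself.
    if name == "check_balance":
        return f"balance for {session_user}: $123.45"  # placeholder backend
    return "branches: Main St, Oak Ave"  # placeholder backend

# Even if an injection convinces the model to request a transfer,
# the dispatcher refuses it:
print(dispatch({"name": "wire_transfer", "amount": 10_000}, "alice"))
# → refused: 'wire_transfer' is not an allowed tool
```

The point is that the model can be tricked into *asking* for anything, but the blast radius is whatever the dispatcher will actually execute.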

I wonder if there's also less of a stigma, and less of a sense of wrongdoing, about tricking an LLM versus tricking an employee.

We intuitively know that an employee may be punished, or even fired, if we trick them. Many of us won't try to trick human employees for that reason: we'd feel bad if our trickery had real consequences for them.

There is likely no such hesitation around tricking LLMs. I know I personally wouldn't feel bad about it at all, mostly because any computerized customer service process is annoying, so anything I can do to limit my time dealing with it is a win in my book.

I have no moral qualms about tricking an LLM, but I wouldn't be surprised if doing so gets called a CFAA violation in the near future.