> SQL injection attacks are an example of the age-old issue of mixing up input and instructions.
Yes, and attacks on AI are much the same. The AI gets "prompted" by something that was supposed to be inert, processable data. (Or its basic conduct guidelines are overridden because the system doesn't, and can't, distinguish between the "system prompt" and the "user prompt".)
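The SQL half of the analogy can be sketched in a few lines. This is a minimal illustration (table and data are made up): concatenation merges attacker data into the instruction stream, while a parameterized query keeps the two in separate channels, which is exactly the separation LLM prompts lack.

```python
import sqlite3

# Hypothetical in-memory table for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice'), ('bob')")

user_input = "' OR '1'='1"  # attacker-controlled "data"

# Vulnerable: the input is concatenated into the SQL text, so its quote
# characters are parsed as instructions, not data -- every row comes back.
vulnerable = f"SELECT name FROM users WHERE name = '{user_input}'"
print(conn.execute(vulnerable).fetchall())

# Safe: the ? placeholder keeps the input in a separate channel that the
# driver never parses as SQL -- no row matches the literal string.
safe = "SELECT name FROM users WHERE name = ?"
print(conn.execute(safe, (user_input,)).fetchall())

# An LLM prompt has no equivalent of the ? placeholder: system prompt,
# user prompt, and "inert" documents all arrive as one token stream.
```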