It's becoming clearer by the day that people are incapable of using LLMs responsibly, so if you hope to keep some quality and sanity in your project, the only sensible response is a total ban on LLM-generated contributions.