> If there is a glut of legal, AI-generated CSAM, then this provides a lot of deniability for criminal creators/spreaders who cause genuine harm, and it reduces the "vigilance" of prosecutors, too ("it's probably just AI-generated anyway...").
> You could make a multitude of arguments against that perspective, but at least there is a conclusive reason for legal restrictions.
I don't know about that. Would "I didn't know it was real" really count as a legal defense?
> I don't know about that. Would "I didn't know it was real" really count as a legal defense?
Absolutely: the prosecution would presumably need to at least show that you could have known the material was "genuine".

This could be a huge legal boon for prosecuted "direct customers" and for co-perpetrators who can only be linked via shared material.
I'm really not convinced. That sounds very idealistic to me; the "justice" system is far more brutal in real life.