> I remember when CSAM meant actual children not computer graphics.

The "oh its photoshop" defence was an early one, which required the law to change in the uk to be "depictions" of children, so that people who talk about ebephiles don't have an out for creating/distributing illegal content.

There still needs to be sexual abuse depicted, no? Just naked kids should not be an issue, right?

If I found a folder with a hundred images of naked kids on your PC, I would report you to the authorities, regardless of what pose the kids are depicted in. So I guess the answer is no.

Under US law, it seems the definition of CSAM does not cover images of naked minors that don't show sexually explicit conduct: https://www.justice.gov/d9/2023-06/child_sexual_abuse_materi...

Pictures of naked kids intended for sexual gratification are illegal in most countries.

It's hard to know the intent of a picture in most cases. E.g. when I grew up there was a teen magazine (Bravo, with its Dr. Sommer section) that printed a picture of a naked adolescent of each sex in every issue. The intent was to educate teens and make them feel less ashamed. I bet there were people who used those pictures for sexual gratification. Should that have been a reason to ban them? I don't think so.

Educational nudity where the subject consented (possible for teenagers over 16 in Germany) and the publication complied with the law is not in the same category as CSAM or non-consensual sexual imagery. In the former, misuse by a minority doesn't automatically make the publication illegal. In the latter, the harm is intrinsic: a child cannot legally consent, and non-consensual sexual images are a direct rights violation.

Do you think the judge is stupid? The naked kid pics aren't printed in an anatomical textbook. Because they're AI hallucinations, I doubt they're even anatomically correct.