> Most abuse happens from people known to the child

Sure, but it also means that at least some abuse happens from people unknown to the child. If ChatControl doesn't cause any problem but helps prevent those abuses, then it's worth it. The question is: what are the problems caused by ChatControl?

Saying "only a minority of children get abused this way, so it's not worth it" won't go far, IMO. It's not a valid argument against ChatControl in itself.

> presumably the perpetrators will simply move to other ways of communicating.

The perpetrators have to contact kids over apps that the kids use. Like Snapchat or TikTok. It's not like the kids will routinely install a weird app to talk to weird people...

> parents _want_ their kids to have access to encrypted communications.

But ChatControl doesn't remove the encryption! It scans everything locally, before it gets encrypted and sent.

> by criminalizing encryption

It's not criminalizing encryption: it's forcing a local scan on your device, before anything is encrypted. Just like the scans that already happen on cloud services for non-E2EE data.
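To make the "local scan, not broken encryption" distinction concrete, here is a deliberately simplified sketch of how client-side scanning could sit in a messaging pipeline. All names here are hypothetical, and it uses a plain SHA-256 digest set as the blocklist; real proposals involve perceptual hashes or classifiers, but the structural point is the same: the check runs on the plaintext before the E2EE step, so the encryption itself is untouched.

```python
import hashlib

# Hypothetical blocklist: opaque digests distributed to the device.
# The underlying material is never shipped, which is exactly why
# such a list is hard to audit from the outside.
BLOCKLIST = {
    hashlib.sha256(b"known-illegal-sample").hexdigest(),
}

def scan_before_encrypt(payload: bytes) -> bool:
    """Return True if the plaintext payload matches the local blocklist.

    Runs *before* E2E encryption, so transport encryption is unchanged.
    """
    return hashlib.sha256(payload).hexdigest() in BLOCKLIST

def send(payload: bytes) -> str:
    """Simplified send path: flag on a match, otherwise encrypt and send."""
    if scan_before_encrypt(payload):
        return "flagged"          # e.g. reported instead of delivered
    return "encrypted-and-sent"   # the normal E2EE path, untouched

print(send(b"hello"))                  # ordinary message goes through
print(send(b"known-illegal-sample"))   # match is caught pre-encryption
```

Note that whoever controls `BLOCKLIST` decides what gets flagged, and the device owner only ever sees opaque digests.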

Don't get me wrong: I am against ChatControl. For me the problem is that I see a potential for abuse with the "list" of illegal material (whether it's a literal list of hashes or a set of model weights). This list cannot be made public (because it describes highly illegal material), so it's hard to audit. So whoever controls it can abuse it, e.g. to find political opponents. That's my problem with ChatControl.

Wow, the only prosaic take in this entire dumpster-fire of a thread.

Can you point me to the doc that says all this?

Admittedly, I haven't yet bothered to look for the official ruling, since those are massive, and media sites were just politicising it.

I don't have a doc to point to, it's just my understanding of ChatControl :-).