> You're seriously suggesting that any policy that doesn't 100% eliminate a problem is a joke?

I think a more charitable reading is that any policy that doesn't 100% _target_ a problem is a joke. This policy doesn't have a plausible way that it will protect children from being victimized, so I think it's reasonable to remove the "think of the children" cloak it's wearing and assess it on the merits of whether encryption is beneficial for the social discourse of a society.

> This policy doesn't have a plausible way that it will protect children from being victimized

Of course it does. "It will detect and report messages from predators to children, therefore preventing the child from getting to the point where they send revealing pictures or meet the predator in person." Done.

Well, maybe the word "plausible" is doing too much work in my statement.

Most abuse comes from people known to the child, and of that portion, most are family members. It seems like there is ample opportunity in in-person comms to route around this limitation.

Moreover, even the communications that do happen online can easily shift to other encrypted channels; presumably the perpetrators will simply move to other ways of communicating. And kids, at least kids over 10 or so, don't seem like a demographic particularly likely to follow this law anyhow.

There's another nuance worth considering: by and large, parents _want_ their kids to have access to encrypted communications. I'll happily assist my kiddo in maintaining good opsec - that's much more important to me than some silly and uninformed policy decision being made far away by people I've never met.

https://web.archive.org/web/20210522003136/https://blog.nucy...

So, the kids are still going to be where the encrypted comms are. I still think it's reasonable to say that the protections offered to kids by criminalizing encryption are implausible.

> Most abuse happens from people known to the child

Sure, but that means at least some abuse comes from people unknown to the child. If ChatControl doesn't cause any problems but helps prevent those abuses, then it's worth it. The question is: what problems does ChatControl cause?

Saying "only a minority of children get abused this way, so it's not worth it" won't go far, IMO. It's not a valid argument against ChatControl in itself.

> presumably the perpetrators will simply move to other ways of communicating.

The perpetrators have to contact kids over apps that the kids use. Like Snapchat or TikTok. It's not like the kids will routinely install a weird app to talk to weird people...

> parents _want_ their kids to have access to encrypted communications.

But ChatControl doesn't remove the encryption! It scans everything locally, before it gets encrypted and sent.

> by criminalizing encryption

It's not criminalizing encryption: it's forcing a local scan on your device, just like the scans that already happen in the cloud for non-E2EE data.
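
To make "local scan" concrete, here's a minimal Python sketch of the flow as I understand it, with everything hedged: the blocklist contents, the `encrypt` placeholder, and the reporting step are all made up for illustration, and real proposals would use perceptual hashes or classifiers rather than exact SHA-256 matching.

```python
import hashlib

# Simplified, hypothetical client-side scanning flow: check the content
# against a local list of known-bad digests *before* the normal E2EE
# encryption step. The transport encryption itself stays untouched.

BLOCKLIST = {hashlib.sha256(b"example flagged content").hexdigest()}

def encrypt(plaintext: bytes) -> bytes:
    # Stand-in for the app's real E2EE encryption; not actual crypto.
    return plaintext[::-1]

def send_message(plaintext: bytes) -> None:
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in BLOCKLIST:
        # Hypothetical reporting hook; under the proposal this would go
        # to a designated authority instead of printing.
        print(f"match on {digest[:12]}..., reported before encryption")
        return
    print(f"sent {len(encrypt(plaintext))} encrypted bytes")

send_message(b"hello")                    # passes the scan, gets sent
send_message(b"example flagged content")  # matches the local list
```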

Don't get me wrong: I am against ChatControl. For me the problem is the potential for abuse of the "list" of illegal material (whether it's literally a hash list or a set of model weights). This list cannot be made public (because it points at highly illegal material), so it's hard to audit, and whoever controls it can abuse it, e.g. to hunt for political opponents. That's my problem with ChatControl.
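
To show why auditing is essentially impossible, here's a tiny illustration (both content strings are harmless placeholders of my own): list entries are opaque digests, so an auditor looking only at the list cannot tell an entry for genuinely illegal material apart from one slipped in to target lawful political speech.

```python
import hashlib

# Two hypothetical list entries. Nothing about a digest reveals what
# it matches, so a curator could add the second one undetected.
entries = [
    ("actual illegal material", b"placeholder illegal image bytes"),
    ("opposition rally leaflet", b"placeholder leaflet text"),
]
for label, content in entries:
    print(label, "->", hashlib.sha256(content).hexdigest())
```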

Wow, the only level-headed take in this entire dumpster-fire of a thread.

Can you point me to the doc that says all this?

Admittedly, I haven't yet bothered looking for the official text, since those documents are massive and media sites were just politicising it.

I don't have a doc to point to, it's just my understanding of ChatControl :-).