> But you can't.
Be careful what you wish for. We could enforce client-side scanning on the OS. Everything that appears on the screen is scanned and reported.
> If you sexually abuse children the hurdle to illegally running a non-backdoored messenger seems pretty low to me.
How do you contact children on mainstream messengers if you can't use mainstream messengers?
Not to mention that most people consuming CSAM are not technically savvy. It is known that such material goes through mainstream messengers.
> Only that surveilling your entire population minus the criminals has dire consequences for a free society.
Again: I am against ChatControl. We fundamentally cannot know what is running in this client side scanner, and that makes it a risk.
But saying that it won't ever catch a single criminal is not a valid argument. You won't convince politicians to vote against ChatControl with such claims.
> Not to mention that most people consuming CSAM are not technically savvy. It is known that such material goes through mainstream messengers.
The reason is that it works. They're not stupid: they can use Signal.
The reality is that the privacy options not only exist, they're really good - often better and easier to use than the mainstream stuff.
They will just pivot to other tools.
> How do you contact children on mainstream messengers if you can't use mainstream messengers?
The scanning isn't about contacting children - it's about CSAM. Talking to kids isn't CSAM. You're talking about something else altogether, and something purely hypothetical.
> The scanning isn't about contacting children - it's about CSAM.
My understanding is that they are not only talking about having a list of hashes (of illegal images), but also having some kind of machine learning. And they are not only talking about scanning images, but text, too.
And what exactly do you expect them to report when they scan conversations with machine learning?
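For what it's worth, the distinction matters technically: a plain hash list only catches byte-identical copies of known files, which is presumably why the proposals also mention perceptual hashing and machine learning. A minimal sketch of exact hash matching (all names and data here are hypothetical, purely for illustration):

```python
import hashlib

# Hypothetical known-image list: a set of cryptographic hashes.
known_hashes = {hashlib.sha256(b"original image bytes").hexdigest()}

def matches_known_list(image_bytes: bytes) -> bool:
    """Exact hash matching: flags only byte-identical files."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(matches_known_list(b"original image bytes"))   # True: exact copy
print(matches_known_list(b"original image bytes."))  # False: one byte changed
```

Changing a single byte (re-encoding, cropping, re-compressing) defeats this entirely, so any system meant to catch modified images or to scan text has to use fuzzier matching - and that's exactly where the question of what gets reported, and how many false positives, comes in.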
> Be careful what you wish for. We could enforce client-side scanning on the OS. Everything that appears on the screen is scanned and reported.
Nope. Criminals can still just build their own devices with their own operating systems. Spyware-free operating systems already exist, and people have them as ISOs on their hard drives. You can't stop criminals from installing an old LineageOS build.
Legally, you can't stop anybody motivated enough from using encrypted channels unless you go out and erase the knowledge itself.
Implementing filters on all communications is undemocratic. Any future authoritarian government can use the same filter to immediately target its opposition - not just in theory, in practice.

We designed our democracies with a division of powers for the simple reason that history has taught us that giving any single entity that kind of power leads to tyranny. Whenever we grant the government new superpowers, we are changing a carefully established balance. What does the citizen get in that trade? Nothing. If a power only works as long as it is in the hands of the good guys, and becomes dystopian once it falls into the hands of the bad guys, maybe that power shouldn't exist.
Since we obviously want to prevent children from being sexually abused, the best place to start, according to most child-protection organizations, is at the root: educating kids early on in age-appropriate ways and investing in prevention and psychological help for potential pedophiles. If children are able to signal what happened to them to adults or other persons of trust, you don't need mass surveillance.
But my guess is that CSAM is just the symbolic reason, in reality this is meant to do more. It would be perfect to target small movements as they are emerging for example.
> Nope. Criminals can still just build their own devices
Haha, sure. You overestimate many of them. Not everyone breaking the law is a professional criminal - especially when it comes to CSAM.
> Legally, you can't stop anybody motivated enough from using encrypted channels unless you go out and erase the knowledge itself.
ChatControl doesn't claim to do that: it aims at mainstream social media, whether encrypted or not.
> It would be perfect to target small movements as they are emerging for example.
Yes, this is my problem with ChatControl: it's a powerful tool of surveillance that would be very dangerous if abused. We don't want to create that.
But "it's useless because criminals will compile their own kernel anyway" is an invalid argument. It doesn't help the cause.