So what? If predators cannot talk to children over SnapChat, that's a win, wouldn't you say?

The only valid argument I see against ChatControl is that, fundamentally, you cannot know what it is reporting. It's not as if there were an open-source list of illegal material together with its hashes, right?

If you cannot audit what is being reported (with whatever means are necessary to make sure it is doing what it should be doing), then whoever controls it could abuse it.

That's the problem. That's the reason not to implement it. But it's completely overwhelmed by the flood of invalid arguments.

> The only valid argument I see against ChatControl is that, fundamentally, you cannot know what it is reporting. It's not as if there were an open-source list of illegal material together with its hashes, right?

By definition, they must state what is actually illegal, lest we end up with hidden laws and hidden punishments.

And those lists of 'illegal' material need to be publicly disclosed, so that we are aware of what is on them.

At least in the USA, a naked picture of someone who is 17y364d old is 'child porn', but that extra day makes it 'barely legal'. And yet most USA jurisdictions say that 16-year-olds can consent to sex. It's just that pictures are EVIL, even if you take them yourself.

Again, however, I tend to agree more with Stallman that CSAM or child porn possession should either be legal or have a mens rea requirement attached, rather than being a strict-liability offense. It's proof of a crime, and shouldn't in and of itself be a crime.

But because a picture is a crime, we get these horrific laws.

> By definition, they must state what is actually illegal, lest we end up with hidden laws and hidden punishments.

I don't need to murder you in order to say that murdering you is illegal, do I?

Of course they don't have to publish CSAM material in order to say that this is illegal CSAM material. If you could go get CSAM material at your local library, nobody would be talking about scanning it with ChatControl...

Then, from a picture alone, tell me the exact age of the nude person.

Again, we would need to know the exact year and day: 17y364d is illegal, but 18y is legal.

Not sure how this answers my comment above.

It points out the flaw in comparing an action (killing someone without cause) with passively receiving a picture.

That picture does NOT contain enough information to determine whether it's legal or not. A difference as small as one day can be the difference between legal and illegal.

And of course, there is also the hypocrisy of sex being legal at 16 while pictures are 'child porn', when the subjects are demonstrably NOT children.

Well, there are MOST DEFINITELY images that are, UNAMBIGUOUSLY, VERY illegal. It is not hypocritical to say that detecting such images on a device means that someone should look into it, because something went VERY WRONG there.

So yes, MANY pictures already exist that contain more than enough information to determine that they are VERY ILLEGAL.

If you can't comprehend that, I don't know what to tell you.

I think that a world where underage children can't access TikTok and Snapchat is an acceptable cost of keeping our right to privacy.

> The only valid argument

Really? The only one?

Really, yes. I am against ChatControl myself, and I am genuinely struggling to find credible arguments against it.