The scanning just doesn't include contacting children - it includes CSAM. Talking to kids isn't CSAM. You're talking about something else altogether, and something which is purely hypothetical.
> The scanning just doesn't include contacting children - it includes CSAM.
My understanding is that they are not only talking about matching against a list of hashes of known illegal images, but also about using some kind of machine learning. And they are not only talking about scanning images, but text as well.
I don't know what you expect them to report when scanning conversations with machine learning, if not contact with children.
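To make the distinction concrete, here is a rough sketch (all names, phrases, and thresholds are hypothetical, not taken from any actual proposal) of the difference between matching images against a list of known hashes and running a machine-learning classifier over message text:

```python
import hashlib

# Hypothetical illustration only - not any vendor's actual implementation.

# Hashes of already-identified illegal images (placeholder values).
KNOWN_HASHES = {"a3f5...", "9c1d..."}

def matches_known_image(image_bytes: bytes) -> bool:
    """Deterministic check: is this exact image already on the list?"""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

def grooming_score(conversation: str) -> float:
    """Stand-in for an ML text classifier: returns a probability, not a fact.
    A real system would use a trained model; this toy version just counts
    keywords to show that the output is a fuzzy score, not a hash match."""
    suspicious = ("how old are you", "don't tell your parents", "send a photo")
    hits = sum(phrase in conversation.lower() for phrase in suspicious)
    return min(1.0, hits / len(suspicious))

# Hash matching gives a yes/no answer about known material; the text
# classifier can only flag a conversation as "possibly grooming", which is
# exactly the kind of report being discussed above.
if grooming_score("hey, how old are you? don't tell your parents") > 0.5:
    print("flag conversation for human review")
```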