I'm surprised at the lack of care for children in commentary around social media bans, the UK's Online Safety Act blocking Discord/porn/etc. without proof of age, and sexually violent video games being removed from stores due to payment processor pressure.
If we see a child bullied in school, we don't say the parents of the victim are not doing enough parenting. If an adult flashes or catcalls a child in the street, we don't blame the child's parents for not doing enough parenting.
Why is it that when it comes to social media/pornography/sexually violent games (which would not have received a rating if sold in stores 10 years ago), everyone is up in arms that things are going too far? All you see is conspiracy theory nonsense about how the state wants to mind control us.
Apple tried to fix some of this at the client end with CSAM scanning and automatic dick pic blurring, but had to roll some of it back due to the uproar and accusations of spying, etc.
We no longer live in a high trust society and children are paying the price for it, as well as adults.
I don't want my phone scanned, or to be denied access to adult Discord channels, or to have to submit an ID to visit "adult" content like home-brewing beer subreddits. But now that's the price I am having to pay because no one wants to be responsible. Especially the companies making money off of children.
But what are we supposed to do instead when everything has been shot down previously? The number of children being groomed is probably the highest it has ever been, but because it's done in their own bedrooms over the internet, everyone ignores it.
> But now that's the price I am having to pay because no one wants to be responsible. Especially the companies making money off of children.
> But what are we supposed to do instead when everything has been shot down previously?
What would "being responsible" have looked like, prior to these pushes?
> If we see a child bullied in school, we don't say the parents of the victim are not doing enough parenting. If an adult flashes or catcalls a child in the street, we don't blame the child's parents for not doing enough parenting.
Perhaps not, but we also don't abolish schools or ban kids from going outside, do we?
> The number of children being groomed is probably the highest it has ever been, but because it's done in their own bedrooms over the internet, everyone ignores it.
What does this mean, exactly? And what sort of source do you have?
> I'm surprised at the lack of care for children in commentary around social media bans
I think you're misunderstanding the pushback. I think most people are perfectly agreed that yeah, children shouldn't be exposed to some things until a certain age. The problem is: how do we do that without utterly destroying privacy for everyone else too? If the answer is "we can't", then it might be we have to look at ways to deal with our children being exposed to those things - either, you know, don't let your kids on the parts of the Internet you don't agree with, or teach them the right morals and ethics so they learn to recognize and avoid those areas themselves.
How about phones and computers themselves being configurable as belonging to children, with services/platforms required to respect the appropriate rules when a device tells them a child is accessing their service?
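Concretely, I imagine something like this on the platform side. It's only a rough sketch under my own assumptions - the "X-Account-Is-Minor" header and the restricted path prefixes are hypothetical, not any existing standard:

```python
# Toy sketch: a child-configured device attaches a (hypothetical) header to
# every request, and the platform is obliged to honour it.
from dataclasses import dataclass

@dataclass
class Request:
    headers: dict
    path: str

# Areas of the platform that should be off-limits to child-flagged devices
# (illustrative paths only).
AGE_RESTRICTED_PREFIXES = ("/shorts", "/adult", "/dms-from-strangers")

def is_minor_device(request: Request) -> bool:
    # The phone/computer sets this header when configured as a child's device.
    return request.headers.get("X-Account-Is-Minor", "0") == "1"

def allow(request: Request) -> bool:
    """Platform-side check: refuse age-restricted areas to child devices."""
    if is_minor_device(request) and request.path.startswith(AGE_RESTRICTED_PREFIXES):
        return False
    return True

# A child-flagged device asking for an age-restricted area gets refused.
req = Request(headers={"X-Account-Is-Minor": "1"}, path="/shorts/abc123")
print(allow(req))  # False
```

The hard part isn't the technical side; it's requiring platforms to honour the signal.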
In terms of YouTube, all I want as a parent is (a) to ban Shorts from my kids' accounts and (b) to be able to see a list of what they're watching. I want this respected on the app and the website.
Good platforms have decent parental controls. Bad platforms don't. YouTube is just awful in that regard. Which is a shame, because there's a ton of good stuff to watch on YouTube. But it's on them to fix their platform.
Your idea being so simple, yet solving the problem better than their proposed "solution", just shows how this is not about protecting children.
> Perhaps not, but we also don't abolish schools or ban kids from going outside, do we?
No, but we heavily regulate schools and the behaviour of people at them. That is more akin to what we are doing with these laws than just saying it's the parents' fault their child is groomed or exposed to adult content without their consent.
> What does this mean, exactly? And what sort of source do you have?
It means more children are victims of sexual abuse than before, thanks to the wonders of the Internet.
> The problem is: how do we do that without utterly destroying privacy for everyone else too? If the answer is "we can't", then it might be we have to look at ways to deal with our children being exposed to those things - either, you know, don't let your kids on the parts of the Internet you don't agree with, or teach them the right morals and ethics so they learn to recognize and avoid those areas themselves.
"It's the parents fault". My entire point here is that clearly parents are not equipped to police their own children's use of the Internet 24x7. Social media companies are doing a a bad job of it, because they want engagement and clicks not reducing usage and blocking content.
Companies have tried implementing this on the client, for example Apple with CSAM scanning, and had to roll it back because of 'privacy' concerns.
And now this is what we have to deal with. No one wants to do anything about it because of 'privacy' and yet children are still being exposed to harm.
To be clear: I don't want these laws, I don't want my life scanned, I don't want to have to submit IDs, but as a society we have obviously dropped the ball on this and now we're screwed. There are implementations that retain privacy, like buying single-use codes from shops in person that can be used to prove you are 18+ to online services.
Or having tokens that you can get attested/signed by a government portal and then give back to services to prove you are 18+. That can also be designed to retain 100% privacy. (Assuming the government doesn't have access to these services through a back channel, and assuming these single-use tokens are not saved by the service provider you want to use.)
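Very roughly, something like this (a toy sketch only; key distribution and single-use bookkeeping are hand-waved, and a real scheme would use blind signatures so the issuer can't even recognise its own tokens later):

```python
# Toy sketch of the "attested 18+ token" idea using Ed25519 from the
# `cryptography` package. Privacy rests on the same assumptions as above:
# the issuer doesn't log tokens and services don't retain or share them.
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Government portal: holds a signing key; the public half is published.
issuer_key = Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()

def issue_over18_token() -> tuple[bytes, bytes]:
    """Issued only after the portal has verified the citizen is 18+.
    The token is a random single-use value and carries no identity."""
    token = os.urandom(32)
    return token, issuer_key.sign(token)

def service_accepts(token: bytes, signature: bytes) -> bool:
    """The adult service only checks the signature against the published key.
    It learns 'someone the issuer vouches for as 18+' and nothing else."""
    try:
        issuer_public_key.verify(signature, token)
        return True
    except InvalidSignature:
        return False

token, sig = issue_over18_token()   # fetched once from the portal, in advance
print(service_accepts(token, sig))  # True - and the service never sees an ID
```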
Personally I think blocking all kids from social media is probably one solution that doesn't get adults complaining about privacy. Unfortunately, social media companies make an absolute fortune from content aimed at children, so they are obviously unhappy with this.
> No, but we heavily regulate schools and the behaviour of people at them. That is more akin to what we are doing with these laws than just saying it's the parents' fault their child is groomed or exposed to adult content without their consent.
This is not how I see it. These laws are more like what I suggested. The equivalent of that regulation in the case of the Internet would be simply not allowing whatever behavior you disagree with from social media companies and the like.
> It means more children are victims of sexual abuse than before, thanks to the wonders of the Internet.
Can you link some source on this?
> My entire point here is that clearly parents are not equipped to police their own children's use of the Internet 24x7
Neither can they "police" their child's life 24/7, nor do they have to in order to prevent their kids from falling into various holes out there. Take, for example, drugs and alcohol. We've, more or less, arrived at a reasonable system for keeping these out of the hands of children (most of the time) - but we also accept that, because this system is not 100% fool-proof (and indeed such a system could not possibly exist), sometimes kids are going to get access to drugs and alcohol. And yet, most parents (at least, that I know) would agree that that doesn't mean your kid is going to turn out to be an addict or whatnot.
My point is that, with sane regulation - regulation that doesn't inherently erode privacy for everyone all over the world or hand even more control to (Western) governments and companies - and with parents doing their jobs, we can minimize the harm done by social media. We can't eliminate it entirely, but that's the price we pay.
(I question how much actual harm is done by social media, but that's another discussion)
> The number of children being groomed is probably the highest it has ever been, but because it's done in their own bedrooms over the internet, everyone ignores it.
Children are being groomed by high-profile paedophiles and trafficked to powerful people with no consequence, and you want to give those people more control over the internet?
You're both egregiously wrong. The vast majority of victims of grooming are victimized by people they know and trust, mostly family and authority figures.
Right. And, just to be clear, "authority figures" means people who are in a personally relevant position of authority, like a teacher, police officer, or priest - not more distant authorities like a politician or wealthy person.
He is not saying that most children are groomed by these people. He is saying that they are doing it, that we know, and that nothing happens because they are too powerful. So giving them more power is not a good thing.
> All you see is conspiracy theory nonsense about how the state wants to mind control us.
The Online Safety Act is much more than a porn ID law. Look at this stuff about controlling public discourse. Even if you trust a Labour government with this power, you might not trust a future Reform one.
"This clause enables the Secretary of State to give OFCOM directions in circumstances where they consider there is a threat to the health or safety of the public, or to national security.
This includes directing OFCOM to prioritise action to respond to a specific threat when exercising its media literacy functions and to require specified service providers, or providers of regulated services generally, to publicly report on what steps it is taking to respond to that threat.
For example, the Secretary of State could issue a direction during a pandemic to require OFCOM to: give priority to ensuring that health misinformation and disinformation is effectively tackled when exercising its media literacy function; and to require service providers to report on the action they are taking to address this issue." - Explanatory Notes relate to the Online Safety Bill as brought from the House of Commons on 18 January 2023; Clause 156 - Section (8), available https://bills.parliament.uk/publications/49377/documents/273...
This manifested in the final bill as section 44 - https://www.legislation.gov.uk/ukpga/2023/50/section/44