The correct nuance here is...
* Classifying accounts as child accounts (moderated by a parent)
* Allowing account moderators to review content in the account that is moderated (including assigning other moderation tools of choice)
In all cases, transparency and enabling consumer choice should be the core focus.
Additionally: by default, treat everyone online as an adult. Parents who let their kids online without supervision, and without some setting indicating that the user agent is operated by a child, are implicitly allowing their children to interact with strangers. This tends to work out better in more controlled and limited circumstances, where the adults involved have the resources to provide suitable supervision.
At the same time, any requirements should apply only to commercial products. Community (gratis / not for profit) efforts presumably reflect the needs of a given community.
I think this is the way. Not control, but just make it simpler for parents to handle their children's devices. You don't have to make everyone share their age; you just make it so that parents can, in a simpler way, choose what their children should be able to access. Make it easy to do right; don't add more control. It's kind of like the old anti-piracy copy protections. The pirates always cracked them, and in the end, the one who got to sit there trying to figure out what the word in the manual was, was the user who actually paid for the game. It made things worse for the ones who paid and better for the cracked version. So, make it simple.
Children are not just the responsibility of parents; it is the job of society, through government, to protect and support them.
So no we can't just tell parents to deal with it.
The laws being made to “protect” children seem to always have the side effect of removing security and privacy protections from adults.
There has to be a middle ground.
The tech community has had pretty much free rein over the last few decades, and has always chosen adult convenience over child safety (and mostly profit over both). The "middle ground" probably involves a bigger transfer of convenience away from adults than this.
There's probably a much better solution than "adults vs children" but very few with our expertise seem seriously interested in solving for safer children, which essentially leads to inexpert solutions gaining popular support.
I won't call myself an expert in this field, and haven't given it much thought, but a couple options just off the top of my head...
1. Limit child accounts to "classic" social network functionality. They get to see things from mutual friends. No algorithmic feeds, kids aren't in the user search, and no way for messages to be sent/received unless both sides have consented.
2. Disable chat for child accounts. How many chat apps do children really need? Each one is another potential vector for issues that parents would need to monitor.
I'm sure there is a monkey's paw here, but either option seems better than no end-to-end encryption for anyone, at a time when government surveillance is a bigger issue than ever.
Frankly, I think option 1 would be better for all users, not just children. Go back to classic "social networks". This "social media" experiment has failed.
This is not a widely shared opinion.
Is that definitely not a widely shared opinion?
I feel like I can think of lots of situations where society puts in safeguards to protect children rather than leaving it to the parents (age ratings on films and games, YouTube Kids, regulations around advertising to children, the whole concept of school, reduced speed limits around playgrounds, to give a few examples off the cuff).
It's fine as long as the state aligns with you and is generally sane.
History shows that's not necessarily always the case.
My point more was that you said it wasn't a widely shared opinion, but to my mind, it is broadly the status quo. Whether that is good or bad is a separate point.
"It takes a village..."
depends on your culture.
Since when was DJ David Guetta a reactionary?
(Yes, I read your comment history. I was trying to figure out if you were in fact that guy, and saw a bunch of conspiracy-ass right-wing misinformation.)
I'm not the guy, obviously.
It's not reactionary to say you don't want the state to interfere too much in your child's education.
Whether you are left or right, it's fine as long as the state aligns with you. But if you open a history book, you will see it has VERY OFTEN happened that states get crazy / ideological, or just plain eugenicist / liberticidal.
It's dangerous to give too much power to the state.
I think getting the age thing correct is key to getting parental classification to work properly (right now platforms just ask for a birth date, which is lame). E.g.:
> Surveys by Britain’s tech regulator, Ofcom, find that among children aged 10-12, over half use Snapchat, more than 60% TikTok and more than 70% WhatsApp. All three apps have a notional minimum age of 13: https://archive.ph/y3pQO
Once you get the classification correct (and AI cannot do this; only community ombudsmen/age verifiers can, in a privacy-first way*), the app stores can easily tell the app devs which accounts are sensitive, and filtering should be much more effective.
*Basically, once your age is verified by a real human for your device (using device-local encryption to verify biometrics), you are set. No kid should be able to bypass this and install apps on devices that their parents hand to them. There will always be black-market devices with these apps, but existing tech offers ways of keeping those to a minimum.
> only via community ombudsman/age verifiers
Why do you need any third parties whatsoever? Just have the parents do it. They configure a setting on the kid's device which the device uses to determine what content to display. All you need from the app/service is a rating for the content. No third party should ever have to know anything about the user, because the user's device knows it, and the device knows it because the parents do.
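The scheme described above (parent-set local setting plus service-supplied content ratings) can be sketched in a few lines. This is a toy illustration, not any real platform's API; all names (`ContentItem`, `DEVICE_USER_AGE`, `visible`) are hypothetical.

```python
# Toy sketch: device-side age gating with no third party involved.
# The parent sets one local value; each content item carries a rating;
# the device, not the service, decides what to display.
from dataclasses import dataclass

@dataclass
class ContentItem:
    title: str
    min_age: int  # rating supplied by the app/service

# Parent-configured value, stored only on the child's device.
DEVICE_USER_AGE = 11

def visible(item: ContentItem, user_age: int = DEVICE_USER_AGE) -> bool:
    """Filter locally: no service ever learns the user's age."""
    return user_age >= item.min_age

feed = [
    ContentItem("cartoon clips", min_age=0),
    ContentItem("teen chat room", min_age=13),
    ContentItem("unrated forum", min_age=18),
]

filtered = [item.title for item in feed if visible(item)]
print(filtered)  # → ['cartoon clips']
```

The point of the design is that the only personal datum involved (the age) never leaves the device; the service only has to publish ratings.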
This all depends on fantasy tech and/or totalitarian control of tech.
Who verifies that the person verifying the child's age is actually authorised to do that? Who verifies that verification? And so on up. This needs a chain of trust that can only end up at government. And that chain of trust will then be open to being abused by shitty politicians.
What mechanism in (e.g) Linux is responsible for implementing this age verification so that it cannot be tampered with (or trivially overruled by a sudo call)? Which organisation is legally liable if that mechanism doesn't do its job? How can we stop someone from overwriting that mechanism with their own, in an open OS that is deliberately designed to allow anyone with root to change anything on it?
What you propose here is the death of open computing. And I personally believe that we would be much better off as a species if we kept open computing and just taught our kids how to handle social media better.
> What mechanism in (e.g) Linux is responsible for implementing this age verification so that it cannot be tampered with (or trivially overruled by a sudo call)? Which organisation is legally liable if that mechanism doesn't do its job? How can we stop someone from overwriting that mechanism with their own, in an open OS that is deliberately designed to allow anyone with root to change anything on it?
This one is easy. You just don't require all devices to do that. The parent isn't required to give the kid a general purpose computer. You don't need to prevent every device from running DOOM, only one device, and then parents who want to impose such restrictions get the kid one of those.
Thanks for the response. Couple of points:
- The line between "general purpose computer" and "not that" is weird. Android is an implementation of Linux, after all. Probably the best example is a Steam Deck. It's just Arch Linux, you can get to a desktop on it no problem, and you get sudo access and can install whatever you like on it. Are you saying that Responsible Parents should not get their kids a Steam Deck?
- And that raises the point of how responsible are we making parents for technical decisions that they do not necessarily have the knowledge to implement? If a child works out how to circumvent the age restriction and look at boobies (or whatever) and an authority finds out, are the parents liable? Are they likely to be prosecuted? Isn't this just adding more burden and bureaucracy to the job of parenting?
> Are you saying that Responsible Parents should not get their kids a Steam Deck?
I'm saying Authoritarian Parents should not get their kids a Steam Deck. If the kid can run arbitrary code then they can get a VPN and access websites hosted in Eastern Europe and then any of this is moot because there is no law you can impose on Facebook to do anything about it.
> If a child works out how to circumvent the age restriction and look at boobies (or whatever) and an authority finds out, are the parents liable?
No, because the parents rather than the "authorities" (who TF is that anyway?) should be the ones in charge of the decision whether the kid can look at boobies to begin with.
I bought my Steam Deck not knowing that it had Desktop Mode. And I'm an experienced software dev. The average parent is not going to know this.
The devices that offer a mode that blocks all unapproved content are presumably going to advertise it. If you buy something that doesn't say it has anything like that, and then it doesn't, that's the expected result. If you buy a device that says it does and then it doesn't, now you have a bone to pick with the OEM.
It’s very hard to control kids’ internet access. Impossible, really. Even if you manage it fine at home, once they go to school it’s whatever policies the school has. Most require laptops and provide internet access.
> it’s whatever policies the school has.
so the school takes on that responsibility, as deputized by the parents.
Kids don't get unfettered access to the streets while at school. They can't take their bikes and ride out at will. What makes the internet and devices any different? The devices provided by the school should be lockdown-able, and kids should not be provided their own device unless there's a parental lock (which is enabled during school hours, and is similarly locked down).
The school does not take responsibility. Schools will tell you that what your kid does at home is the parents' responsibility, even if it is done on the school device. Parents do not have the ability to configure the content controls on the device itself, so technically sophisticated parents resort to tweaking router settings.
That’s just not how the system works.
Each school brews its own system more or less.
> That’s just not how the system works.
OK, then make it work that way. I feel it's like this thought flow:
- A causes B, and B is a problem.
- Why not do C, which doesn't cause B?
- Ah, this won't work, because what we have now is A, not C.
They could actually, in the 80s and 90s.
I feel uncomfortable about the idea of controlling children, even my own. Certainly there is a requirement to protect children from others, but I feel like putting in guard rails to protect children from themselves only leads to making things taboo and, as a result, more interesting.
That's objectively not true. You can just use your own DNS server and lock the settings app, either through Android MDM policies or a locker app.
No that doesn’t work - tried it.
Google is basically its own private internet. It caches content so you can access all sorts of terrible stuff just from Google.com (and its related domains).
But if you cut Google you cut Google Classroom - which is required.
And Google Classroom itself has many workarounds.
This isn’t just a Google problem. The centralization of the Internet around a few mixed content domains really kills conventional filtering.
Paradoxically, there are so many centralized domains that even if you can block one, it's just a game of whack-a-mole.
Eventually you just block the whole internet - and then what’s the point? Take away the 20 most popular mixed content platforms, messaging, etc, and you’re effectively blocking the whole internet.
The kids can’t contact their friends, watch educational videos, or any other legitimate use.
> Classifying accounts as child accounts (moderated by a parent)
Notice also that even if you do this, you still don't need the service to be able to decrypt the content, only the parent.
This could even be generically useful, e.g. you have a messenger used by business and then the messages can be read by the client company's administrator/manager but not the messaging company's.
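The "parent (or administrator) can decrypt, service cannot" property described above is usually built as an envelope: the message is encrypted once under a random content key, and that key is wrapped separately for each authorized reader. Below is a stdlib-only toy (one-time-pad XOR as a stand-in for real asymmetric key wrapping, so do not use this as actual crypto); it only illustrates the envelope structure.

```python
# Toy envelope encryption: content key wrapped for recipient AND parent,
# but never for the messaging service, which sees only ciphertext.
# XOR with random pads stands in for real public-key encryption.
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

message = b"see you after school"
content_key = secrets.token_bytes(len(message))   # one-time pad
ciphertext = xor(message, content_key)

# Stand-ins for per-party key material (really: public keys).
recipient_secret = secrets.token_bytes(len(content_key))
parent_secret = secrets.token_bytes(len(content_key))

envelope = {
    "ciphertext": ciphertext,
    "key_for_recipient": xor(content_key, recipient_secret),
    "key_for_parent": xor(content_key, parent_secret),
    # Deliberately no "key_for_service" entry.
}

# The parent recovers the content key and reads the message...
parent_key = xor(envelope["key_for_parent"], parent_secret)
assert xor(envelope["ciphertext"], parent_key) == message
# ...while the service, holding only the envelope, cannot.
```

The business-messenger case from the comment above is the same shape: add a `key_for_administrator` entry for the client company's manager, and the messaging company still holds nothing it can decrypt.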
I don’t agree we should treat everyone as an adult by default online. We wouldn’t do that in any other circumstances.
And the only reason it is permissible to presumptively treat people as underage until proven otherwise in the physical world is that there isn't a constellation of intermediaries collecting all your habits and preferences when you buy porno magazines or alcohol in person.
Why is the answer people seem to arrive at being "mandatory collection of blackmail material that will ruin careers and relationships" when it comes to the Internet?
Because "somebody has to think of the children." At this point, I am convinced parental instincts are being abused to slowly but surely install more-or-less complete public surveillance. It's a rather obvious approach: you have to appeal to something emotional. And parents, with their apparent unwillingness to take up responsibility for their parenting, are the perfect target. Even childless adults will chime in to toot the "somebody has to think of the children" mantra.
Why do people feel they can behave in a way that they can be blackmailed over?
Just behave.
Yeah, how dare someone do or say anything that some random crazy asshole could use to threaten that person's personal or professional life or even put them in danger of physical harm.
To hell with gay kids growing up in very traditional religious areas in much of the world.
That person who made a racist joke on Discord when they were 13 years old? That should be able to ruin them when they're 30!
Someone confiding to a friend over social media DMs that they're in an abusive relationship with someone violent? Well - she shouldn't be surprised when her partner beats her within an inch of her life when he finds out. If only she did what she was told, right?
And let's not forget the cringiest or most sexual thing you've ever said online - make sure that your every utterance in private would pass scrutiny by your employer's HR department!
Seriously...I don't understand people like you. What a small, listless, and unusually safe world you must live in.
You may as well have asked why can't everyone think and act like you as well as live in your particular region of the world with the same friends, family, romantic, and professional opportunities that you've been provided throughout your life.
That doesn't work, unless the system knows everyone's family relationships.
Not guesses. Not is told about and takes on trust. Knows.
There's nothing to stop a kid creating a fake adult account and using it as an adult, perhaps creating their own kid account for "official" use.
Ultimately this is an unsolvable problem without a single source of truth for verified ID and user age.
The only responsible way to do that is to create a global "ID escrow" agency, where ID details are private and aren't available to governments or corporations without a court order, but the agency can provide basic age checks and other privacy services of a limited nature.
Good luck with that idea in this culture.
Meanwhile we have the opposite - real ID is known to governments and corporations, personal habits and beliefs of all kinds can be tracked, there is zero expectation of privacy, and kids still aren't protected.
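The "ID escrow" idea above (the agency knows who you are, but answers age checks with nothing beyond a yes/no) can be sketched minimally. This is an illustrative toy: stdlib HMAC stands in for real digital signatures (a deployed system would use public-key signatures so services never share a secret with the escrow), and every name here is hypothetical.

```python
# Toy ID-escrow age attestation: the escrow holds full ID records but
# issues tokens carrying only a boolean claim plus a nonce.
import hmac, hashlib, json, secrets

ESCROW_SIGNING_KEY = secrets.token_bytes(32)  # known only to the escrow

# Private record; never leaves the escrow.
_ID_DATABASE = {"user-4711": {"name": "...", "birth_year": 2012}}

def issue_age_token(user_id: str, year_now: int, min_age: int) -> dict:
    """Escrow-side: attest to 'over min_age' and nothing else."""
    record = _ID_DATABASE[user_id]
    claim = {
        "over_age": (year_now - record["birth_year"]) >= min_age,
        "min_age": min_age,
        "nonce": secrets.token_hex(8),  # makes tokens unlinkable
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ESCROW_SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def verify_token(token: dict, key: bytes) -> bool:
    """Service-side: check authenticity; learn only the boolean."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"])

token = issue_age_token("user-4711", year_now=2025, min_age=18)
print(verify_token(token, ESCROW_SIGNING_KEY), token["claim"]["over_age"])
# The service learns the token is genuine and the user is under 18,
# but never sees the name or birth date.
```

Whether any real-world institution could be trusted to run such an escrow (and resist court orders and breaches) is exactly the political problem the comment above flags; the sketch only shows that the data-minimization part is technically straightforward.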
> Classifying accounts as child accounts
It's ok to drive Dad's truck unless he catches you and tells you no.
Unfair presentation. What they suggested was more akin to, "Assume someone with keys is an adult, and let them start the truck."
Dad should either know his children would never drive the truck without permission, or keep his keys as safe as his wallet (and if he can't trust his kids with keys, you bet his wallet needs protection).