I’m sure Musk is going to say this is about free speech in an attempt to gin up his supporters. It isn’t. It’s about generating and distributing non-consensual sexual imagery, including of minors. And, when notified, doing nothing about it. If anything it should be an embarrassment that France is the only country doing this.

(it’ll be interesting to see if this discussion is allowed on HN. Almost every other discussion on this topic has been flagged…)

> If anything it should be an embarrassment that France are the only ones doing this.

As mentioned in the article, the UK's ICO and the EC are also investigating.

France is notably keen on raids for this sort of thing, and a lot of things that would be basically a desk investigation in other countries result in a raid in France.

Full marks to France for addressing its higher than average rate of unemployment.

/i

> when notified, doing nothing about it

When notified, he immediately:

  * "implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing" - https://www.bbc.co.uk/news/articles/ce8gz8g2qnlo 

  * locked image generation down to paid accounts only (i.e. those individuals that can be identified via their payment details).

Have the other AI companies followed suit? They were also allowing users to undress real people, but it seems the media is ignoring that and focussing their ire only on Musk's companies...

You and I must have different definitions of the word “immediately”. The article you posted is from January 15th. Here is a story from January 2nd:

https://www.bbc.com/news/articles/c98p1r4e6m8o

> Have the other AI companies followed suit? They were also allowing users to undress real people

No they weren’t? There were numerous examples of people feeding the same prompts to different AIs and having their requests refused. Not to mention, X was also publicly distributing that material, something other AI companies were not doing. Which is an entirely different legal liability.

> Which is an entirely different legal liability.

In the UK, it is entirely the same: near zero.

Making/distributing a photo of a non-consenting bikini-wearer is no more illegal when originated by a computer in a bedroom than when done by a camera on a public beach.

I thought this was about France

It was... until it diverted. https://news.ycombinator.com/item?id=46870196

The part of X’s reaction to their own publishing I’m most looking forward to seeing play out in slow motion in the courts and press is their attempt at agency laundering by having their LLM generate an apology in the first person.

Sorry I broke the law. Oops for reals tho.

Kiddie porn but only for the paying accounts!

Who's going to provide their payment details and then generate kiddie porn?

This is a pretty pragmatic move by Musk.

It's basically a honey trap, the likes of which authorities legitimately use to catch criminals.

Nah, Musk put out a public challenge in January asking anyone able to generate illegal / porno images to reply and tell him how they were able to bypass the safeguards. Thousands of people tried and failed. I think the most people were able to get was stuff you'd see in an R-rated movie, and even then only for fictional requests, as the latest versions of Grok refuse to undress or redress any real person into anything inappropriate.

Here's the mentioned thread: https://x.com/elonmusk/status/2011527119097249996

Who is going to generate kiddie porn on it in the first place? It's not as if a lack of a credit card is preventing the authorities from figuring anything out. This is beyond ridiculous.

The other LLMs probably don't have the training data in the first place.

Er...

"Study uncovers presence of CSAM in popular AI training dataset"

https://www.theregister.com/2023/12/20/csam_laion_dataset/
