>The Paris prosecutor's office said it launched the investigation after being contacted by a lawmaker alleging that biased algorithms in X were likely to have distorted the operation of an automated data processing system.

I'm not at all familiar with French law, and I don't have any sympathy for Elon Musk or X. That said, is this a crime?

Distorted the operation how? By making their chatbot more likely to say stupid conspiracies or something? Is that even against the law?

Holocaust denial is illegal in France, for one, and Grok did exactly that on several occasions.

Also, CSAM and pornographic content using the likeness of unwilling people. Grok’s recent shit was bound to have consequences.

If the French suspected Grok/X of something as serious as CSAM, you can bet they would have mentioned it in their statement. They didn't. Porn, they did.

The first two points of the official document, which I re-quote below, are about CSAM.

> complicité de détention d’images de mineurs présentant un caractère pédopornographique ("complicity in possession of images of minors of a child-pornographic nature")

> complicité de diffusion, offre ou mise à disposition en bande organisée d'image de mineurs présentant un caractère pédopornographique ("complicity in distribution, offer, or making available, as an organised group, of images of minors of a child-pornographic nature")

[1]: https://www.tribunal-de-paris.justice.fr/sites/default/files...

> The first two points of the official document, which I re-quote below, are about CSAM.

Sorry, but that's a major translation error. "pédopornographique" properly translated is child porn, not child sexual abuse material (CSAM). The difference is huge.

Quote from the US DOJ [1]:

> The term “child pornography” is currently used in federal statutes and is defined as any visual depiction of sexually explicit conduct involving a person less than 18 years old. While this phrase still appears in federal law, “child sexual abuse material” is preferred, as it better reflects the abuse that is depicted in the images and videos and the resulting trauma to the child. In fact, in 2016, an international working group, comprising a collection of countries and international organizations working to combat child exploitation, formally recognized “child sexual abuse material” as the preferred term.

Child porn is CSAM.

[1]: https://www.justice.gov/d9/2023-06/child_sexual_abuse_materi...

> “child sexual abuse material” is preferred, as it better reflects the abuse that is depicted in the images and videos and the resulting trauma to the child.

Yes, CSAM is the preferred term for material that depicts abuse and reflects the resulting trauma to the child.

But not for child porn such as manga of fictional children, which depicts no abuse and traumatises no child.

> Child porn is csam.

"CSAM isn’t pornography—it’s evidence of criminal exploitation of kids."

That's from RAINN, the US's largest anti-sexual violence organisation.

> That's from RAINN, the US's largest anti-sexual violence organisation.

For everyone to make up their own opinion about this poster's honesty, here's where his quote is from [1]. Chosen quotes:

> CSAM includes both real and synthetic content, such as images created with artificial intelligence tools.

> It doesn’t matter if the child agreed to it. It doesn’t matter if they sent the image themselves. If a minor is involved, it’s CSAM—and it’s illegal.

[1]: https://rainn.org/get-the-facts-about-csam-child-sexual-abus...

I agree with that. I'd hope everyone would.

Dude, I literally provided a terminology notice from the DOJ. At this point I don't really know what else will convince you.

> I literally provided a terminology notice from the DOJ

You provided a terminology preference notice from the (non-lawmaking) DOJ containing a suggestion which the (lawmaking) Congress did not take up.

Thanks for that.

And if/when the French in question decide to take it up, I am sure we'll hear the news! :)

They are words for the same thing, it's like arguing they can't seize laptops because the warrant says computers.

Actually it's like arguing they can't seize all computers because the warrant only says laptops. I.e. correct.

Maybe US law makes a distinction, but in Europe there is no difference. Sexual depictions of children (real or not) are considered child pornography and will get you sent to the slammer.

On the contrary, in Europe there is a huge difference. Child porn might get you mere community service, a fine - or even less, as per the landmark court ruling below.

It all depends on the severity of the offence, which itself depends on the category of the material, including whether or not it is CSAM.

The Supreme Court has today delivered its judgment in the case in which the court of appeals and the district court sentenced a person for child pornography offences to 80 day-fines on the grounds that he had downloaded Japanese manga drawings onto his computer. The Supreme Court dismissed the indictment.

The judgment concluded that the cartoons in and of themselves may be considered pornographic, and that they represent children. But these are fantasy figures that cannot be mistaken for real children.

https://bleedingcool.com/comics/swedish-supreme-court-exoner...

> The Supreme Court has today delivered its judgment

For future readers: the [Swedish] supreme court.

Is "it" even a thing which can be guilty of that?

The way chatbots actually work, I wonder if we shouldn't treat the things they say more or less as words in a book of fiction. Writing a character in your novel who is a plain parody of David Irving probably isn't a crime even in France. Unless the goal of the book as such was to deny the holocaust.

As I see it, Grok can't be guilty. The people who made it / set its system prompt are guilty if they wanted it to deny the Holocaust. If not, they're at worst guilty of making a particularly unhinged fiction machine (as opposed to the more restrained fiction machines of Google, Anthropic etc.).

GDPR has some stuff about biased algorithms. It's all civil, of course, no prison time for that, just fines.

> I'm not at all familiar with French law, and I don't have any sympathy for Elon Musk or X. That said, is this a crime?

GDPR and DMA actually have teeth. They just haven't been shown yet, because the usual M.O. with European law violators is first a free reminder: "hey guys, what you're doing is against the law, stop it, or else". Then, if violations continue, maybe two or three more rounds follow... but at some point, especially if the violations are openly intentional (and Musk's behavior makes that very clear), the hammer gets brought down.

Our system is based on the idea that we institute complex regulations, and when they get introduced and stuff goes south, we assume innocent mistakes first.

And in addition to that, there's the geopolitical aspect... basically, hurt Musk to show Trump that, yes, Europe means business and has the means to fight back.

As for the allegations:

> The probe has since expanded to investigate alleged “complicity” in spreading pornographic images of minors, sexually explicit deepfakes, denial of crimes against humanity and manipulation of an automated data processing system as part of an organised group, and other offences, the office said in a statement Tuesday.

The GDPR/DMA stuff was just the opener anyway. CSAM isn't liked by authorities at all, and genocide denial (we're not talking about Palestine here, calm your horses y'all, we're talking about Holocaust denial) is a crime in most European jurisdictions (as are the right-arm salute and other displays of fascist insignia). We actually learned something from WW2.
