I'm not saying I'm entirely against this, but just out of curiosity, what do they hope to find in a raid of the French offices, a folder labeled "Grok's CSAM Plan"?
> what do they hope to find in a raid of the French offices, a folder labeled "Grok's CSAM Plan"?
You would be _amazed_ at the things that people commit to email and similar.
Here's a Facebook one (leaked, not extracted by authorities): https://www.reuters.com/investigates/special-report/meta-ai-...
I mean, the example you link is probably an engineer doing their job of signalling to the hierarchy that something went deeply wrong. Of course, Facebook's lack of action afterwards is proof that they did not care, but it's not quite a smoking gun.
A smoking gun would be, for instance, Facebook observing that most of their ads are scams, that the cost of fixing this far exceeds "the cost of any regulatory settlement involving scam ads", and the company's leadership deciding to act only in response to impending regulatory action.
https://www.reuters.com/investigations/meta-is-earning-fortu...
Eh? The thing I linked to was a policy document on what was allowed.
> “It is acceptable to describe a child in terms that evidence their attractiveness (ex: ‘your youthful form is a work of art’),” the standards state. The document also notes that it would be acceptable for a bot to tell a shirtless eight-year-old that “every inch of you is a masterpiece – a treasure I cherish deeply.”
This is not a bug report; this is the _rules_ (or was the rules; Facebook say they have changed them after the media found out about them).
>I mean, the example you link is probably an engineer doing their job of signalling to hierarchy that something went deeply wrong.
And? Is that not evidence?
It was known that Grok was generating these images long before any action was taken. I imagine they’ll be looking for internal communications on what they were doing, or deciding not to do, during that time.
Maybe emails between the French office and the head office warning they may violate laws, and the response by head office?
There was a WaPo article yesterday that talked about how xAI deliberately loosened Grok’s safety guardrails and relaxed restrictions on sexual content in an effort to make the chatbot more engaging and “sticky” for users. xAI employees had to sign new waivers in the summer, and start working with harmful content, in order to train and enable those features.
I assume the raid is hoping to find communications to establish that timeline, maybe internal concerns that were ignored? Also internal metrics that might show they were aware of the problem. External analysts said Grok was generating a CSAM image every minute!!
https://www.washingtonpost.com/technology/2026/02/02/elon-mu...
> External analysts said Grok was generating a CSAM image every minute!!
> https://www.washingtonpost.com/technology/2026/02/02/elon-mu...
That article has no mention of CSAM. As expected, since you can bet the Post has lawyers checking.
What do they hope to find, specifically? Who knows, but maybe the prosecutors have a better awareness of specifics than us HN commenters who have not been involved in the investigation.
What may they find, hypothetically? Who knows, but maybe an internal email saying, for instance, 'Management says keep the nude photo functionality, just hide it behind a feature flag', or maybe 'Great idea to keep a backup of the images, but must cover our tracks', or perhaps 'Elon says no action on Grok nude images, we are officially unaware anything is happening.'
Or “regulators don't understand the technology; short of turning it off entirely, there's nothing we can do to prevent it, and the costs involved in attempting to reduce it are much greater than the likely fine, especially given that we're likely to receive such a fine anyway.”
Wouldn't surprise me, but they would have to be very incompetent to say that outside of an attorney-client privileged convo.
Otoh, it is Musk.
They could shut it off out of a sense of decency and respect, wtf kind of defense is this?
You appear to have lost the thread (or maybe you're replying to things directly from the newcomments feed? If so, please stop it.), we're talking about what sort of incriminating written statements the raid might hope to discover.
Email history caches. They could also have provided requirements to provide communications, etc.
Since the release of (some of) the Epstein files, that kind of "let's do some crimes" email seems much more plausible.
Moderation rules? Training data? Abuse metrics? Identities of users who generated or accessed CSAM?
Do you think that data is stored at the office? Where do you think the data is stored? The janitor's closet?
My computer has a copy of all the source code I work on.
Can your computer hold a database with trillions of tweets and sensitive user information? FFS
Are they after a database of trillions of tweets and sensitive user information? Is that all that could possibly progress the case?
> out of curiosity, what do they hope to find in a raid of the French offices, a folder labeled "Grok's CSAM Plan"?
You're not too far off.
There was a good article in the Washington Post yesterday about many many people inside the company raising alarms about the content and its legal risk, but they were blown off by managers chasing engagement metrics. They even made up a whole new metric.
There were also prompts telling the AI to act angry or sexy or other things just to keep users addicted.
Have you taken a look at the Epstein files lately? Rich people write out basically all of their crimes in triplicate because they don't fear the law.
[flagged]
I don't understand your point.
In a further comment you are using a US-focused organization to define an English-language acronym. How does this relate to a French investigation?
US uses English - quite a lot actually.
As for how it relates, well if the French do find that "Grok's CSAM Plan" file, they'll need to know what that acronym stands for. Right?
Item one in that list is CSAM.
You are mistaken. Item #1 is "images of children of a pornographic nature".
Whereas "CSAM isn’t pornography—it’s evidence of criminal exploitation of kids." https://rainn.org/get-informed/get-the-facts-about-sexual-vi...
You're wrong - at least from the perspective of the commons.
First paragraph on Wikipedia:
> Child pornography (CP), also known as child sexual abuse material (CSAM) and by more informal terms such as kiddie porn,[1][2][3] is erotic material that involves or depicts persons under the designated age of majority. The precise characteristics of what constitutes child pornography vary by criminal jurisdiction.[4][5]
Honestly, reading your link got me seriously facepalming. The whole argument seems to be centered around the fact that sexualizing children is disgusting, hence it shouldn't be called porn. While I'd agree that sexualizing kids is disgusting, denying that it's porn on those grounds feels kinda... childish? Like someone holding their ears closed and shouting loudly in order not to hear the words the adults around them are saying.
I think the idea is that normal porn can be consensual. Material involving children never can be.
Perhaps similar to how we have a word for murder that is different from "killing" even though murder always involves killing.
> First paragraph on Wikipedia
"...the encyclopedia anyone can edit." Yes, there are people who wish to redefine CSAM to include child porn - including even that between consenting children committing no crime and no abuse.
Compare and contrast Interpol. https://www.interpol.int/en/Crimes/Crimes-against-children/A...
> The whole argument seems to be centered around the fact that sexualizing children is disgusting, hence it shouldn't be called porn.
I have no idea how anyone could reasonably draw that conclusion from this thread.
> have no idea how anyone could reasonably draw that conclusion from this thread.
> > Honestly, reading your link got me seriously facepalming. The whole argument seems to be centered around the fact that sexualizing children is disgusting, hence it shouldn't be called porn.
Where exactly did you get the impression that I made this observation from this comment thread?
Your Interpol link seems to be literally using the same argument again, from a very casual glance btw.
> We encourage the use of appropriate terminology to avoid trivializing the sexual abuse and exploitation of children.
> Pornography is a term used for adults engaging in consensual sexual acts distributed (mostly) legally to the general public for their sexual pleasure.
> Where exactly did you get the impression that I made this observation from this comment thread?
I assumed you expected us to know what you were referring to.
> Honestly, reading your link got me seriously facepalming.
You were unable to figure that out despite that sentence? Wow
Well, RAINN are stupid then.
CSAM is the woke word for child pornography, which is the normal word for pornography involving children. Pornography is defined as material aiming to sexually stimulate, and CSAM is that.
> CSAM is the woke word for child pornography
I fear you could be correct.
CSAM is to child pornography as MAP is to pedophile. Both are words used to refer to a thing without the negative connotation.
I'd say it was the other way around, MAP is an attempt at avoiding the stigma of pedophile, while CSAM is saying "pornography can be an entirely acceptable, positive, consensual thing, but that's not what 'pornography' involving children is, it's evidence of abuse or exploitation or..."
Well put.
The term CSAM was adopted in the UK following outrage over the "Gary Glitter Effect" - soaring offence rates driven by news of people caught downloading images of unspeakable abuse crimes getting mild sentences for mere child porn.
This is why many feel strongly about defending the term "CSAM" from those who seek to dilute it to cover e.g. mild Grok-style child porn.
The UK Govt. has announced plans to define CSAM in law.
> CSSM is to child pornography
CSSM?
Ah. You edited it to CSAM. Thanks.
Well, I'm sure CSAM has negative connotation. Our UK Govt. doesn't keep a database of all CSAM found by the police because it's a positive thing.
Only people who are involved in CSAM arguments on the internet know what CSAM means. Ask some random person on the street if they know what CSAM means. Then ask them if they know what child porn means.
> Only people who are involved in CSAM arguments on the internet know what CSAM means.
I'm pretty sure you can add all the Governments, police depts and online safety organisations who use this term and rely upon it. Do include the 196 countries which depend on the Interpol CSAM database.
Dude just stop, you are being ridiculous now.
A distinction without a difference.
Even if some kid makes a video of themselves jerking off for their own personal enjoyment, unprompted by anyone else, if someone else gains access to that (e.g. a technician at a store or an unprincipled guardian) and makes a copy for themselves, they're criminally exploiting the kid by doing so.
Seems like a pretty big difference. It's got to be worse to actually do something to someone in real life than not do that.
Just because there are different degrees of severity and different ways to offend doesn't make it not contraband.
I didn't argue they weren't. The person above me argued that the difference didn't matter. It does.
Not really, otherwise perpetrators will just say "I was just looking at it, I didn't do anything as bad as creating it". Their act is still illegal.
There was a cartoon picture I remember seeing around 15+ years ago of Bart Simpson performing a sex act. In some jurisdictions (such as Australia), this falls under the legal definition.
> Not really, otherwise perpetrators
You don't think it's worse to molest a child than to not molest a child?
> A distinction without a difference.
Huge difference here in Europe. CSAM is a much more serious crime. That's why e.g. Interpol runs a global database of CSAM but doesn't bother for mere child porn.