This is such a weird issue for me, who is blind. Did Grok undress people, or did Grok show extrapolated images of what people might look like undressed? The "undressed people" framing makes it sound like people physically had their clothes removed. Obviously this did not happen.

But, like.

If I have like ... a mole somewhere under my clothes, Grok cannot know about that right? People will know what they themselves look like naked?

Someone looking at Grok's output learns literally nothing about what the actual person looks like naked, right?

Kinda sounds like somebody should just make something that creates this for every picture ever? Then everybody has a defense -- "fake nudes!" and the pictures are meaningless?

> This is such a weird issue for me, who is blind.

I'm not sure what your mental model is for someone's visual likeness.

I'd propose a blind-inclusive analogy for what is happening on Twitter: anyone can create a realistic sexdoll with the same face and body proportions as any user online.

Doesn't that feel gross, even if the sexdoll's genitalia wouldn't match the real person's?

What part of my original comment said it wasn't gross?

My point is that nobody is getting undressed and no privacy violation is taking place. Fake nudes are fake.

I interpreted your last sentence as asserting it was no big deal (i.e., not gross) because it was all fake, but fair enough if you didn't mean it that way.

But to your main point: if you agree it's gross, do you not agree it is a violation of _something_? What is that thing if not privacy?

You may disagree, but 95% of people in the real world understand what "undressed" means in this context and see it as a gross invasion of privacy.

I knew when this issue hit the fan that you'd get hordes of overly literal engineer types arguing that the person wasn't actually violated, or asking "how is this any different from someone drawing a hyper-realistic picture of someone naked?" I can actually even (well, somewhat anyway) understand this viewpoint. But if you want to die on this hill, you will; most people in the real world would condemn and ostracize you for it.

This is the sort of thing that is technically correct, but misses the emotional aspect that people want to be able to control their public perception. Of course people could (and did) do this with older tools or by hand. It doesn't matter to them. And since Elon/X are the villain du jour, it's a good lever to punish them.

>If I have like ... a mole somewhere under my clothes, Grok cannot know about that right?

Unless some ex mentioned that mole of yours on Twitter, or it showed up in some data that was scraped somewhere, no.

Not sure what the actual odds are of it knowing if you have a mole or not.

Take the mole example as standing in for any physical characteristic hidden by clothing that people want to remain hidden. It's an example meant to demonstrate that the AI is not "undressing" anybody. It is filling in an extrapolation of pixels that have no clear relationship to the underlying reality. If you have a hidden tattoo, that tattoo is still not visible.

This gets fuzzy because literally everything is correlated -- it may be possible to infer that you are the type of person who might have a tattoo there. But Grok doesn't have access to anything that hasn't already been shared. Grok is not undressing anybody, and the people using it to generate these images aren't undressing anybody; they are generating fake nudes that have no more relationship to reality than someone taking your public blog posts and attempting to write a post in your voice.

[flagged]

That crosses into personal attack in a particularly gross way. Please review https://news.ycombinator.com/newsguidelines.html and don't post like this again.

I fundamentally disagree.

It is reductio ad absurdum applied correctly. I made a completely absurd proposal because the response is obvious: any argument that "fake porn is fake" will stop mattering very fast once actual faces are put in the picture.

I dispute that I made any personal attack.