> blame the victims

If you post something publicly, you can't complain that it is public.

But I can complain about what happens to said something. If my blog photo becomes deepfake porn, am I allowed to complain or not? What we have (with AI) is an entirely novel situation, worth at least a serious discussion.

FWIW... I really don't think so. If you, say, posted your photo on a bulletin board in your local City Hall, can you prevent it from being defaced? Can you choose who gets to look at it? Maybe they take a picture of it and trace it... do you have any legal ground there? (Genuine question.) And even if so... it's illegal to draw angry eyebrows on every face on a billboard, but people still do it...

IMO, it being posted online to a publicly accessible site is the same. Don't post anything you don't want right-click-saved.

GDPR's right to erasure (Article 17) says I can demand that my personal data be deleted, and I don't see any language limiting that to things _I_ submitted.

No. Don't give the entire world access to your photo. Creating fakes in Photoshop was a thing well before AI.

> If my blog photo becomes deep fake porn

Depends. In most cases, this is forbidden by law and you can claim actual damages.

That's helpful if they live in the same country, you can figure out who the 4chan poster was, the police are interested (or you want to risk paying a lawyer), you're willing to sink the time into pursuing such action (and, if criminal, risk an adversarial LEO interaction), and you're satisfied knowing hundreds of others may be doing the same and won't be deterred. Of course, friends and co-workers are too close to you to post publicly when they generate it. Thankfully, the Taylor Swift law in the US has stopped the generation of nonconsensual imagery and video of its namesake (it hasn't).

My daughter's school posted pictures of her online without an opt-out, but she's also on Facebook via family members, and it's just kind of... well beyond the point of trying to suppress. Probably best to just accept that people can imagine you naked, at any age, doing anything. What's your neighbor doing with the images saved from his Ring camera pointed at the sidewalk? :shrug:

I am not talking about a 4chan poster. I am talking about a company doing it.

Don't have a blog photo in the first place.

> But I can complain about what happens to said something

no.

> but ...

no.

Sure, and if I put out a local lending library box in my front yard, I shouldn't be annoyed by the neighbor who takes every book out of it and throws it in the trash.

Decorum and respect expectations don't disappear the moment it's technically feasible to be an asshole.

That's a bad analogy. Most people, including me, do expect that their "public" data is used for AI training. I mean, based on the ads everyone gets, most people know full well that anything they post online can be used to train AI.

Are you trying to argue that 10 years ago, when I uploaded my resume to LinkedIn, I should have known it'd be used for AI training?

Or that a teenager who signed up for Facebook should know that the embarrassing things they're posting are going to train AI and are, as you called it, public?

What about the blog I started 25 years ago and then took down, but which lives on in the GeoCities archive? Was I supposed to know it'd go to an AI overlord corporation when I was in middle school writing about dragon photos I found on Google?

And we're not even getting into data breaches, or things that were uploaded as private and then sold when the corporation changed its privacy policy decades after the upload.

It's not a bad analogy when you don't give all the graces to corporations and none to the exploited.

"Corporations".... you gave access to the whole world, including criminals.

> Most people including me do expect that their "public" data is used for AI training.

Based on what ordinary people have been saying, I don't think this is true. Or, maybe it's true now that the cat is out of the bag, but I don't think most people expected this before.

Most tech-oriented people did, of course, but we're a small minority. And even amongst our subculture, a lot of people didn't see this abuse coming. I didn't, or I would have removed all of my websites from the public web years earlier than I did.

> Most tech-oriented people did

In fact, it's the opposite. People who aren't into tech think Instagram is listening to them 24/7 to show them ads. There was even a hoax in my area among elderly groups that WhatsApp was using people's profile photos in illegal activity, and many removed their photos for a while.

> I didn't, or I would have removed all of my websites from the public web years earlier than I did.

Your comment is public information. In fact, posting anything on HN is a surefire way of giving your content to AI training.

> People who aren't into tech think Instagram is listening to them 24/7 to show them ads

True, but that's worlds apart from thinking your data will be used to train generative AI.

> In fact, posting anything on HN is a surefire way of giving your content to AI training.

Indeed so, but HN seems to be a bad habit I just can't kick. However, my comments here are the entirety of what I put up on the open web, and I intentionally keep them relatively shallow. I no longer do long-form blogging or make any of my code available on the open web.

However, you're right. Leaving HN is something that I need to do.

No, the average person has no idea what "AI training" even is. Should the average person have an above-average IQ? Yes. Could they? No. Don't be average yourself.

Seriously, when YOU posted something on the Internet 20 years ago, did you expect it to be used by a corporation to train an AI 20 years later?

Data sourcing has been a discussion, at least in AI circles, for much longer than 20 years.

So if you're asking me, I'd have to say yes. I can't speak for the original poster.