That's a completely reasonable boundary. Privacy and consent are critical, especially when sharing personal messages or conversations. It's fair to expect that your interactions remain private unless you've explicitly agreed otherwise. If you'd like, you can communicate your stance clearly to others in advance, ensuring they're aware of your boundaries regarding the use of your messages with AI tools or other external resources.

I understand why one would think it's funny to feed the parent comment into an LLM, but please at least label it when you echo such output on the site.

I don't think their main concern was the privacy aspect.

What do you think their concern was? I can't see any other issues someone might have.

Energy usage is another. What would happen to world power consumption if 1% of WhatsApp chats were fed to ChatGPT?
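For scale, here's a rough back-of-envelope sketch; the message volume and per-query energy figures are assumptions for illustration, not measured values:

```python
# Rough estimate of the extra energy if 1% of WhatsApp messages
# were each fed through an LLM. All figures below are assumptions.
messages_per_day = 100e9   # assumed ~100 billion WhatsApp messages/day
share_to_llm = 0.01        # the 1% in question
wh_per_query = 0.3         # assumed energy per LLM query, in watt-hours

extra_kwh_per_day = messages_per_day * share_to_llm * wh_per_query / 1000
extra_twh_per_year = extra_kwh_per_day * 365 / 1e9

print(f"{extra_kwh_per_day:,.0f} kWh/day, ~{extra_twh_per_year:.2f} TWh/year")
```

Under these assumed inputs it comes out on the order of a tenth of a TWh per year; whether that is "a lot" relative to the grid is exactly what the rest of the thread argues about.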

A third reason besides privacy would be the purpose. Is the purpose generating automatic replies? Or automatic summaries, because the recipient can't be bothered to read what I wrote? That would be a dick move and a good reason to object as well, in my opinion.

> What would happen to world power consumption if 1% of WhatsApp chats were fed to ChatGPT?

The same thing that happens now, when 100% of power consumption is fed to other purposes. What's the problem with that?

Huh? It's additional power draw in the midst of an energy transition. That power isn't being diverted from some other use; it's new demand. What do you mean, what's the problem with that?

Also, don't forget this is just one of three aspects I can think of off the top of my head. This isn't the only issue with LLMs...

Edit: while typing this reply, I remembered a fourth: I've seen many people object morally/ethically to the training method, in the sense that it takes other people's work for free and replicates it. I don't know where I stand on that one myself yet (it's awfully similar to a human learning and replicating creatively, but clearly at an inhuman scale, so I don't know), but that's yet another possible reason not to want this.

[deleted]

If people need additional power, they pay for it. If they want to pay for extra power, why would we gatekeep whether their need is legitimate or not?

Because of the aforementioned shortage. Paying for more power means coal and gas get spun up, since there aren't enough renewables, and the externalities aren't being paid for by those people.

I'm also happy to have them pay the full cleanup cost rather than discourage useless consumption, but somehow people don't seem to think crazy energy prices are a great idea either.

Also, you're still very focused on this one specific issue rather than looking at the bigger picture. I'm not sure the conversation is going anywhere like this.

What's the bigger picture? You said "power usage", "to what purpose?" (you kind of don't get a say in whether I use an LLM to reply to you, though you're free to stop talking to me), and "objections to the training method", which doesn't really seem relevant to this use case but is more of a general objection to LLMs.