You can't be careful at all doing this; it's like smoking a cigarette in a dynamite factory.

Using LLMs for therapy is so deeply dystopian and disgusting. People need human empathy for therapy, and LLMs do not emit empathy.

Complete disaster waiting to happen for that individual.

My experience is that it tries to look at your situation in an objective way and tries to help you analyse your thoughts and actions. It comes across as very empathetic, though, so there is a danger if you are easily persuaded into seeing it as a friend.

> in an objective way

One of the great myths of models in countless fields/industries. LLMs are absolutely in no way objective.

Now, if you want to say it’s an “outside opinion”, that’s valid. But do not kid yourself into thinking it is somehow empirical or objective.

It doesn't try to do anything. It doesn't work like that. It regurgitates the most likely next tokens given its training data.
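
To make that concrete, here is a minimal sketch of what "most likely next tokens" means mechanically. Everything in it is a toy I made up (a six-word vocabulary and hand-picked scores standing in for a trained model's output), but a real LLM's generation loop is the same idea: score every token in the vocabulary, turn the scores into probabilities, sample one, repeat.

```python
# Toy next-token sampling sketch. The vocabulary and logits below are
# made up for illustration; a real model produces logits over ~100k
# tokens from billions of learned parameters, but the sampling step
# itself looks like this.
import numpy as np

vocab = ["I", "feel", "heard", "sad", "better", "."]
rng = np.random.default_rng(0)

def sample_next(logits, temperature=1.0):
    """Softmax over logits -> probabilities, then sample one token index."""
    scaled = np.asarray(logits, dtype=float) / temperature
    probs = np.exp(scaled - scaled.max())  # subtract max for numerical stability
    probs /= probs.sum()
    return int(rng.choice(len(vocab), p=probs))

# Hand-picked logits standing in for one forward pass of a model.
logits = [0.2, 1.5, 2.8, 0.1, 1.0, 0.3]
print(vocab[sample_next(logits)])  # usually "heard" -- the highest-scoring token
```

There is no goal, belief, or intent anywhere in that loop, which is the point: the "trying" people perceive is a property of the output, not the process.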

Hmmm, I didn't know that... so your point is that a machine is not human? Look, I know it doesn't try, just like a sorting algo doesn't try to sort, an article doesn't try to convey an opinion, and a law doesn't try to make society more organized.

That is so reductive an analysis that it is almost worthless. Technically true, but very unhelpful in terms of actually using an LLM.

It is a first principle, though, so it helps to “stir the context window's pot” by having it pull in research and other shit from the web that will ground it, so it doesn't just tell you exactly what you prompt it to say.

They are amazing tools, but when people try to give them agency, someone has to explain it in simple terms.

Claudes have lots of empathy. The issue is the opposite: it isn't very good at challenging you, and it's not capable of independently verifying that you're not bullshitting it or lying about your own situation.

But it's better than talking to yourself or an abuser!

It's about the same as talking to yourself; LLMs simply agree with anything you say unless it is directly harmful. Definitely agree about talking to an abuser, though.

Sometimes people indeed just need validation, and it helps them a lot; in that case, LLMs can work. Alternatively, I assume some people just need to put the whole situation into words, and that alone helps.

But if someone needs something else, LLMs can be straight-up dangerous.

> It's about the same as talking to yourself; LLMs simply agree with anything you say unless it is directly harmful.

They have world knowledge and are capable of explaining things and doing web searches. That's enough to help. I mean, sometimes people just need answers to questions.

> It's about the same as talking to yourself

In one way it's potentially worse than talking to yourself. Some part of you might recognize that you need to talk to someone other than yourself; an LLM might make you feel like you've done that, while reinforcing whatever you think rather than breaking you out of patterns.

Also, LLMs have more resources than you do and can do some "creative" enabling of a person stuck in a loop, so if you are thinking dangerous things but lack the wherewithal to put them into action, an LLM could make you more dangerous (to yourself or to others).

Using an LLM for therapy is like using an iPad as an all-purpose child attention pacifier. Sure, it’s convenient. Sure, there’s no immediate harm. Why a stressed parent would be attracted to the idea is obvious… and of course it’s a terrible idea.