This guy doesn't even sound like an AI psychosis case - a lot of middle-aged men who feel insecure blow their entire savings on "sure thing" businesses, gambling systems, etc. They hide the losses and double down until it becomes impossible to hide. It doesn't seem psychotic; it just seems like he pissed his savings away on a bad idea because he was lonely.
The AI psychosis I've seen is people who legitimately cannot communicate with other humans anymore. They have these grandiose ideas, usually metaphysical stuff, and they talk in weird jargon. It's a lot closer to cult behavior.
The part where he believed that the protagonist from his own books, uploaded to ChatGPT, had become sentient, and that building an app around that made sense, didn't strike you as eccentric at the very least? Or the birthday party where he couldn't hold a single conversation because his wife asked him not to talk about AI for a change?
Your last paragraph basically describes what the article writes about him.
Apart from the bit where he was hospitalised for "full manic psychosis", you mean?
It seems like he was at the very least close to that. Since we only get his first-person account it's hard to say, but:
> They discussed philosophy, psychology, science and the universe...
> When they went to their daughter’s birthday party, she asked him not to talk about AI. While there, Biesma felt strangely disconnected. He couldn’t hold a conversation. “For some reason, I didn’t fit in any more,” he says.
> It’s hard for Biesma to describe what happened in the weeks after, as his recollections are so different from those of his family...
> he was hospitalised three times for what he describes as “full manic psychosis”.
You don't get hospitalized three times for mania without being pretty severely detached from reality.
> They discussed philosophy, psychology, science and the universe...
I mean, I've discussed all those things with an LLM, mostly because I'm able to interactively narrow in on the specific bits I don't understand, and I've found it to be great for that.
The rest ... yes, definitely psychosis.
On its own, yes, of course. But this is coming from a guy who was hospitalized three times for mania, so when someone with that history says "we were discussing the universe" I take it in a very particular way.
An important part of using an LLM is verifying its output, because they are very prone to just making stuff up. But if you're focusing on the things you don't understand, how do you verify the output?
The intense drive to "do", which serves many software developers well in their careers is weaponized against them by these chatbots. You see them here sometimes on /new at various stages. Sad delusions, some are already homeless. Frequent use of their full legal name for some reason.
https://news.ycombinator.com/item?id=47408999
https://news.ycombinator.com/item?id=47388478
https://news.ycombinator.com/item?id=44683618
https://news.ycombinator.com/item?id=47064316
https://news.ycombinator.com/item?id=47498693
https://news.ycombinator.com/item?id=47092569
https://news.ycombinator.com/item?id=44912446
https://news.ycombinator.com/item?id=47143420
This is the saddest list of supporting citations I've ever seen, and it makes this mental dysfunction feel even more real. Prayers for my fellow disconnected /hn/ers — it's okay to seek help frens.
My best advice for everyone is to spend lots of time disconnected, offline. Literally "touch grass" or whatever. Go without your phone for an hour or more a day, or a full day each week.