since you've made a throwaway account to say this, I don't expect you to actually read this reply, so I'm not going to put much effort into writing it. But essentially this reflects a fundamental lack of understanding of humans, brains, and knowledge in general, and ChatGPT having been out for 3 years is completely irrelevant to that.

not OP, but I found that response compelling. I know that humans also confabulate, but it feels intuitively true to me that humans won't unintentionally make something up out of whole cloth at the level of detail an LLM hallucinates at. So a human might say "oh yeah, there's a library for drawing invisible red lines," but an LLM might hand you "working" code implementing your impossible task.

I've seen plenty of humans hallucinate all kinds of things unintentionally, so this does not track. Some people believe there's an entity listening when you kneel and talk to yourself; others will swear on their lives that they saw aliens, were abducted, etc.

Memories are known to be reconstructed by our brains, so even events we witnessed firsthand get distorted when recalled.

So I agree with GP: that response shows a pretty big lack of understanding of how our brains work.