Yes, I can't remember who said it, but LLMs always hallucinate; it's just that they're 90-something percent right.

If I were to drop acid and hallucinate an alien invasion, and then a xenomorph actually ran loose around the city while I'm tripping balls, does being right in that one instance mean the rest of my reality is also a hallucination?

Because the point being made multiple times, it seems, is that a perceptual error isn't a key component of hallucinating; the whole thing is instead just a convincing illusion, one that could theoretically apply to all perception, not just the psychoactively augmented kind.

Which totally depends on your domain and subdomain.

E.g., programming in JS or Python: good enough.

Programming in Rust: I can scrap over 50% of the code, because it will either

a) not compile at all (I can see this coming while the "AI" is still typing; a sketch of the typical failure is below), or

b) not meet the requirements at all.
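
To illustrate (a), here is a minimal, hypothetical sketch of the kind of Rust an LLM tends to produce that rustc rejects outright. The snippet and its names are invented for illustration, not taken from any real model output; the direct Python or JS translation of the same logic would run without complaint:

```rust
fn main() {
    let mut names = vec![String::from("alice"), String::from("bob")];

    // Take an immutable borrow of the first element...
    let first = &names[0];

    // ...then mutate the vec while that borrow is still live.
    // rustc rejects this with error E0502: cannot borrow `names`
    // as mutable because it is also borrowed as immutable.
    names.push(String::from("carol"));

    println!("{first}");
}
```

The borrow checker front-loads a whole class of mistakes that JS or Python would only surface at runtime, if at all, which is why broken Rust output is so visible even mid-generation.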