We often talk about "hallucinations" as if they were their own thing, but is there really anything different about them from an LLM's normal output?

AFAICT, no. I think "hallucination" just means "bad, unhelpful output"; it isn't fundamentally different in any meaningful way from a model's super-helpful top-1% outputs.
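
To make that concrete, here's a minimal sketch of why I see it that way (assuming the Hugging Face transformers API, with "gpt2" as an arbitrary stand-in checkpoint): there's no separate "hallucination mode" anywhere in the generation loop. It's the same forward pass, the same softmax over the vocabulary, and the same sampling step whether the next token happens to be right or made up. The "hallucination" label gets applied afterwards, by a human comparing the text to reality.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# "gpt2" is just a stand-in model for illustration
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The capital of Australia is"
inputs = tokenizer(prompt, return_tensors="pt")

# One distribution, one sampling step -- identical code path whether the
# sampled token turns out to be "Canberra" or "Sydney".
with torch.no_grad():
    logits = model(**inputs).logits[0, -1]       # next-token logits
probs = torch.softmax(logits, dim=-1)            # same softmax for "good" and "bad" outputs
next_token = torch.multinomial(probs, num_samples=1)
print(tokenizer.decode(next_token))
```

Nothing the model returns flags one continuation as a hallucination and another as a fact; that distinction lives entirely in our evaluation of the output.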

It does feel qualitatively different from the human perspective, so the concept isn't useless, but I think that's mainly because we can't help anthropomorphizing these things.