Humans also confabulate (a better metaphor for AI errors than hallucination) when called on to respond without access to the ground truth, and most AI models have both limited access to ground truth and a limited ability to use that access to check their answers.