I wonder if people are going to start wrongly calling random software bugs "hallucinations," even when they have nothing to do with LLMs. Much like how the meaning of "AI" has shifted in recent years to cover basically any software with some kind of algorithm or heuristic built into it.