I wonder if checking for false statements or hallucinations is the first step to detecting entirely LLM-generated text.