Hacker News
nohren | 4 hours ago
I wonder if checking for false statements or hallucinations is the first step toward detecting entirely LLM-generated content.