You're saying it as if the poor author just had no choice but to let an LLM write their bibliography. To avoid hallucinations, maybe just don't let an LLM write any part of your paper?

You can only get in this situation if you let a bullshit generator write your paper, and the fraud is that you are generating bullshit and calling it a paper. No buts. It's impossible to trigger this accidentally, or without reckless disregard for the truth.

Calling LLMs "bullshit generators" in the year 2026 just shows a lack of seriousness.

Not as much of a lack of seriousness as excusing away hallucinations as not that big of a deal in what's supposed to be a researched, scholarly body of work written by humans.

Not really - much of modern work consists of what David Graeber described as "bullshit jobs". Now AI and its backers are proposing to automate all that bullshit.

[deleted]

And yet people are trying to defend LLM-generated made-up bullshit citations in scientific papers.