> This has always been a rampant problem on Wikipedia. I can't find any indicator that it has increased recently, because they're only investigating articles already flagged as potentially AI-generated. So what's the control baseline rate here?

...y'know, I don't want to be that guy, but this actually seems like something AI could check for and then flag for human review.

You don't need an LLM to find loads of uncorroborated claims on Wikipedia. See e.g. https://en.wikipedia.org/wiki/Variational_autoencoder. Most articles about tech are woefully undersourced.
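Even a crude heuristic gets you a long way. A minimal sketch of the idea (the function name, length threshold, and sample text are all made up for illustration, not any actual Wikipedia tooling): flag any substantial paragraph of wikitext that contains no `<ref>` tag.

```python
import re

def flag_unsourced_paragraphs(wikitext: str, min_len: int = 200):
    """Return paragraphs of at least min_len characters with no <ref> tag.

    Crude heuristic: split on blank lines; skip headings, templates,
    tables, and category links, which legitimately carry no citations.
    """
    flagged = []
    for para in re.split(r"\n\s*\n", wikitext):
        p = para.strip()
        if not p or p.startswith(("=", "{", "|", "[[Category:")):
            continue  # not article prose
        if len(p) >= min_len and "<ref" not in p:
            flagged.append(p)
    return flagged

# Hypothetical wikitext snippet for demonstration.
sample = """== Overview ==

A variational autoencoder learns a latent representation by maximizing
a lower bound on the data likelihood. This claim has no citation at all
and runs on long enough to look like substantive article prose.

The reparameterization trick was introduced by Kingma and Welling.<ref>
Kingma, D. P.; Welling, M. (2013). Auto-Encoding Variational Bayes.</ref>
"""

for p in flag_unsourced_paragraphs(sample):
    print("UNSOURCED:", p[:60], "...")
```

This obviously misses inline attribution, shared refs at the end of a section, and so on, but as a first-pass filter feeding human review it's the kind of thing that doesn't need an LLM at all.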