Isn't the fact that there was controversy about these, rather than blind acceptance, evidence that Wikipedia self-corrects?
If you see something wrong in Wikipedia, you can correct it, though you may find yourself in a protracted edit war. There is bias, but it's the bias of the anglosphere.
And if it's a hot or sensitive topic, you can bet the article will have lots of eyeballs on it, contesting every claim.
With LLMs, nothing is transparent and you have no way of correcting their biases.
- if it can survive five years, then it can pretty much survive indefinitely
- beyond blatant falsehoods, there are many other issues that don't self-correct (see the link I shared for details)
I think only very obscure articles can harbor errors for that long, simply because not enough people care about them to watch and review them. The reliability of Wikipedia is inversely proportional to the obscurity of the subject: you should be relatively safe if it's a dry but popular topic (e.g. science), wary if it's a hot topic (politics, though those tend to have lots of eyeballs, so truly outrageous falsehoods are unlikely), and you simply shouldn't consider it reliable for obscure topics. There will be outliers and exceptions, because this is the real world.
In this regard, it's no different from a print encyclopedia, except revisions come sooner.
It's not perfect and it does have biases, but again these seem to reflect societal biases (of those who speak English, are literate and fluent with computers, and are "extremely online" enough to spend time editing Wikipedia). I've come to accept that English Wikipedia's biases are not my own, and I mentally adjust for this in any article I read.
I think this is markedly different from LLMs and their training datasets. There, obscurity and hidden, unpredictable mechanisms are the rule, not the exception.
Edit: to be clear, I'm not arguing there are no controversies about Wikipedia. I know there are cliques that police the wiki and enforce their points of view, exploiting their knowledge of the in-group rules and colluding to drive away dissenters. Oh well, such is the nature of human groups.
Again, read what Larry Sanger wrote, and pay attention to the examples.
I've read Sanger's article, and in fact I acknowledge what he calls systemic bias; I also mentioned hidden cliques in my earlier comment, which are unfortunately a fact of human society. I think Wikipedia's consensus does represent the nonextremist consensus of English-speaking, extremely online people, and I'm fine with sidelining extremist beliefs.
I think Sanger's other opinions re: neutrality, public voting on articles, etc., are debatable to say the least. I don't believe people voting on articles would mean anything beyond what Facebook likes mean, so I wonder what Sanger is actually proposing here; true neutrality is impossible in any encyclopedia, and presenting every viewpoint as equally valid is a fool's errand and fundamentally misguided.
But let's not drag this debate out: LLMs are fundamentally more obscure and opaque than Wikipedia is.