
"As the free online encyclopedia Wikipedia marks its 25th anniversary this month, the academic community must confront an uncomfortable truth: we have systematically failed our greatest knowledge commons. Despite a 2005 Nature investigation showing that Wikipedia's accuracy was comparable to Encyclopaedia Britannica's (see Nature 438, 900-901; 2005) - and years of follow-up research confirming that its specialist articles in areas such as health and psychology are often reasonable alternatives to professional sources - academia still treats Wikipedia with unwarranted scepticism."
"However, generative-artificial-intelligence systems trained heavily on Wikipedia are now threatening the future of this free, volunteer-driven resource. The stakes have changed - and academics must take note. Large language models offer instant, Wikipedia-derived answers without any attribution. When AI chatbots provide seemingly authoritative responses drawn from Wikipedia's very pages, why would anyone navigate to the source, let alone contribute to it?"
Academia has largely neglected Wikipedia despite evidence that many articles match professional sources in accuracy, including a 2005 Nature comparison with Encyclopaedia Britannica and follow-up research showing solid specialist coverage in areas such as health and psychology. Generative-AI systems trained heavily on Wikipedia now offer instant, unattributed answers, reducing incentives to visit or contribute to the volunteer-driven encyclopedia. That parasitic dynamic threatens a freely accessible, human-curated knowledge commons whose value rests on radical transparency: every edit is logged and every discussion archived. Human editors contribute work that AI cannot reliably replace, such as discovering archival materials, documenting under-covered subjects, and exercising editorial judgment. Sustaining Wikipedia will require active academic support and institutional recognition.
Read at Nature