
Commentary

Commentary: Where does Wikipedia go in the age of AI?

Crowdsourced, openly edited and free, the site must seize the new technology as an opportunity, says the Financial Times' Elaine Moore.
File photo of a Wikipedia webpage in use on a laptop computer. (Photo: Reuters/Gary Cameron)

LONDON: On the second Sunday of every month, you will find a small group of Wikipedia enthusiasts in a pub near London’s Fleet Street discussing the most wildly obscure facts. Armed with flasks of coffee, laptops and the belief that knowledge should be freely shared, they form a volunteer bastion against the twin internet evils of misinformation and artificial intelligence slop.

On a recent Sunday, 15 people showed up, including three women (“more than usual to be honest”, murmurs one). Everyone here has their own specialist interest (cotton mills in Lancashire, say, or the 19th-century newspaper launched by Benjamin Disraeli), something that got them hooked on creating or correcting Wikipedia entries. It is, they say, addictive to see your work read by millions of people. Still, it can be a bit lonely, so the meetups are important.

Wikipedia has always been a crowdsourced project. Created in 2001 by Jimmy Wales and Larry Sanger, the online encyclopedia is now a living relic of Gen X’s version of the internet: text-heavy, cookie-less, largely anonymous and advert-free. Anyone can create a Wikipedia article and anyone else can change it. No matter how fierce political division and online arguments become, consensus must be reached through debate. It remains one of the 10 most popular websites.

Over the past three years, however, Wikipedia has taken on a new role, acting as the feeding ground for generative AI models. Information curated by hand has been scraped, absorbed and regurgitated into chatbot summaries. Human traffic to the site is falling, though bot traffic is up. As if that wasn’t enough, Elon Musk has taken up arms against what he regards as Wikipedia’s liberal bias, vowing to launch a rival called Grokipedia. 

FIND A REFERENCE

You might think this would all be unsettling to people who have dedicated so much of their free time to building the site. And yet everyone I speak to seems to display a scholarly sense of serenity about its future. 

“If AI is the best way to spread accurate information around the world, then that’s what should be used,” says Mike Peel, a softly spoken radio astronomer in a grey “Wikimedia” T-shirt. Peel, who is also one of Wikipedia’s most prolific editors and sits on its foundation’s board of (unpaid) trustees, thinks AI won’t obliterate the site but might change the way people use it. “We have to look at it philosophically,” he says. “Maybe our role is going to move from content creation to accuracy.”

Accuracy wasn’t always Wikipedia’s calling card. Open editing can result in errors and early, high-profile hoax entries attracted negative attention. The site’s answer was to impose strict requirements for attribution — raising the bar for factual reliability.

“Things can get obnoxious online,” says Charles Matthews, a former Cambridge University maths professor who has been a Wikipedia editor since 2003. “So we put in place tough content policies. You want us to believe something? Find a reference.”

Matthews, who has retreated to a corner to help a new editor set up an entry, says source requirements also tend to reduce AI-generated content and subdue arguments. Wikipedia pages on the most contentious subjects involve in-depth, citation-heavy debates. One pub-goer who was blocked when an administrator opposed his edits says he has no hard feelings about it. He understands that the administrator (also a volunteer) was trying to guide the edits towards neutrality. (“Besides,” he adds, “it was English Wikipedia, I can still edit the German one”; there are more than 300 versions.)

NOT AN EXTINCTION THREAT

AI companies could save themselves a lot of trouble if they followed Wikipedia’s example. Earlier this month, OpenAI proudly claimed that it had succeeded in solving a difficult mathematical problem, only to be informed that the answer had been scraped from an online source without attribution. (“Hoisted by their own GPTards” as Meta’s chief AI scientist Yann LeCun put it).

Instead of viewing AI as an extinction threat, Wikipedia volunteers point out how useful it can be — they already use a machine-learning tool to spot editing vandalism. While Wikipedia can’t replicate the content deal Reddit made with AI companies because its license allows free distribution, it has set up paid subscriptions for instant access to new entries. Google is already a subscriber.

That should help keep the site ticking over. Besides, thanks to the volunteers, it’s not that expensive to run: reported operating expenses last year were US$178 million, with infrastructure costs including servers.

Maybe, says Peel, AI will even enhance Wikipedia’s value. In the age of artificial content, human-made work deserves a premium.

Source: Financial Times/aj