“Never trust everything you read on the Internet,” Abraham Lincoln once said. Nowhere is that more true than on Wikipedia, which relies on thousands of volunteers to monitor and edit its content. And they’ve now got a new tool to help combat vandalism.
The Wikimedia Foundation, which oversees Wikipedia, has announced a new artificial intelligence service called the Objective Revision Evaluation Service (ORES) to help editors spot damaging alterations. The service, which is free for anyone to use, relies on an algorithm to work out whether an edit is potentially “damaging” or not.
"This service empowers Wikipedia editors by helping them discover damaging edits and can be used to immediately 'score' the quality of any Wikipedia article," a blog from Wikimedia states. "We’ve made this artificial intelligence available as an open web service that anyone can use."
The algorithm – exposed as a web service that editors and tools can query – works by training models on quality assessments made by human Wikipedians and using them to generate scores for edits. It can return a score for an edit within about 100 milliseconds, and it is available to use right now, following months of testing.
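As a rough illustration of how a tool might consume the service: ORES is queried over HTTP with a wiki name, a revision ID, and a model name (such as "damaging"), and it answers with a JSON score. The sketch below builds such a request URL and parses a response; the revision ID and the exact JSON shape shown are assumptions for illustration, not taken from the article.

```python
import json

# Hypothetical revision ID for illustration only.
REV_ID = 34854345
WIKI = "enwiki"
MODEL = "damaging"

# URL pattern for the ORES scoring endpoint (wiki / revision / model).
url = f"https://ores.wikimedia.org/v3/scores/{WIKI}/{REV_ID}/{MODEL}"

# A made-up sample response, standing in for what the live service would
# return; the real payload may differ in structure.
sample_response = json.loads("""
{
  "enwiki": {
    "scores": {
      "34854345": {
        "damaging": {
          "score": {
            "prediction": false,
            "probability": {"false": 0.92, "true": 0.08}
          }
        }
      }
    }
  }
}
""")

# Drill down to the score for our revision: a boolean prediction plus
# the model's probability that the edit is damaging.
score = sample_response[WIKI]["scores"][str(REV_ID)][MODEL]["score"]
is_damaging = score["prediction"]
damaging_probability = score["probability"]["true"]

print(url)
print(f"damaging: {is_damaging} (p={damaging_probability})")
```

An anti-vandalism tool could use the probability to triage edits, surfacing only the likeliest vandalism for human review rather than asking editors to inspect every change.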
This also helps to address the issue of new editors feeling shunned. While anyone is free to edit Wikipedia, changes (especially those on high-profile pages) must be vetted by others and are often rejected before they can be added. This new algorithm will more quickly assess the usefulness of a new edit, hopefully encouraging more people to get involved.
"Our hope is that ORES will enable critical advancements in how we do quality control – changes that will both make quality control work more efficient and make Wikipedia a more welcoming place for new editors," the blog added.
Wikipedia has been the target of both serious and light-hearted vandalism in the past, but hopefully this new tool will help to alleviate some of the problems plaguing the site, and prevent so many articles from having to be locked against new edits.