The ORES artificial intelligence engine will help rid Wikipedia of 'rogue editors'

Wikipedia has rolled out an artificial intelligence (AI) service capable of identifying damaging edits to the online encyclopedia. The Objective Revision Evaluation Service (ORES) will be used to monitor the up to 500,000 edits made to Wikipedia articles every day, flagging those judged likely to be vandalism.
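
ORES exposes these scores through a public web API, so anyone can ask how likely a given edit is to be damaging. The snippet below is a minimal sketch, assuming the ores.wikimedia.org v3 endpoint and its JSON response layout; the revision ID is a hypothetical placeholder used purely for illustration.

```python
import requests

# Hypothetical revision ID, for illustration only.
REV_ID = 71076450

# Ask ORES to score one English Wikipedia revision with the "damaging" model.
# Endpoint and response shape assumed from the public v3 API.
url = f"https://ores.wikimedia.org/v3/scores/enwiki/?models=damaging&revids={REV_ID}"
response = requests.get(url, timeout=10)
response.raise_for_status()

data = response.json()
score = data["enwiki"]["scores"][str(REV_ID)]["damaging"]["score"]

# The model returns a boolean prediction plus class probabilities.
print("Predicted damaging:", score["prediction"])
print("P(damaging):", score["probability"]["true"])
```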

The AI engine is designed to improve the quality of Wikipedia, while at the same time encouraging more people to volunteer as editors.

"The Objective Revision Evaluation Service (ORES) functions like a pair of X-ray specs, the toy hyped in novelty shops and the back of comic books - but these specs actually work to highlight potentially damaging edits for editors," Aaron Halfaker, a senior research scientist at the Wikimedia Foundation, wrote in a blogpost explaining the technology.

"This allows editors to triage them from the torrent of new edits and review them with increased scrutiny."

'Rogue editors' and blackmail

Because anyone can edit Wikipedia, malicious edits made for the purpose of propaganda, promotion, or even blackmail can be an issue.

Earlier this year, Wikipedia took action against a group of "rogue editors" who targeted UK businesses and celebrities in a blackmail scam. Using a network of fake accounts, the criminals demanded money from victims to stop damaging content from appearing on their Wikipedia pages.

"By combining open data and open source machine learning algorithms, our goal is to make quality control in Wikipedia more transparent, auditable, and easy to experiment with," Halfaker said.

"Our hope is that ORES will enable critical advancements in how we do quality control - changes that will both make quality control work more efficient and make Wikipedia a more welcoming place for new editors."