How Wikipedia Is Deploying AI - ONPASSIVE

Wikipedia is the largest online encyclopedia, and it allows anyone to edit its articles, provided the information is legitimate. By crowdsourcing the writing of an encyclopedia, a non-profit website forever changed the way we get our information. It is among the ten most-visited websites on the Internet.

But if anyone can edit Wikipedia, anyone can add false information. And anyone can degrade the site's quality by deliberately adding wrong information.

Aaron Halfaker, a senior research scientist at the Wikimedia Foundation, the organization that oversees Wikipedia, built an AI engine to identify such vandalism and audit every change made on the site.

In a sense, this reduces the work of the volunteer editors who audit Wikipedia's articles. It might seem like it is wiping these editors out, another example of artificial intelligence taking over human jobs. But Wikipedia's AI project is mainly focused on increasing human participation in Wikipedia.


AI-driven Wikis:

It is predicted that AI and robotics could replace as much as 47% of our jobs over the next two decades, though some believe that AI will also create a number of new jobs.

This new AI-driven service on Wikipedia aims to boost participation by making Wikipedia more user-friendly for newer editors. With the help of open-source machine learning libraries such as scikit-learn, and with its code freely accessible to the world at large, the service aims to automatically identify malicious vandalism and set it apart from well-intentioned edits.
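To make the idea concrete, here is a minimal sketch of what an edit-quality classifier built with scikit-learn might look like. The features and toy training data below are invented for illustration; Wikipedia's actual models use a much richer feature set.

```python
# A minimal, hypothetical sketch of an edit-quality classifier in scikit-learn.
# The features and toy training data are illustrative placeholders, not
# Wikipedia's actual ORES feature set.
from sklearn.ensemble import RandomForestClassifier

# Each row describes one edit:
# [chars added, chars removed, share of uppercase chars, longest run without spaces]
X = [
    [1200,   10, 0.02, 12],   # long, well-formed addition
    [   5, 3400, 0.00,  4],   # mass deletion
    [  80,    0, 0.95, 80],   # all-caps keyboard mashing
    [ 300,   40, 0.03, 15],   # routine copyedit
]
y = [0, 1, 1, 0]              # 1 = damaging, 0 = good-faith

model = RandomForestClassifier(random_state=0).fit(X, y)

# predict_proba returns the estimated probability of each class;
# column 1 is the probability that a new edit is damaging.
new_edit = [[60, 0, 0.90, 60]]
print(model.predict_proba(new_edit)[0][1])
```

A classifier like this outputs a probability rather than a hard verdict, which is what lets human editors decide how aggressively to act on its flags.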

With a more nuanced view of new edits, the thinking goes, these algorithms can crack down on vandalism without hampering legitimate participants.

It’s not that Wikipedia needs automated tools to attract more human editors. It’s that Wikipedia needs better automated tools for a better future.

Objective Revision Evaluation Service, an AI-driven tool on Wikipedia:

The software used on Wikipedia is the Objective Revision Evaluation Service (ORES), which Wikipedia editors have helped train to judge the quality of a change made by new editors, based on the language and context of the change.

Every day, about half a million changes are made to Wikipedia. Editors and ordinary readers can now quickly check how likely it is that a proposed change is “damaging”.
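ORES also exposes these scores through a public web API, so a script can ask how likely a specific revision is to be damaging. The sketch below assumes the v3 scores endpoint and uses an arbitrary revision ID; the exact endpoint and response shape may change, so check the current documentation before relying on it.

```python
# A sketch of querying ORES for a "damaging" score on an English Wikipedia
# revision. The v3 endpoint and response shape shown here match ORES's public
# API at the time of writing; the revision ID is a hypothetical example.
import requests

rev_id = 123456789  # hypothetical revision ID
url = f"https://ores.wikimedia.org/v3/scores/enwiki/?models=damaging&revids={rev_id}"

response = requests.get(url, timeout=10)
response.raise_for_status()
data = response.json()

# Navigate the nested response down to the model's probability estimates.
score = data["enwiki"]["scores"][str(rev_id)]["damaging"]["score"]
print("Predicted damaging:", score["prediction"])
print("P(damaging):", score["probability"]["true"])
```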


In the big picture, the new AI algorithms deployed at Wikipedia are rather simple examples of machine learning. But they have proven effective at simplifying the job of many editors.

The AI programs work by analyzing certain words, variations of certain words, and even particular keyboard patterns. For example, they can identify unusually large blocks of characters: a few editors on Wikipedia tend to mash the keyboard and not add spaces between their characters. AI can help catch these mishaps and maintain the quality of the information that is shared with the whole world.
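As a rough illustration of that last pattern, the snippet below flags text containing unusually long runs of characters with no spaces. It is a simplified heuristic of the kind described, not Wikipedia's actual detection code, and the threshold is an arbitrary choice.

```python
# An illustrative heuristic (not Wikipedia's actual code) for one pattern the
# article mentions: unusually long runs of characters with no spaces, which
# often indicate keyboard mashing.
import re

def longest_unspaced_run(text: str) -> int:
    """Length of the longest sequence of non-whitespace characters."""
    runs = re.findall(r"\S+", text)
    return max((len(run) for run in runs), default=0)

def looks_like_mashing(text: str, threshold: int = 40) -> bool:
    # Flag text whose longest unbroken run of characters exceeds the threshold.
    return longest_unspaced_run(text) > threshold

print(looks_like_mashing("A normal, well-spaced sentence."))  # False
print(looks_like_mashing("asdfjkl;" * 10))                    # True
```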

Halfaker, who leads the AI work at Wikipedia, acknowledges that it can't catch every piece of vandalism or every bad change made on Wikipedia. Still, he believes it can catch most, which helps editors a great deal by simplifying their job.
