Wikipedia bots are Internet bots (computer programs) that perform simple, repetitive tasks on Wikipedia.
The first bot appeared in 2002. [1]
Bots on Wikipedia must be approved before activation [2] [3] and must adhere to Wikipedia's bot policy. [4] [5]
Bots have often been used to automate simple and repetitive tasks, such as correcting common misspellings and stylistic issues, or to create articles, such as geography entries, in a standard format from statistical data. [6] [7] [8] There are also bots designed to automatically notify editors when they make common editing errors, such as unmatched quotes or unmatched parentheses. [9]
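As a rough illustration of how such a maintenance task can be automated, the following sketch uses the Pywikibot framework to correct a single common misspelling. It is not the code of any particular bot: the page title and the word pair are hypothetical, the script assumes a configured Pywikibot installation, and any real bot would need approval under the bot policy before saving edits.

```python
# Illustrative sketch of a simple maintenance bot built on the Pywikibot framework.
# Assumes Pywikibot is installed and configured (user-config.py with bot credentials).
import pywikibot

site = pywikibot.Site("en", "wikipedia")        # connect to English Wikipedia
page = pywikibot.Page(site, "Example article")  # hypothetical page title

text = page.text
corrected = text.replace("recieve", "receive")  # fix one common misspelling

if corrected != text:
    page.text = corrected
    # A descriptive edit summary is expected of approved bots.
    page.save(summary="Bot: fixing common misspelling (recieve -> receive)")
```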
One prominent example of an Internet bot used on Wikipedia is Lsjbot, which by 2013 had generated one million short "barebones" articles across various language editions of Wikipedia. [10] [2] [3] Bot-created articles have most often covered technical topics such as space objects and chemical elements. [2] According to Andrew Lih, the expansion of Wikipedia to millions of articles would be difficult to envision without the use of such bots. [11] The Cebuano, Swedish and Waray Wikipedias are known for their high proportions of bot-created content. [12]
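Article-creating bots of this kind generally fill a fixed wikitext template with values drawn from a structured database. The sketch below illustrates that general approach only; it is not Lsjbot's actual code or data format, and the species record and template fields are invented.

```python
# Hypothetical example of generating a short "barebones" article from a data record.
record = {
    "name": "Examplia montana",
    "kingdom": "Animalia",
    "family": "Examplidae",
    "described_by": "Smith",
    "year": 1901,
}

# Fill a fixed wikitext skeleton with the record's fields.
stub = (
    "'''{name}''' is a species in the family {family}. "
    "It was described by {described_by} in {year}.\n\n"
    "{{{{Taxobox|name={name}|kingdom={kingdom}|family={family}}}}}\n"
).format(**record)

print(stub)  # a real bot would instead save this text as a new article
```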
One notable development in the years leading up to 2015 was the use of bots to perform vandalism-fighting chores in place of human labor. According to estimates, bots already eliminate about 50% of all vandalism. Human patrollers have praised the bots' accuracy and speed in a number of remarks posted on their talk pages. [13] Anti-vandalism bots such as ClueBot NG, created in 2010, are programmed to detect and revert vandalism quickly; ClueBot NG does so using an artificial neural network trained on past instances of vandalism. [7] [14] Bots can also flag edits made from particular accounts or IP address ranges, as happened after the downing of Malaysia Airlines Flight 17 (MH17) in July 2014, when edits were reportedly made from IP addresses controlled by the Russian government. [15]
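ClueBot NG itself relies on a trained artificial neural network, but the underlying pattern of scoring an edit and reverting it when the score exceeds a threshold can be illustrated with a much simpler, purely hypothetical rule-based sketch; the word list, features and threshold below are invented for illustration and bear no relation to any real bot's configuration.

```python
# Minimal rule-based sketch of vandalism scoring; real anti-vandalism bots such as
# ClueBot NG instead use machine-learned classifiers trained on labeled past edits.
import re

BAD_WORDS = {"idiot", "stupid"}  # hypothetical word list

def vandalism_score(old_text: str, new_text: str) -> float:
    """Return a crude score in [0, 1]; higher means more likely vandalism."""
    score = 0.0
    # Crude proxy for the text added by the edit.
    added = new_text[len(old_text):] if new_text.startswith(old_text) else new_text
    words = re.findall(r"[a-z']+", added.lower())
    if any(w in BAD_WORDS for w in words):
        score += 0.5                      # offensive words added
    if len(new_text) < 0.2 * len(old_text):
        score += 0.4                      # large removal of existing content
    if re.search(r"(.)\1{9,}", added):
        score += 0.3                      # character spam such as "aaaaaaaaaa"
    return min(score, 1.0)

# An anti-vandalism bot would revert edits whose score exceeds its threshold.
old = "A long, sourced paragraph about the chemistry of noble gases and their uses."
new = "u r stupid"
if vandalism_score(old, new) > 0.6:
    print("revert edit and warn the editor")
```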
On Wikipedia, interactions between bots are typically more reciprocal and prolonged than those between humans. However, bots in different cultural contexts may behave differently, much like people. According to research on Wikipedia bots active from 2001 to 2010, even comparatively "dumb" bots can produce complex interactions, which the authors argue has important consequences for the study of artificial intelligence. [16] [5] Understanding the factors that influence bot-bot interactions can help improve their performance. [17] Many of the conflicts between bots on Wikipedia ended in 2013, after a change to the way inter-language links were handled. [5]
One way to classify bots is by the activities they perform: [18] [1]