Disputes on Wikipedia arise when Wikipedians, the site's volunteer editors, disagree over article content, internal Wikipedia affairs, or alleged misconduct. Disputes often manifest as repeated competing changes to an article, known as "edit wars", in which edits are "reverted" wholesale rather than adjusted incrementally. Disputes may escalate into dispute resolution efforts and enforcement.
Editors are encouraged to discuss disputes on talk pages, but disputes can lead straight to editing bans,[ citation needed ] and some editors simply "walk away" from conflict, especially if they do not know how to defend their edits within Wikipedia's complex systems.
An early but persistent source of conflict is "proprietary editing", where an editor, who may have started an article, will not allow other editors to make changes to their content or language.[ citation needed ] Many current conflicts play out in articles about contentious topics, often with two entrenched opposing sides, that reflect debates and conflicts in society, based on ethnic, political, religious, and scientific differences.
Dispute resolution efforts have shifted over the years. For content disputes in English Wikipedia, as of 2024, editors most often resort to Requests for Comment, along with specialized discussion structures, such as Articles for Deletion. For alleged user misconduct, some Wikipedias rely on Arbitration Committees as the final word.
Disputes, editor behavior, and collaboration on Wikipedia have long been the subject of academic research, especially in the English Wikipedia. A 2023 review identified 217 articles about contributor goals, interactions, and collaboration processes, including 34 studies of "the causes and impact of conflict, the mechanisms for resolving conflict, and the measurement and prediction of conflict or controversial articles." [1] The review examined numerous studies of editor coordination, especially on Talk pages, as well as algorithmic governance using bots to enforce Wikipedia policies. The review found that research attention peaked in 2012, while overall Wikipedia editing peaked in 2007. [1]
As an open collaboration writing project, from the outset Wikipedia expected disagreements among contributors. The point at which disagreements turn into disputes, and conflicts, is not uniformly defined by Wikipedia communities and the scholars who study them.[ citation needed ]
Conflicts over content within articles often arise among editors, which may result in edit wars. [2] : 62 An edit war is a persistent exchange of edits representing conflicting views on a contested article, [2] : 62 [3] [4] or as defined by the website's policy: "when editors who disagree about the content of a page repeatedly override each other's edits." [5] Edit wars are prohibited on Wikipedia [6] : 146 and editors are encouraged to seek consensus through discussion; however, administrative intervention may be applied if discussion fails to resolve the conflict. [7] Generally, edit wars are provoked by the presence of highly controversial content, [3] such as abortion or the Israeli–Palestinian conflict, but can also occur due to other disputed matters, such as the nationality of artist Francis Bacon. [4] According to a 2020 study, the longest edit war sequence, with 105 reverts by 20 users, was a 2008 tug-of-war over the biography of Turkey's first president, Mustafa Kemal Atatürk. [8]
Researchers also designed an analytical platform, titled Contropedia, to observe and measure protracted editing controversies, such as global warming. [9]
Edit wars may be defined and detected in terms of reverts and mutual re-reverts. In 2004, the community instituted the three revert rule, which was examined in subsequent scholarship. [8] The rule reportedly cut reverts in half. [1] To identify editing disputes, scholars have also used the number of article revisions, deletion rates between editors, or a tag placed on controversial articles. [10] For example, up to mid-2020, there were in-depth Talk page arguments over 7,425 instances of a dispute tag. [11] In 2012, Yasseri et al. identified disputes through a pattern recognition algorithm and tested it against human evaluations of articles. By avoiding language-based criteria, they stated that their method "makes possible both inter-cultural comparisons and cross-language checks and validation". [10] In a subsequent 2014 chapter, Yasseri led a different team that identified the most controversial articles in 10 Wikipedias, including Arabic, Hebrew, and Hungarian. [12]
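Revert-based detection of the kind described above is commonly operationalized by hashing revision texts: a revision whose text is identical to an earlier version restores that state, and editor pairs who revert each other are a signal of edit warring. The sketch below illustrates that idea under assumed, simplified data structures (the `text`/`user` field names are illustrative, not drawn from any cited study):

```python
import hashlib

def detect_reverts(revisions):
    """Identify reverts: a revision whose text exactly matches an
    earlier, non-adjacent revision restores that earlier state.
    Returns (reverting index, restored index) pairs."""
    seen = {}      # text hash -> index of first revision with that text
    reverts = []
    for i, rev in enumerate(revisions):
        h = hashlib.md5(rev["text"].encode("utf-8")).hexdigest()
        if h in seen and seen[h] < i - 1:
            # Revision i restores the state of revision seen[h];
            # everything in between was undone.
            reverts.append((i, seen[h]))
        if h not in seen:
            seen[h] = i
    return reverts

def mutual_reverts(revisions, reverts):
    """Collect mutually reverting editor pairs (A reverts B and
    B reverts A), a common edit-war signal."""
    pairs = set()
    for i, j in reverts:
        reverter = revisions[i]["user"]
        # Editors whose intermediate changes were undone.
        for rev in revisions[j + 1:i]:
            if rev["user"] != reverter:
                pairs.add((reverter, rev["user"]))
    return {frozenset(p) for p in pairs if (p[1], p[0]) in pairs}
```

For an alternating tug-of-war between two users, every third revision registers as a revert and the pair shows up as mutually reverting.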
Later research has used other methods, even absent reverts and deletion patterns. A 2021 study claimed 80% accuracy in identifying "conflict-prone discussions" by their structural features, such as back-and-forth commenting between two editors (an ABA pattern) before any contribution by a third person. [13] Other features include phrases and pronoun usage that mark the level of politeness or collaboration. [11] De Kock and Vlachos classified disputes with a natural language processing (NLP) model that improved on feature-based models. [11]
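The ABA structural feature mentioned above can be computed directly from the ordered sequence of comment authors in a discussion thread. The following is a minimal sketch under an assumed input format (a list of author names), not the cited study's implementation:

```python
def aba_before_third(authors):
    """Return True if the comment sequence contains a back-and-forth
    A-B-A exchange between the first two participants before any
    third editor has commented -- one structural feature associated
    with conflict-prone discussions."""
    participants = []
    for i, a in enumerate(authors):
        if a not in participants:
            participants.append(a)
        if len(participants) == 3:
            break  # a third voice joined before any A-B-A exchange
        if i >= 2 and authors[i] == authors[i - 2] != authors[i - 1]:
            return True
    return False
```

A thread opening A, B, A exhibits the pattern even if a third editor comments later, whereas a thread where a third editor arrives first does not.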
Many disputes center on the deletion of written content, which can be seen as a kind of gatekeeping. [14] In a comparative study of such network gatekeeping on French and Spanish decolonization cases, researchers found that more active editors experience fewer deletions and appear to function within rival camps. [14]
Disputes are widely seen as a drain on the Wikipedia community, without adding to useful knowledge, [15] and as creating a competitive [16] and conflict-based culture associated with conventional masculine gender roles. [17] [18] Research has focused on the impoliteness of disputes, which can harm personal identities, "violate boundaries", [19] and diminish voluntarism. [20] Entrenched editor conflicts are said to detract from the quality and purported neutrality of Wikipedia articles. [21] [22]
Occasionally, a behind-the-scenes dispute will garner negative media attention as a Wikipedia controversy. For example, after the 2019 ban of a user by the Wikimedia Foundation, media stories covered the internal debate and the resignation of 21 administrators from English Wikipedia. [23] [24] Nonetheless, adversarial editing has been defended by Wikipedia leadership as important for collaboration [25] and scholars have argued that well-managed friction among editors can benefit the encyclopedia. [1] Controversial topics may also attract editors, as found by a 2017 lab experiment with people exposed to German Wikipedia. [26]
With civility as a core principle of Wikipedia, user disputes often feature impoliteness. According to a study of disputes on 120 Talk pages, by and large "Wikipedians do not prolong the conflicts." The most common form of incivility is scorn, ridicule, or condescension, followed by "pointed criticism". Two-fifths of the time, impolite comments received no response at all. Regardless of the topic area, overt responses were divided: 37 percent of responses to rude conduct were defensive, such as explaining oneself or asking the critic to clarify their concern, while 53.5 percent were offensive. [19] According to a similar study, personal attacks were immediately reciprocated 26% of the time. [27]
Editors use a range of rebuttal tactics, ranging from insults to derailing to counterargument and refutation. Higher quality rebuttals "correlate to more constructive outcomes". [27] Coordination tactics include asking questions, providing information, supplying context, offering a compromise, conceding or admitting lack of knowledge. [27] Deferential wording reduces conflict, such as the phrase "by the way" or hedging to signal an openness to compromise. [11]
During editing disputes, Wikipedians have been found to adopt five conversational roles: architect (of the discussion structure), content expert, moderator, policy wonk, and wordsmith. The edit-focused roles, of expert and wordsmith, tended to be more successful than the conceptual, organizational roles, such as policy wonk. [28] Indeed, when editors bring up Wikipedia policies during a general content dispute, "wiki-lawyering", they tend to escalate the editorial conflict. [11] Still, researchers found that citing Wikipedia policy, such as Notability, does help settle disputes over the deletion of articles. [29]
Editing disputes may go through stages or a life cycle, as David Moats showed for the use of sources in the early days of writing about the Fukushima nuclear accident. [30]
Disagreements over the deletion of articles, and other types of encyclopedic content (e.g., categories and lists), are managed through discussion structures. Notably, the English Wikipedia has had more than 400,000 Articles for Deletion (AfD) discussions since 2004, though the rate of AfD submissions has declined since Wikipedia article creation was restricted in 2017. [31] [29]
As of 2018, roughly 64 percent of debates ended in deletion and 24 percent in keeping the article, a keep rate much lower than in the early years of Wikipedia. Nearly all discussions are "closed" by a Wikipedian administrator. In 2019, researchers Mayfield and Black created an NLP model to forecast AfD outcomes. [29] Consistent with previous research, they found that the first "vote" (i.e., comment) can generate a "herd effect" and predicts outcomes at rates 20 percent or more above the baseline. [29]
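The notion of a first-comment "herd effect" can be made concrete with a toy comparison: a majority-class baseline versus a predictor that simply follows the first comment's stance. This is an illustrative sketch with a hypothetical data format, not the cited NLP model:

```python
from collections import Counter

def first_vote_accuracy(discussions):
    """Compare two naive AfD-outcome predictors on labeled
    discussions. Each discussion is assumed to carry a list of
    stances (e.g., "delete"/"keep") and a final outcome.
    Returns (majority-baseline accuracy, first-vote accuracy)."""
    outcomes = [d["outcome"] for d in discussions]
    # Baseline: always predict the most common outcome overall.
    majority = Counter(outcomes).most_common(1)[0][0]
    base = sum(o == majority for o in outcomes) / len(outcomes)
    # Heuristic: predict whatever the first comment advocated.
    first = sum(d["stances"][0] == d["outcome"]
                for d in discussions) / len(discussions)
    return base, first
```

A first-vote heuristic that scores well above the majority baseline on such data is consistent with early comments anchoring the eventual close.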
Deletion disputes vary among the language Wikipedias. In English Wikipedia, about 20 percent of AfD comments justify their stance with a policy, compared to less than 3 percent in German and Turkish Wikipedia. [32] Long-time Wikipedians play an outsized role in deletion disputes. Although over 160,000 users had spoken up in AfD discussions, over half the debate comments were made by only 1,218 users. This dominance of veteran editors has increased over time. [29]
In English and several other Wikipedias, an Arbitration Committee (ArbCom) handles a variety of intractable disputes, including conflicts among users who edit multiple articles within a topic. The Committee itself designates such a situation as a "contentious topic", and its sanctions may apply expansively to all articles within the topic. Disputes within contentious topics are a distinct area of research, some based on ArbCom cases and others on quantifiable variables. [21]
Some topics appear to be unavoidably polarizing, such as abortion and climate change, although the level of editor conflict may not match the degree of public debate. In addition, a topic may be contentious in one language Wikipedia and not another. [1] A 2014 study identified Israel, Adolf Hitler, The Holocaust, and God as the most hotly debated articles across 10 languages. [33] [12]
Editors have been found to line up in rival camps over contentious articles and topics. [14] [21] [34] It is unclear how much such editors coordinate outside of the Wikipedia platform, contrary to Wikipedia policy. Apparent editor coordination can be detected through discourse analysis, such as the 2020 study of 1,206 contentious articles that found "contentious Wikipedia articles seem to clearly partition others into friends (those who have the same opinion on a given topic) and enemies." [34] At the same time, Wikipedians can enhance their reputations with successful editing, which can draw other editors toward like-minded approaches to a contentious topic. The most reputable editors tend to write lasting content and are less involved in disputes. [34]
In an analysis of 5,414 editor profiles, two types of rival camps were discerned: those whose viewpoints tended to be subsumed and those that tended to be maintained. Those found to "win" an edit war were more likely to ban opposing editors, revert edits, remove competing wikilinks, cite Wikipedia policies, show disrespect, be active in ArbCom proceedings, and especially exert control over cited references. Researchers expressed surprise that Wikipedia policies, designed to ensure balanced viewpoints, were instead leveraged to favor one point-of-view in contentious articles. [21]
Looking at two contentious topics in French Wikipedia, Shroud of Turin and Sigmund Freud, researchers noticed a shift in focus from the editors' conflicting opinions to their disagreements over encyclopedic sources (e.g., whether they are scientific) and fellow editors (e.g., whether they had read the sources). Editors argued in adversarial, not collaborative, ways because of personal, non-encyclopedic goals beyond Wikipedia, such as religious commitments. With Freud, the split among editors could be explained in terms of their competing epistemologies. However, the Shroud of Turin article was vulnerable to the meta-fallacy of bothsideism, according to the case study authors, because the "tenacity" of religious Wikipedians "might simply aim to enable other believers to continue to do so, by illustrating possible lines of argumentative defense, that indeed seem unending". [35]
In a case study of two post-colonial topics, Algeria vs. France, and Gran Colombia vs. Spain, scholars found that the most active, presumably reputable, editors suffered the fewest deletions of their writing. Moreover, evidence suggested that fewer deletions were made by those who make use of Talk pages, as recommended by Wikipedia policy. The two ingroups with the most Wikipedians — France and Gran Colombia — were more likely to delete contributions by their presumed opposition — from Algeria and Spain, respectively. [14]
Soon after its founding, Wikipedia provided avenues to resolve content and conduct disputes. Just as editing disputes are difficult to define precisely, scholars have disagreed about identifying when disputes are resolved. Yasseri et al. categorized articles into three levels of disputation: Consensus, "Sequence of temporary consensuses", and "Never-ending wars". [10]
For content disagreements, Wikipedia has experimented with a variety of mechanisms. Experienced editors have been found to reduce reverts by citing Wikipedia policies, especially "Neutral point of view" (NPOV), "Consensus", and "No original research". Editing disagreements may be resolved by argumentation, compromise, and explaining previous discussions. [1] As of 2024, editors may pursue dispute resolution by requesting a third party opinion, an informal arrangement intended for two editors in disagreement.
If their dispute remains unresolved, another recourse is the Dispute Resolution Noticeboard (DRN). [36] The DRN approach does not offer formal closure or a binding compromise, and many cases are rejected because participants have not first pursued other avenues, so it has become less useful. [37] Of 2,520 DRN cases through mid-2020, there were 237 successful resolutions, 149 failures, and 2,134 (85%) closed without a result. [11]
Moreover, editors may submit content disagreements to the Requests for Comment (RfC) system. These requests, circulated to uninvolved editors by a bot, benefit from the RfC's distinctive structure and the imposition of a 30-day deadline. [37] Over a seven-year period, the English Wikipedia had more than 7,300 requests for comment discussions. RfC discussions are often closed with a Wikipedia-style "consensus" on the content dispute. However, a significant number "go stale" because they are ignored by veteran editors or, conversely, are overwhelmed with comments and too complex or controversial to be closed. [36]
In the past, editors in unresolved content disputes could file for formal mediation by a Mediation Committee, which was discontinued due to inactivity in 2018. Dispute resolution was also provided by informal groups such as the "Mediation Cabal". A 2010 study, cited by Ren et al., found that "mediators can alter the text discussion between conflicting editors (e.g., by striking through some statements), clarify ambiguity, differentiate between personal and substantive arguments, and show the editors how their exchanges could be made more constructive. They can also help manage temporal discontinuities (i.e., when one party is unavailable, the other party may make misattributions), and reduce power differences among editors." [1]
For user conduct issues, in 2003, Jimmy Wales created the Arbitration Committee (ArbCom), an overarching authority for binding resolution of conduct disputes. ArbCom cases follow a formal structure, though the committee tends to be flexible and informal as it works toward decisions. More than 500 complaints were submitted to ArbCom from 2004 through 2020. ArbCom examines evidence of misconduct, but its decisions have been criticized for favoring the more socially effective parties. [38]
One of the first large-scale disputes about Wikipedia was an internal argument over advertising, beginning with Larry Sanger and dissent by Spanish editors, which led to a 2002 fork of the Spanish Wikipedia. [39] Edit warring gave rise to the rule barring more than three reverts of a page by the same editor. [1] [10] [37] In 2005–2006, Wikipedians debated whether to display controversial images from the Jyllands-Posten Muhammad cartoons. [39] On internal matters, early disputes included the 2006 userbox controversy, which was resolved partly by placing templates in personal user pages [40] and partly by administrative actions by Jimmy Wales. [37]
Meanwhile, in its first decade, Wikipedia set up dispute resolution mechanisms, including the Arbitration Committee, and refined policies to govern and reduce disputes. In its second decade, the Wikimedia Foundation funded and tracked research on disputes. Some Wikipedia dispute resolution efforts were disbanded. [37] A Universal Code of Conduct for all Wikipedia organizations is designed to restrain the most egregious actions, some of which may arise from editing disputes. [41]