Moderation system

[Image: Comment moderation on a GitHub discussion, where a user called Mallory has deleted several comments before closing the discussion and locking it]

On Internet websites that invite users to post comments, a moderation system is the method the webmaster chooses to sort contributions that are irrelevant, obscene, illegal, harmful, or insulting from those that are useful or informative. The purpose of content moderation is to remove problematic content or apply a warning label to it, or to allow users to block and filter content themselves. [1]

Various types of Internet sites permit user-generated content such as comments, including Internet forums, blogs, and news sites powered by scripts such as phpBB and PHP-Nuke, as well as wikis. Depending on the site's content and intended audience, the webmaster decides what kinds of user comments are appropriate, then delegates the responsibility of sifting through comments to lesser moderators. Most often, webmasters attempt to eliminate trolling, spamming, or flaming, although this varies widely from site to site.

Major platforms use a combination of algorithmic tools, user reporting and human review. [1] Social media sites may also employ content moderators to manually inspect or remove content flagged for hate speech or other objectionable content. Other content issues include revenge porn, graphic content, child abuse and propaganda. [1] Some websites must also make their content hospitable to advertisements. [1]
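This combination can be illustrated schematically. The following Python sketch is illustrative only: the threshold values, report count, and function names are assumptions, not any platform's actual system. It shows how an automated classifier score and accumulated user reports might jointly route a post to removal, human review, or no action:

```python
# Hypothetical sketch of a hybrid moderation pipeline; thresholds and
# names are illustrative assumptions, not any platform's real values.

AUTO_REMOVE_THRESHOLD = 0.95  # assumed cutoff for near-certain violations
REVIEW_THRESHOLD = 0.60       # assumed cutoff for uncertain cases
REPORTS_FOR_REVIEW = 3        # assumed report count that triggers review

def triage(classifier_score: float, report_count: int) -> str:
    """Route one post: automatic removal, human review, or no action."""
    if classifier_score >= AUTO_REMOVE_THRESHOLD:
        return "remove"        # the algorithmic tool acts alone
    if classifier_score >= REVIEW_THRESHOLD or report_count >= REPORTS_FOR_REVIEW:
        return "human review"  # user reporting feeds human review
    return "keep"

print(triage(0.97, 0))  # -> remove
print(triage(0.40, 5))  # -> human review
```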

Supervisor moderation

Also known as unilateral moderation, this kind of moderation system is often seen on Internet forums. A group of people are chosen by the webmaster (usually on a long-term basis) to act as delegates, enforcing the community rules on the webmaster's behalf. These moderators are given special privileges to delete or edit others' contributions and/or exclude people based on their e-mail address or IP address, and generally attempt to remove negative contributions throughout the community. They act as an invisible backbone, underpinning the social web in a crucial but undervalued role. [2]
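A minimal sketch of this arrangement, assuming a simplified in-memory model rather than any real forum's implementation, might look as follows:

```python
# Minimal sketch of supervisor (unilateral) moderation; real forum
# software keeps these records in a database, not in-memory sets.

banned_emails = {"troll@example.com"}  # exclusions by e-mail address
banned_ips = {"203.0.113.7"}           # exclusions by IP address
moderators = {"alice"}                 # delegates chosen by the webmaster

def may_post(email: str, ip: str) -> bool:
    """A banned e-mail address or IP address blocks contribution."""
    return email not in banned_emails and ip not in banned_ips

def delete_post(actor: str, posts: dict[int, str], post_id: int) -> bool:
    """Only moderators hold the privilege to remove others' contributions."""
    if actor in moderators and post_id in posts:
        del posts[post_id]
        return True
    return False
```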

In the case of Facebook, the company increased its number of content moderators from 4,500 to 7,500 in 2017 due to legal and other controversies. In Germany, Facebook is responsible for removing hate speech within 24 hours of its posting. [3]

Social media site Twitter has a suspension policy. Between August 2015 and December 2017 it suspended over 1.2 million accounts for terrorist content in an effort to reduce the number of followers and amount of content associated with the Islamic State. [4]

Commercial content moderation (CCM)

Commercial Content Moderation is a term coined by Sarah T. Roberts to describe the practice of "monitoring and vetting user-generated content (UGC) for social media platforms of all types, in order to ensure that the content complies with legal and regulatory exigencies, site/community guidelines, user agreements, and that it falls within norms of taste and acceptability for that site and its cultural context." [5]

While at one time this work may have been done by volunteers within the online community, for commercial websites it is largely achieved by outsourcing the task to specialized companies, often in low-wage areas such as India and the Philippines. Outsourcing of content moderation jobs grew as a result of the social media boom: with the overwhelming growth of users and UGC, companies needed many more employees to moderate the content. In the late 1980s and early 1990s, tech companies had begun outsourcing jobs to foreign countries whose educated workforces were willing to work for lower wages. [6]

Employees work by viewing, assessing and deleting disturbing content, and may suffer psychological damage. [7] [8] [9] [10] [2] [11] Secondary trauma may arise, with symptoms similar to PTSD. [12] Some large companies such as Facebook offer psychological support [12] and increasingly rely on the use of Artificial Intelligence (AI) to sort out the most graphic and inappropriate content, but critics claim that it is insufficient. [13] [14]

Facebook

Facebook has decided to create an oversight board to decide what content remains and what content is removed. The idea was proposed in late 2018. This "Supreme Court" at Facebook is intended to replace ad hoc decision-making. [14]

Distributed moderation

Distributed moderation comes in two types: user moderation and spontaneous moderation.

User moderation

User moderation allows any user to moderate any other user's contributions. Billions of people are currently making decisions on what to share, forward or give visibility to on a daily basis. [15] On a large site with a sufficiently large active population, this usually works well, since relatively small numbers of troublemakers are screened out by the votes of the rest of the community. Strictly speaking, wikis such as Wikipedia are the ultimate in user moderation,[citation needed] but in the context of Internet forums, the definitive example of a user moderation system is Slashdot.

For example, each moderator is given a limited number of "mod points," each of which can be used to moderate an individual comment up or down by one point. Comments thus accumulate a score, which is additionally bounded to the range of -1 to 5 points. When viewing the site, a threshold can be chosen from the same scale, and only posts meeting or exceeding that threshold will be displayed. This system is further refined by the concept of karma: the ratings assigned to a user's previous contributions can bias the initial rating of his or her new contributions.
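A minimal sketch of this scheme follows; the bounds and threshold behaviour track the description above, but the karma-to-initial-score mapping is an assumption, not Slashdot's actual formula:

```python
# Sketch of a Slashdot-style user moderation scheme. The score bounds
# and threshold filter follow the text; initial_score() is an assumed
# mapping from karma to a starting score, not Slashdot's real formula.

MIN_SCORE, MAX_SCORE = -1, 5  # comment scores are bounded to this range

def initial_score(karma: int) -> int:
    """Positive karma biases a new comment's starting score upward."""
    if karma > 0:
        return 2
    if karma < 0:
        return 0
    return 1

def moderate(score: int, direction: int) -> int:
    """Spend one mod point to move a comment up (+1) or down (-1)."""
    return max(MIN_SCORE, min(MAX_SCORE, score + direction))

def visible(scores: list[int], threshold: int) -> list[int]:
    """Readers choose a threshold; only comments at or above it are shown."""
    return [s for s in scores if s >= threshold]

print(visible([moderate(1, +1), moderate(0, -1), 4], threshold=1))  # -> [2, 4]
```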

On sufficiently specialized websites, user moderation will often lead to groupthink, in which any opinion in disagreement with the website's established principles (no matter how sound or well-phrased) will very likely be "modded down" and censored, leading to the perpetuation of the groupthink mentality. This is often confused with trolling.[citation needed]

User moderation can also be characterized by reactive moderation. This type of moderation depends on the users of a platform or site to report content that is inappropriate or breaches community standards. In this process, when users encounter an image or video they deem unfit, they can click the report button. The complaint is filed and queued for moderators to review. [16]
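A minimal sketch of such a report queue, with illustrative names and assuming complaints are reviewed in arrival order, might be:

```python
# Minimal sketch of reactive moderation: complaints filed by users are
# queued in arrival order for moderators. All names are illustrative.

from collections import deque
from typing import Optional, Tuple

report_queue: deque = deque()

def report(post_id: int, reason: str) -> None:
    """Called when a user clicks the report button on unfit content."""
    report_queue.append((post_id, reason))

def next_complaint() -> Optional[Tuple[int, str]]:
    """A moderator takes the oldest unreviewed complaint, if any."""
    return report_queue.popleft() if report_queue else None

report(42, "breaches community standards")
print(next_complaint())  # -> (42, 'breaches community standards')
```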

Spontaneous moderation

Spontaneous moderation is what occurs when no official moderation scheme exists. Without any ability to moderate comments, users will spontaneously moderate their peers by posting their own comments about others' comments. Because spontaneous moderation exists, no system that allows users to submit their own content can ever go completely without moderation.[citation needed]

See also

Slashdot
Website
Meta-moderation
Social software
Internet forum
MetaFilter
Plastic.com
nofollow
Reddit
User-generated content
Social news website
Block (Internet)
Everything2
AbsolutePunk
Online participation
Quora
Reblogging
Like button
Minds
Comments section

References

  1. Grygiel, Jennifer; Brown, Nina (June 2019). "Are social media companies motivated to be good corporate citizens? Examination of the connection between corporate social responsibility and social media safety". Telecommunications Policy. 43 (5): 2, 3. doi:10.1016/j.telpol.2018.12.003. Retrieved 25 May 2022.
  2. "Invisible Data Janitors Mop Up Top Websites". Al Jazeera America.
  3. "Artificial intelligence will create new kinds of work". The Economist. Retrieved 2 September 2017.
  4. Gartenstein-Ross, Daveed; Koduvayur, Varsha (26 May 2022). "Texas's New Social Media Law Will Create a Haven for Global Extremists". Foreign Policy. Retrieved 27 May 2022.
  5. "Behind the Screen: Commercial Content Moderation (CCM)". Sarah T. Roberts | The Illusion of Volition. 20 June 2012. Retrieved 3 February 2017.
  6. Elliott, Vittoria; Parmar, Tekendra. ""The darkness and despair of people will get to you"". Rest of World.
  7. Stone, Brad (18 July 2010). "Concern for Those Who Screen the Web for Barbarity". The New York Times.
  8. Chen, Adrian (23 October 2014). "The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed". WIRED. Archived from the original on 13 September 2015.
  9. "The Internet's Invisible Sin-Eaters". The Awl. Archived from the original on 8 September 2015.
  10. Department of Communications and Public Affairs, Western University (19 March 2014). "Professor uncovers the Internet's hidden labour force". Western News.
  11. "Should Facebook Block Offensive Videos Before They Post?". WIRED. 26 August 2015.
  12. Solon, Olivia (4 May 2017). "Facebook is hiring moderators. But is the job too gruesome to handle?". The Guardian. Retrieved 13 September 2018.
  13. Solon, Olivia (25 May 2017). "Underpaid and overburdened: the life of a Facebook moderator". The Guardian. Retrieved 13 September 2018.
  14. Gross, Terry. "For Facebook Content Moderators, Traumatizing Material Is A Job Hazard". NPR.org.
  15. Hartmann, Ivar A. (April 2020). "A new framework for online content moderation". Computer Law & Security Review. 36: 3. doi:10.1016/j.clsr.2019.105376. Retrieved 25 May 2022.
  16. Grimes-Viort, Blaise (7 December 2010). "6 types of content moderation you need to know about". Social Media Today.