Content moderation

Comment moderation on a GitHub discussion, where a user called Mallory has deleted several comments before closing the discussion and locking it

On websites that allow users to create content, content moderation is the process of detecting contributions that are irrelevant, obscene, illegal, harmful, or insulting, as opposed to useful or informative contributions; it is also frequently used for censorship or suppression of opposing viewpoints. The purpose of content moderation is to remove or apply a warning label to problematic content, or to allow users to block and filter content themselves. [1]


Various types of Internet sites permit user-generated content such as posts, comments, and videos, including Internet forums, blogs, and news sites powered by scripts such as phpBB, wiki software, or PHP-Nuke. Depending on the site's content and intended audience, the site's administrators decide what kinds of user comments are appropriate, then delegate the responsibility of sifting through comments to subordinate moderators. Most often, they attempt to eliminate trolling, spamming, or flaming, although this varies widely from site to site.

Major platforms use a combination of algorithmic tools, user reporting, and human review. [1] Social media sites may also employ content moderators to manually review or remove content flagged for hate speech or other objectionable material. Other content issues include revenge porn, graphic content, child abuse material, and propaganda. [1] Some websites must also keep their content hospitable to advertisers. [1]
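The combination described above, automated scoring followed by user reporting and human review, can be illustrated with a minimal sketch. All names and thresholds here are hypothetical, and the `toxicity_score` function stands in for what would in practice be a machine-learning classifier; this is not any platform's actual system.

```python
# Sketch of a hybrid moderation pipeline: an automated score triages
# content, user reports escalate it, and borderline items are queued
# for human review. Thresholds and names are illustrative only.

REMOVE_THRESHOLD = 0.9   # auto-remove above this score
REVIEW_THRESHOLD = 0.5   # queue for human review above this score
REPORTS_TO_REVIEW = 3    # user reports that force a human review

review_queue = []

def toxicity_score(text: str) -> float:
    """Placeholder for an ML classifier; here, a crude keyword check."""
    blocklist = {"spam", "scam"}
    words = text.lower().split()
    hits = sum(1 for w in words if w in blocklist)
    return min(1.0, hits / max(1, len(words)) * 5)

def moderate(post: dict) -> str:
    score = toxicity_score(post["text"])
    if score >= REMOVE_THRESHOLD:
        return "removed"
    if score >= REVIEW_THRESHOLD or post.get("reports", 0) >= REPORTS_TO_REVIEW:
        review_queue.append(post)
        return "queued_for_review"
    return "published"

print(moderate({"text": "hello world", "reports": 0}))   # published
print(moderate({"text": "spam spam scam", "reports": 0}))  # removed
```

Real systems differ chiefly in scale: the classifier is a trained model, the queue is prioritized, and human decisions feed back into the model.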

In the United States, content moderation is governed by Section 230 of the Communications Decency Act, and several cases concerning the issue have reached the United States Supreme Court, such as Moody v. NetChoice, LLC.

Supervisor moderation

Also known as unilateral moderation, this kind of moderation system is often seen on Internet forums. A group of people are chosen by the site's administrators (usually on a long-term basis) to act as delegates, enforcing the community rules on their behalf. These moderators are given special privileges to delete or edit others' contributions and/or exclude people based on their e-mail address or IP address, and generally attempt to remove negative contributions throughout the community. [2]

Commercial content moderation

Commercial Content Moderation is a term coined by Sarah T. Roberts to describe the practice of "monitoring and vetting user-generated content (UGC) for social media platforms of all types, in order to ensure that the content complies with legal and regulatory exigencies, site/community guidelines, user agreements, and that it falls within norms of taste and acceptability for that site and its cultural context." [3]

Industrial composition

The content moderation industry is estimated to be worth US$9 billion. While no official numbers are published, as of 2022 there were an estimated 10,000 content moderators for TikTok, 15,000 for Facebook, and 1,500 for Twitter. [4]

The global value chain of content moderation typically includes social media platforms, large multinational enterprise (MNE) firms, and content moderation suppliers. The social media platforms (e.g., Facebook, Google) are largely based in the United States, Europe, and China. The MNEs (e.g., Accenture, Foiwe) are usually headquartered in the Global North or India, while suppliers of content moderation are largely located in Global South countries such as India and the Philippines. [5] :79–81

While at one time this work may have been done by volunteers within the online community, for commercial websites it is largely achieved through outsourcing the task to specialized companies, often in low-wage areas such as India and the Philippines. Outsourcing of content moderation jobs grew as a result of the social media boom: with the overwhelming growth of users and UGC, companies needed many more employees to moderate the content. In the late 1980s and early 1990s, tech companies had begun to outsource jobs to foreign countries that had an educated workforce willing to work for low wages. [6]

Working conditions

Employees work by viewing, assessing, and deleting disturbing content. Wired reported in 2014 that they may suffer psychological damage. [8] [9] [10] [2] [11] In 2017, the Guardian reported that secondary trauma may arise, with symptoms similar to PTSD. [12] Some large companies such as Facebook offer psychological support [12] and increasingly rely on artificial intelligence to sort out the most graphic and inappropriate content, but critics claim that this is insufficient. [13] In 2019, NPR called it a job hazard. [14] Non-disclosure agreements are the norm when content moderators are hired, which makes moderators more hesitant to speak up about working conditions or to organize. [4]

Psychological hazards, including stress and post-traumatic stress disorder, combined with the precarity of algorithmic management and low wages, make content moderation extremely challenging. [15] :123 Completed tasks, for example labeling content as a copyright violation, deleting a post containing hate speech, or reviewing graphic content, are quantified for performance and quality assurance. [4]

In February 2019, an investigative report by The Verge described poor working conditions at Cognizant's office in Phoenix, Arizona. [16] Cognizant employees tasked with content moderation for Facebook developed mental health issues, including post-traumatic stress disorder, as a result of exposure to graphic violence, hate speech, and conspiracy theories in the videos they were instructed to evaluate. [16] [17] Moderators at the Phoenix office reported drug abuse, alcohol abuse, and sexual intercourse in the workplace, and feared retaliation from terminated workers who threatened to harm them. [16] [18] In response, a Cognizant representative stated the company would examine the issues in the report. [16]

The Verge published a follow-up investigation of Cognizant's Tampa, Florida, office in June 2019. [19] [20] Employees in the Tampa location described working conditions that were worse than the conditions in the Phoenix office. [19] [21] [22]

Moderators were required to sign non-disclosure agreements with Cognizant to obtain the job, although three former workers broke the agreements to provide information to The Verge. [19] [23] In the Tampa office, workers reported inadequate mental health resources. [19] [24] As a result of exposure to videos depicting graphic violence, animal abuse, and child sexual abuse, some employees developed psychological trauma and post-traumatic stress disorder. [19] [25] In response to negative coverage related to its content moderation contracts, a Facebook director indicated that Facebook is in the process of developing a "global resiliency team" that would assist its contractors. [19]

Facebook

Facebook increased the number of content moderators from 4,500 to 7,500 in 2017 due to legal requirements and other controversies. In Germany, Facebook was responsible for removing hate speech within 24 hours of when it was posted. [26] In late 2018, Facebook created an oversight board, an internal "Supreme Court", to decide what content remains and what content is removed. [14]

According to Frances Haugen, the number of Facebook employees responsible for content moderation was much smaller as of 2021. [27]

Twitter

Social media site Twitter has a suspension policy. Between August 2015 and December 2017, it suspended over 1.2 million accounts for terrorist content to reduce the number of followers and amount of content associated with the Islamic State. [28] Following the acquisition of Twitter by Elon Musk in October 2022, content rules have been weakened across the platform in an attempt to prioritize free speech. [29] However, the effects of this campaign have been called into question. [30] [31]

Distributed moderation

User moderation

User moderation allows any user to moderate any other user's contributions. Billions of people are currently making decisions on what to share, forward or give visibility to on a daily basis. [32] On a large site with a sufficiently large active population, this usually works well, since relatively small numbers of troublemakers are screened out by the votes of the rest of the community.

User moderation can also take the form of reactive moderation. This type of moderation depends on the users of a platform or site to report content that is inappropriate and breaches community standards. In this process, when users encounter an image or video they deem unfit, they can click the report button; the complaint is then filed and queued for moderators to review. [33]
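The report-and-queue flow described above can be sketched in a few lines. This is a minimal illustration with hypothetical names; real platforms add prioritization, deduplication across many reporters, and appeals.

```python
from collections import deque

# Sketch of reactive moderation: users report content, and complaints
# are queued for human moderators. All names are illustrative.

class ReportQueue:
    def __init__(self):
        self.queue = deque()
        self.reported = set()  # avoid queueing the same item twice

    def report(self, content_id: str, reason: str) -> None:
        """A user clicks 'report'; the complaint is filed for review."""
        if content_id not in self.reported:
            self.reported.add(content_id)
            self.queue.append({"content_id": content_id, "reason": reason})

    def next_for_review(self):
        """A moderator pulls the oldest unhandled complaint, if any."""
        return self.queue.popleft() if self.queue else None

q = ReportQueue()
q.report("post-42", "graphic violence")
q.report("post-42", "graphic violence")  # duplicate report, ignored
print(q.next_for_review())
```

A first-in-first-out queue is the simplest policy; in practice reports are usually triaged by severity rather than arrival order.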

Unionization

On 1 May 2023, 150 content moderators who contracted for Meta, ByteDance, and OpenAI gathered in Nairobi, Kenya, to launch the first African Content Moderators Union. The union was launched four years after Daniel Motaung was fired and retaliated against for organizing a union at Sama, which contracts for Facebook. [34]


References

  1. Grygiel, Jennifer; Brown, Nina (June 2019). "Are social media companies motivated to be good corporate citizens? Examination of the connection between corporate social responsibility and social media safety". Telecommunications Policy. 43 (5): 2, 3. doi:10.1016/j.telpol.2018.12.003. S2CID 158295433. Retrieved 25 May 2022.
  2. "Invisible Data Janitors Mop Up Top Websites". Al Jazeera America. aljazeera.com.
  3. "Behind the Screen: Commercial Content Moderation (CCM)". Sarah T. Roberts | The Illusion of Volition. 2012-06-20. Retrieved 2017-02-03.
  4. Wamai, Jacqueline Wambui; Kalume, Maureen Chadi; Gachuki, Monicah; Mukami, Agnes (2023). "A new social contract for the social media platforms: prioritizing rights and working conditions for content creators and moderators". International Journal of Labour Research. 12 (1–2). International Labour Organization. Retrieved 2024-07-21.
  5. Ahmad, Sana; Krzywdzinski, Martin (2022). "Moderating in Obscurity: How Indian Content Moderators Work in Global Content Moderation Value Chains". In Graham, Mark; Ferrari, Fabian (eds.). Digital Work in the Planetary Market. MIT Press. pp. 77–95. ISBN 978-0-262-36982-4. Retrieved 2024-07-22.
  6. Elliott, Vittoria; Parmar, Tekendra (22 July 2020). ""The darkness and despair of people will get to you"". Rest of World.
  7. Stone, Brad (July 18, 2010). "Concern for Those Who Screen the Web for Barbarity". The New York Times.
  8. Chen, Adrian (23 October 2014). "The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed". Wired. Archived from the original on 2015-09-13.
  9. "The Internet's Invisible Sin-Eaters". The Awl. Archived from the original on 2015-09-08.
  10. "Professor uncovers the Internet's hidden labour force". Western News. March 19, 2014.
  11. "Should Facebook Block Offensive Videos Before They Post?". Wired. 26 August 2015.
  12. Solon, Olivia (2017-05-04). "Facebook is hiring moderators. But is the job too gruesome to handle?". The Guardian. Retrieved 2018-09-13.
  13. Solon, Olivia (2017-05-25). "Underpaid and overburdened: the life of a Facebook moderator". The Guardian. Retrieved 2018-09-13.
  14. Gross, Terry. "For Facebook Content Moderators, Traumatizing Material Is A Job Hazard". NPR.org.
  15. Gillespie, Tarleton (2018). Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (PDF). Yale University Press. ISBN 978-0-300-23502-9.
  16. Newton, Casey (25 February 2019). "The secret lives of Facebook moderators in America". The Verge. Archived from the original on 21 February 2021. Retrieved 20 June 2019.
  17. Feiner, Lauren (25 February 2019). "Facebook content reviewers are coping with PTSD symptoms by having sex and doing drugs at work, report says". CNBC. Archived from the original on 20 June 2019. Retrieved 20 June 2019.
  18. Silverstein, Jason (25 February 2019). "Facebook vows to improve content reviewing after moderators say they suffered PTSD". CBS News. Archived from the original on 20 June 2019. Retrieved 20 June 2019.
  19. Newton, Casey (19 June 2019). "Three Facebook moderators break their NDAs to expose a company in crisis". The Verge. Archived from the original on 12 September 2022. Retrieved 20 June 2019.
  20. Bridge, Mark (20 June 2019). "Facebook worker who died of heart attack was under 'unworldly' pressure". The Times. ISSN 0140-0460. Archived from the original on 20 June 2019. Retrieved 20 June 2019.
  21. Carbone, Christopher (20 June 2019). "Facebook moderator dies after viewing horrific videos, others share disturbing incidents: report". Fox News. Archived from the original on 21 June 2019. Retrieved 21 June 2019.
  22. Eadicicco, Lisa (19 June 2019). "A Facebook content moderator died after suffering heart attack on the job". San Antonio Express-News. Archived from the original on 30 November 2019. Retrieved 20 June 2019.
  23. Feiner, Lauren (19 June 2019). "Facebook content moderators break NDAs to expose shocking working conditions involving gruesome videos and feces smeared on walls". CNBC. Archived from the original on 19 June 2019. Retrieved 20 June 2019.
  24. Johnson, O'Ryan (19 June 2019). "Cognizant Getting $200M From Facebook To Moderate Violent Content Amid Allegations Of 'Filthy' Work Conditions: Report". CRN. Archived from the original on 20 June 2019. Retrieved 20 June 2019.
  25. Bufkin, Ellie (19 June 2019). "Report reveals desperate working conditions of Facebook moderators — including death". Washington Examiner. Archived from the original on 20 June 2019. Retrieved 20 June 2019.
  26. "Artificial intelligence will create new kinds of work". The Economist. Retrieved 2017-09-02.
  27. Böhmermann, Jan (2021-12-10). Facebook whistleblower Frances Haugen talks about the Facebook Papers. ZDF Magazin Royale.
  28. Gartenstein-Ross, Daveed; Koduvayur, Varsha (26 May 2022). "Texas's New Social Media Law Will Create a Haven for Global Extremists". Foreign Policy. Retrieved 27 May 2022.
  29. "Elon Musk on X: "@essagar Suspending the Twitter account of a major news organization for publishing a truthful story was obviously incredibly inappropriate"". Twitter. Retrieved 2023-08-21.
  30. Burel, Grégoire; Alani, Harith; Farrell, Tracie (2022-05-12). "Elon Musk could roll back social media moderation – just as we're learning how it can stop misinformation". The Conversation. Retrieved 2023-08-21.
  31. Fung, Brian (June 2, 2023). "Twitter loses its top content moderation official at a key moment". CNN.
  32. Hartmann, Ivar A. (April 2020). "A new framework for online content moderation". Computer Law & Security Review. 36: 3. doi:10.1016/j.clsr.2019.105376. S2CID 209063940. Retrieved 25 May 2022.
  33. Grimes-Viort, Blaise (December 7, 2010). "6 types of content moderation you need to know about". Social Media Today.
  34. Perrigo, Billy (2023-05-01). "150 AI Workers Vote to Unionize at Nairobi Meeting". Time. Retrieved 2024-07-21.

Further reading