Algorithmic radicalization is the concept that recommender algorithms on popular social media sites such as YouTube and Facebook drive users toward progressively more extreme content over time, leading them to develop radicalized extremist political views. Algorithms record user interactions, from likes and dislikes to the amount of time spent on posts, to generate an endless stream of media aimed at keeping users engaged. Through echo chamber channels, the consumer is driven to become more polarized through preferences in media and self-confirmation. [1] [2] [3] [4]
Algorithmic radicalization remains a controversial phenomenon as it is often not in the best interest of social media companies to remove echo chamber channels. [5] [6] Though social media companies have admitted to algorithmic radicalization's existence, it remains unclear how each will manage this growing threat.
Social media platforms learn the interests and likes of the user and modify the experience in their feed to keep them engaged and scrolling. An echo chamber is formed when users come across beliefs that magnify or reinforce their own thoughts and form a group of like-minded users in a closed system. [7] The issue with echo chambers is that they spread information without any opposing beliefs, which can lead to confirmation bias. According to group polarization theory, an echo chamber can potentially lead users and groups toward more extreme, radicalized positions. [8] According to the National Library of Medicine, "Users online tend to prefer information adhering to their worldviews, ignore dissenting information, and form polarized groups around shared narratives. Furthermore, when polarization is high, misinformation quickly proliferates." [9]
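The group polarization dynamic can be illustrated with a bounded-confidence opinion model. This is a deliberately simplified, hypothetical simulation (the parameters and the model choice are assumptions for illustration, not any platform's actual algorithm): each agent updates its view only toward peers whose opinions already fall within a narrow band of its own, mimicking an echo chamber, and an initially diverse population splits into tight, mutually isolated clusters.

```python
import random

def step(opinions, confidence=0.2, rate=0.5):
    """One update round: each agent averages only with peers whose
    opinions fall within its confidence bound (its 'echo chamber')."""
    new = []
    for x in opinions:
        peers = [y for y in opinions if abs(y - x) <= confidence]
        mean = sum(peers) / len(peers)  # an agent is always its own peer
        new.append(x + rate * (mean - x))
    return new

def simulate(n=40, rounds=60, seed=1):
    """Start with opinions spread uniformly over [-1, 1] and iterate."""
    random.seed(seed)
    opinions = [random.uniform(-1, 1) for _ in range(n)]
    for _ in range(rounds):
        opinions = step(opinions)
    return opinions

final = simulate()
clusters = sorted({round(x, 2) for x in final})
print(clusters)  # typically a few tight clusters rather than the initial spread
```

Because agents never see opinions outside their confidence band, no cluster ever encounters rebuttal from the others, which is the closed-system property the echo chamber literature describes.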
Facebook's algorithm focuses on recommending content that makes the user want to interact. It ranks content by prioritizing popular posts by friends, viral content, and sometimes divisive content. Each feed is personalized to the user's specific interests, which can sometimes lead users into an echo chamber of troublesome content. [10] Users can find the list of interests the algorithm uses by going to the "Your ad preferences" page. According to a Pew Research study, 74% of Facebook users did not know that list existed until they were directed to it in the study. [11] It is also relatively common for Facebook to assign political labels to its users. In recent years,[when?] Facebook has started using artificial intelligence to change the content users see in their feed and what is recommended to them. A set of documents known as The Facebook Files revealed that its AI system prioritizes user engagement over everything else. The Facebook Files have also shown that controlling these AI systems has proven difficult. [12]
In an August 2019 internal memo leaked in 2021, Facebook admitted that "the mechanics of our platforms are not neutral", [13] [14] concluding that optimizing for engagement is necessary to reach maximum profits. To increase engagement, algorithms have found that hate, misinformation, and politics are instrumental for app activity. [15] As referenced in the memo, "The more incendiary the material, the more it keeps users engaged, the more it is boosted by the algorithm." [13] According to a 2018 study, "false rumors spread faster and wider than true information... They found falsehoods are 70% more likely to be retweeted on Twitter than the truth, and reach their first 1,500 people six times faster. This effect is more pronounced with political news than other categories." [16]
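The memo's observation that incendiary material "is boosted by the algorithm" follows mechanically from a purely engagement-driven ranking objective. The sketch below is hypothetical (the `Post` fields, weights, and example scores are invented for illustration and are not Facebook's actual model): if provocative posts have higher predicted engagement and nothing in the objective penalizes them, they rank first.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_clicks: float    # hypothetical model outputs in [0, 1]
    predicted_comments: float
    predicted_shares: float

def engagement_score(p: Post) -> float:
    # A purely engagement-driven objective: no term penalizes
    # incendiary content, so whatever provokes reactions wins.
    return p.predicted_clicks + 2.0 * p.predicted_comments + 3.0 * p.predicted_shares

feed = [
    Post("Local charity bake sale", 0.10, 0.01, 0.01),
    Post("Outrageous political claim", 0.30, 0.25, 0.20),
    Post("Friend's vacation photos", 0.20, 0.05, 0.02),
]
ranked = sorted(feed, key=engagement_score, reverse=True)
print([p.title for p in ranked])  # the divisive post sorts to the top
```

Note that the ranking is not malicious by construction; the outcome is a side effect of the objective function, which is the point the memo makes.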
YouTube has been around since 2005 and has more than 2.5 billion monthly users. YouTube's content discovery system draws on the user's personal activity (watch history, favorites, likes) to direct them to recommended content. YouTube's algorithm is responsible for roughly 70% of users' recommended videos and is what drives people to watch certain content. [17] According to one study, users have little power to keep unsolicited videos, including hate speech and livestreams, out of their recommendations. [17]
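The narrowing effect that leaves users with little power over their recommendations can be sketched as a simple feedback loop. This is a toy model (the function, topic names, and exploration rate are assumptions for illustration, not YouTube's system): each recommendation feeds back into the very watch history it was derived from, so an early preference quickly locks in and dominates the feed.

```python
import random

def recommend(history, catalog, explore=0.1, rng=random):
    """Recommend the topic the user has watched most, with a small
    chance of recommending something random (exploration)."""
    if not history or rng.random() < explore:
        return rng.choice(catalog)
    counts = {t: history.count(t) for t in catalog}
    return max(counts, key=counts.get)  # crude "more of the same" rule

rng = random.Random(0)
catalog = ["news", "gaming", "politics", "music"]
history = []
for _ in range(50):
    pick = recommend(history, catalog, rng=rng)
    history.append(pick)  # assume the user watches what is recommended

# The tail of the session is dominated by a single topic.
tail = history[-20:]
print(max(tail.count(t) for t in catalog))
```

Because every non-exploratory recommendation reinforces the current front-runner, the loop converges on one topic regardless of which topic happened to come first, which is the self-reinforcing property critics attribute to engagement-based recommenders.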
YouTube has been identified as an influential platform for spreading radicalized content. Al-Qaeda and similar extremist groups have been linked to using YouTube for recruitment videos and engagement with international media outlets. In a research study published in the journal American Behavioral Scientist, researchers examined "whether it is possible to identify a set of attributes that may help explain part of the YouTube algorithm's decision-making process". [18] The study found that YouTube's algorithmic recommendations of extremist content factor in the presence of radical keywords in a video's title. In February 2023, in Gonzalez v. Google, the question before the court was whether Google, the parent company of YouTube, is protected from lawsuits claiming that the site's algorithms aided terrorists by recommending ISIS videos to users. Section 230 generally protects online platforms from civil liability for content posted by their users. [19]
TikTok is an app that recommends videos to a user's 'For You Page' (FYP), making every user's page different. Because of the nature of the algorithm behind the app, TikTok's FYP has been linked to showing more explicit and radical videos over time based on the user's previous interactions on the app. [20] Since TikTok's inception, the app has been scrutinized for misinformation and hate speech, as those forms of media usually generate more interactions with the algorithm. [21]
In 2022, TikTok's head of U.S. Security put out a statement that "81,518,334 videos were removed globally between April - June for violating our Community Guidelines or Terms of Service", part of an effort to cut back on hate speech, harassment, and misinformation. [22]
The alt-right pipeline (also called the alt-right rabbit hole) is a proposed conceptual model regarding internet radicalization toward the alt-right movement. It describes a phenomenon in which consuming provocative right-wing political content, such as antifeminist or anti-SJW ideas, gradually increases exposure to the alt-right or similar far-right politics. It posits that this interaction takes place due to the interconnected nature of political commentators and online communities, allowing members of one audience or community to discover more extreme groups. [23] [24] This process is most commonly associated with, and has been documented on, the video platform YouTube, and is largely driven by the way recommendation algorithms on various social media platforms function: they surface content similar to what users already engage with, but can quickly lead users down rabbit holes. [24] [25] [26]
Many political movements have been associated with the pipeline concept. The intellectual dark web, [24] libertarianism, [27] the men's rights movement, [28] and the alt-lite movement [24] have all been identified as possibly introducing audiences to alt-right ideas. Audiences that seek out and are willing to accept extreme content in this fashion typically consist of young men, commonly those who experience significant loneliness and seek belonging or meaning. [29] Message boards that are often saturated with hard-right social commentary, such as 4chan and 8chan, are well documented as important sites of radicalization for users attempting to find community and belonging. [30]
The alt-right pipeline may be a contributing factor to domestic terrorism. [31] [32] Many social media platforms have acknowledged this path of radicalization and have taken measures to prevent it, including the removal of extremist figures and rules against hate speech and misinformation. [25] [29] Left-wing movements, such as BreadTube, also oppose the alt-right pipeline and "seek to create a 'leftist pipeline' as a counterforce to the alt-right pipeline." [33]
The effects of YouTube's algorithmic bias in radicalizing users have been replicated by one study, [24] [34] [35] [36] although two other studies found little or no evidence of a radicalization process. [25] [37] [38]
The U.S. Department of Justice defines 'lone-wolf' (self) terrorism as "someone who acts alone in a terrorist attack without the help or encouragement of a government or a terrorist organization". [39] Through social media outlets on the internet, lone-wolf terrorism has been on the rise and has been linked to algorithmic radicalization. [40] Through echo chambers on the internet, viewpoints typically seen as radical are accepted and quickly adopted by other extremists. [41] Forums, group chats, and social media then reinforce these beliefs. [42]
The Social Dilemma is a 2020 docudrama about how the algorithms behind social media enable addiction while possessing the ability to manipulate people's views, emotions, and behavior to spread conspiracy theories and disinformation. The film repeatedly uses buzzwords such as 'echo chambers' and 'fake news' to argue that psychological manipulation on social media leads to political manipulation. In the film, the character Ben falls deeper into a social media addiction as the algorithm determines that his social media page has a 62.3% chance of long-term engagement. This leads to more videos appearing in Ben's recommended feed, and he eventually becomes more immersed in propaganda and conspiracy theories, growing more polarized with each video.
In the Communications Decency Act, Section 230 states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider". [43] Section 230 protects online platforms from liability for third-party content, such as illegal activity by a user. [43] However, critics argue that this approach reduces a company's incentive to remove harmful content or misinformation, and that this loophole has allowed social media companies to maximize profits by pushing radical content without legal risk. [44] This claim has itself been criticized by proponents of Section 230, who believe that the Good Samaritan clause in subsection (c)(2) is what enables websites to moderate in the first place, and that prior to its passage, courts had ruled in Stratton Oakmont, Inc. v. Prodigy Services Co. that moderation in any capacity exposed content providers to liability as "publishers" of the content they chose to leave up. [45]
Lawmakers have drafted legislation that would weaken or remove Section 230 protections over algorithmic content. House Democrats Anna Eshoo, Frank Pallone Jr., Mike Doyle, and Jan Schakowsky introduced the "Justice Against Malicious Algorithms Act" in October 2021 as H.R. 5596. The bill died in committee, [46] but it would have removed Section 230 protections for service providers related to personalized recommendation algorithms that present content to users if those algorithms knowingly or recklessly deliver content that contributes to physical or severe emotional injury. [47]