Censorship on TikTok affects material published on the Chinese-owned social media platform TikTok. There is evidence that TikTok has down-weighted posts by political dissidents, LGBTQ+ people, and disabled people, as well as posts using certain African-American hashtags. TikTok's explanations for this vary, ranging from attempts to protect users from bullying [1] to algorithmic mistakes. [2]
In January 2019, the Chinese government said that it would begin holding app developers such as ByteDance responsible for user content shared via apps like Douyin (the Chinese version of TikTok), [3] and listed 100 types of content that it would censor. [4] It was reported that certain content considered unfavorable to the Chinese Communist Party was already limited for users outside of China, such as content related to the 2019–20 Hong Kong protests or Tibetan independence. [5] TikTok has blocked videos about human rights in China, particularly those that reference the Xinjiang internment camps and the Uyghur genocide, and has disabled the accounts of users who post them. [6] [7] [8] [9] A 2019 article in The Washington Post reported allegations from former U.S. employees that TikTok had been instructed to remove content that Beijing-based teams deemed subversive or controversial, although ByteDance claimed that no moderators for the U.S. service had been based in China. [10] On 27 November 2019, TikTok temporarily suspended the account of 17-year-old Afghan-American user Feroza Aziz after she posted a video, disguised as a makeup tutorial, that drew attention to the Xinjiang internment camps. [11] TikTok later apologized, said the suspension had been the result of "human error", and soon reinstated the account. [12] In July 2020, TikTok suspended the account of another user whose viral video called attention to the same issue. [9]
TikTok's policies ban content related to a specific list of foreign leaders, such as Vladimir Putin, Donald Trump, Barack Obama, and Mahatma Gandhi, on the grounds that such content can stir controversy and attacks on political views. [13] Its policies also ban content critical of Turkish president Recep Tayyip Erdoğan and content considered to support Kurdish nationalism. [14] TikTok was reported to have censored users who supported the Citizenship Amendment Act protests in India, as well as users who promoted peace between Hindus and Muslims. [15]
In March 2020, internal documents leaked to The Intercept revealed that moderators had been instructed to suppress posts created by users deemed "too ugly, poor, or disabled" for the platform and to censor political speech in livestreams, banning those who harmed "national honor" or who broadcast streams about "state organs such as police". [16] [17] [18] In response to censorship concerns, TikTok's parent company hired the law firm K&L Gates, including former U.S. Congressmen Bart Gordon and Jeff Denham, to advise it on its content moderation policies. [19] [20] TikTok also hired the lobbying firm Monument Advocacy. [21]
In June 2020, The Wall Street Journal reported that some previously non-political TikTok users were airing pro-Beijing views for the explicit purpose of boosting subscribers and avoiding shadow bans. [22] Later that month, The Times of India reported that TikTok was shadow banning videos related to the Sino-Indian border dispute and the China–India skirmishes. [23] In July, the company announced that it was pulling out of Hong Kong in response to the Hong Kong national security law. [24]
In November 2020, a former TikTok executive told a British parliamentary committee that TikTok censored content critical of China, particularly content related to the Uyghur genocide. [25]
In January 2021, TikTok banned Trump-related content deemed to be inciting violence. [26] On 3 February, it received praise from Russian officials for its cooperation in removing "forbidden" content, mostly related to protests in Russia. Evgeniy Zaitsev, an official of the media censorship agency Roskomnadzor, stated: "We need to highlight TikTok among other social media platforms because it has an office in Russia and actively cooperated with us, which cannot be said about others." [27] State Duma deputy Alexander Khinshtein said that TikTok's new "anti-fake news" policies align well with the ideology of Russia's content censorship law and "should be considered a very positive signal". [28]
In February 2022, the German newspaper Frankfurter Allgemeine Zeitung reported that, in automatically generated subtitles, terms such as "reeducation camp," "internment camp," and "labor camp" were replaced with asterisks. [29]
For videos that are not otherwise restricted, creators have the option to designate their content as 18+ in order to warn other users that it is of an adult nature.
In response to the 2022 Russian invasion of Ukraine, TikTok banned new posts and livestreams from Russia. [30] [31] [32] However, a study by Tracking Exposed found that TikTok had blocked all non-Russian content for users in Russia while continuing to host old videos uploaded by Russia-based accounts and permitting Russian state media to continue posting, an arrangement described as establishing a "splinternet" within a global social media platform. [30] TikTok's inconsistently applied censorship has permitted pro-Kremlin news while blocking foreign accounts and critics of the war; as a result, "Russians are left with a frozen TikTok, dominated by pro-war content". [33] [34]
According to technology historian Mar Hicks, creators on TikTok feel that they have to be overly cautious about what they post "because the rules change at any given moment [and] there's no transparency". [35] Hicks said that the sudden disappearance of tags, intentional or not, has "incredibly problematic effects and negative effects on communities that are already marginalized and erased". [35] The lack of clarity around content removal and moderation on TikTok is an ongoing frustration for the app's users. [35] TikTok has community guidelines, but there is no public list of the specific words and phrases that are banned, and it is not clear how much moderation is done algorithmically versus by actual people. [35] Users who have posted content protesting specific acts of racism, or racism in general, have reported reduced reach, with their videos appearing less frequently or not at all. [36]
In 2019, The Guardian reported that TikTok's efforts to provide locally sensitive moderation had resulted in the removal of content that could be perceived as positive towards LGBTQ+ people or LGBTQ+ rights (such as same-sex couples holding hands) in countries such as Turkey. [14]
In December 2019, TikTok admitted that it had aimed to "reduce bullying" in the comments of videos by artificially reducing the viral potential of videos its algorithm identified as being made by LGBTQ+ people. [1] That same month, the German website Netzpolitik.org reported that TikTok also artificially reduced the viral potential of videos its algorithm identified as being made by "fat people [and] people with facial disfigurement, autism, Down syndrome, [or] disabled people or people with some facial problems". Those affected may not have their videos shown outside of their native country or have them appear on the "For You" page, TikTok's personalized algorithmic homepage feed. [1] According to The Verge, some lesbians on TikTok jokingly refer to themselves as "le dolla bean", a reference to the spelling "le$bian" used to avoid having videos removed by TikTok. Hicks told The Verge that "it became this whole joke, because things that have the word 'lesbian' in them were either getting flagged for deletion or causing the users' accounts to get in trouble". [37]
In 2020, TikTok was accused of censoring transgender users following reports that transgender users' videos had been removed or muted. The BBC reported that the LGBTQ+ charity Stonewall said such actions had "sent a damaging message to young trans people using the platform for support". TikTok issued a statement claiming that they "categorically do not remove any content on the basis of expression of gender identity". [38]
In September 2020, the Australian Strategic Policy Institute reported that certain LGBTQ+ hashtags had been restricted in Bosnia, Russia, and Jordan. TikTok admitted to restricting hashtags in certain countries, citing local laws for some restrictions and stating that other hashtags were restricted because they were primarily being used to share pornography. TikTok also claimed that some hashtags had been moderated by mistake and that the issue was subsequently fixed, and that some of the hashtags alleged to have been censored had never been used by video creators. [39]
In May 2021, American intersex activist Pidgeon Pagonis reported that the "intersex" hashtag had become unavailable on TikTok for the second time. In both instances, TikTok told The Verge that the tag had been removed by mistake and subsequently restored it; the repeated removals led to public speculation about whether the hashtag was being censored. [35]
On May 7, 2020, in honor of the upcoming birthday of Malcolm X on May 19, TikTok user Lex Scott encouraged viewers to protest TikTok's suppression of African-American creators by changing their profile pictures to the black power fist symbol, following black creators, and unfollowing creators who did not support the initiative. This was termed the #ImBlackMovement. Thousands of TikTok users followed suit, and the hashtag #BlackVoicesHeard reached over 6 million views by the morning of May 19. [40]
After the murder of George Floyd on May 25, 2020, sparked racial unrest in the United States and protests around the world, TikTok creators claimed that TikTok was deliberately suppressing videos using the hashtags #BlackLivesMatter and #GeorgeFloyd, with these videos appearing to receive no views. TikTok released a statement apologizing, claiming that a technical glitch had caused the display error and that the hashtags had actually received over 2 billion views. [2] Hicks argued that LGBTQ+ people and people of color have found the guidelines enforced "wildly differently", meaning their content is suppressed or removed for supposed violations while their reports of harassment from other users go unaddressed: "Not only is it hurting their ability to speak and be seen on the app, but it's also allowing them to get attacked and have hate speech thrown their way." [35]