There are reports of TikTok and Douyin censoring political content related to China and other countries as well as content from minority creators. TikTok says that its initial content moderation policies, many of which are no longer applicable, were aimed at reducing divisiveness and were not politically motivated.
TikTok's content moderation policies, and especially Douyin's, have been criticized as opaque. Country-specific internal guidelines against the promotion of violence, separatism, and the "demonization of countries" could be used to prohibit content related to the 1989 Tiananmen Square protests and massacre, Falun Gong, Tibet, Taiwan, Chechnya, Northern Ireland, the Cambodian genocide, the 1998 Indonesian riots, Kurdish nationalism, and ethnic conflicts between blacks and whites or between different Islamic sects. A more specific list banned criticism of world leaders, past and present, from Russia, the United States, Japan, North and South Korea, India, Indonesia, and Turkey. [1] [2]
In September 2019, The Washington Post reported allegations from former U.S. employees that TikTok censored content considered sensitive by Beijing as well as political discussions unrelated to China. Topics such as Donald Trump and the 2019–2020 Hong Kong protests were noticeably rarer on TikTok than on other platforms. TikTok said it would replace its Beijing-based moderation team with regional teams given greater autonomy over content decisions. [3] [4] On 27 November 2019, TikTok temporarily suspended the account of Feroza Aziz after she posted a video, disguised as a makeup tutorial, that drew attention to the Xinjiang internment camps. [5] [6] TikTok later apologized, reinstated her account, and said it had been flagged because of a joke about Osama bin Laden in another of her posts. [7] In July 2020, TikTok took down a video about the Xinjiang internment camps after it gained millions of views; it is available again with over six million views as of June 2024. The video's creator has also reported other instances in which she was banned or restricted, including from livestreaming, after speaking about government or politics. [8] [9]
TikTok's policies ban content related to a specific list of foreign leaders, such as Vladimir Putin, Donald Trump, Barack Obama, and Mahatma Gandhi, on the grounds that it can stir controversy and attacks on political views. [10] Its policies also ban content critical of Turkish president Recep Tayyip Erdoğan and content considered supportive of Kurdish nationalism. [11] TikTok was reported to have censored users who supported the Citizenship Amendment Act protests in India and users who promoted peace between Hindus and Muslims. [12]
In March 2020, internal documents leaked to The Intercept revealed that moderators had been instructed to censor political speech in livestreams, banning users who harmed "national honor" or who broadcast streams about "state organs such as police". [13] [14] [15] In response to censorship concerns, TikTok's parent company hired the law firm K&L Gates, including former U.S. Congressmen Bart Gordon and Jeff Denham, to advise it on its content moderation policies. [16] [17] [18]
In June 2020, The Wall Street Journal reported that some previously non-political TikTok users were airing pro-Beijing views for the explicit purpose of boosting subscribers and avoiding shadow bans. [19] Later that month, The Times of India reported that TikTok was shadow banning videos related to the Sino-Indian border dispute and the China–India skirmishes. [20] In July, the company announced that it was pulling out of Hong Kong in response to the Hong Kong national security law. [21]
ByteDance and TikTok have said their early guidelines were global in scope and aimed at reducing online harassment and divisiveness while the platforms were still growing, and that they have since been replaced by versions customized by local teams for users in different regions. The company also invited UK lawmakers to examine its algorithm. [22] [23] [24]
In January 2021, TikTok banned Trump-related content deemed to be inciting violence. [25] On 3 February, Russian officials praised TikTok for cooperating with them in removing "forbidden" content, mostly related to protests in Russia. [26] [27] A March 2021 study by the Citizen Lab found no evidence that TikTok censored search results for political reasons, but was inconclusive on whether individual posts are censored. [28] [29]
In February 2022, the German newspaper Frankfurter Allgemeine Zeitung reported that automatically generated subtitles containing terms such as "reeducation camp", "internment camp", or "labor camp" were replaced with asterisks. [30] TikTok is also said to operate a questionable word-filtering system in Germany that blocks terms related to Nazism, such as "Auschwitz". [31]
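The reporting describes only the observable behavior (flagged terms replaced by asterisks in automatic subtitles), not TikTok's actual implementation. A minimal illustrative sketch of blacklist-based masking of this kind, using a hypothetical term list and function name rather than anything documented about TikTok's systems, might look like the following:

```python
import re

# Hypothetical blocklist for illustration only; the real terms and matching
# rules used by TikTok's subtitle filter have not been made public.
BLOCKED_TERMS = ["reeducation camp", "internment camp", "labor camp"]

def mask_subtitle(text: str) -> str:
    """Replace each occurrence of a blocked term with asterisks of equal length."""
    for term in BLOCKED_TERMS:
        pattern = re.compile(re.escape(term), re.IGNORECASE)
        text = pattern.sub(lambda match: "*" * len(match.group(0)), text)
    return text

# Example: "They were held in a labor camp." -> "They were held in a **********."
print(mask_subtitle("They were held in a labor camp."))
```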
In response to the 2022 Russian invasion of Ukraine, TikTok suspended new posts and livestreams from Russia. [32] [33] [34] Tracking Exposed, a user data rights group, identified what was likely a technical glitch that pro-Russia posters exploited. It stated that although this and other loopholes were patched by TikTok before the end of March, the initial failure to correctly implement the restrictions, together with the effects of the Kremlin's "fake news" laws, contributed to the formation of a "splinternet ... dominated by pro-war content" in Russia. [35] [36]
A 2023 paper by the Internet Governance Project at Georgia Institute of Technology concluded that TikTok is "not exporting censorship, either directly by blocking material, or indirectly via its recommendation algorithm." [37]
In March 2023, basketball player Enes Kanter Freedom was banned from TikTok after repeated warnings; his account was restored when TikTok CEO Shou Zi Chew testified before the U.S. Congress. TikTok said it stood by the previous strikes against Freedom but that a moderation error had pushed his account over the threshold for a ban. After regaining his account, Freedom said he would continue criticizing the Chinese government on the platform. At the time, TikTok in the United States featured many videos that would have been censored within China, including hashtags related to the treatment of Uyghurs (278 million views), #TiananmenSquare (18 million views), and #FreeTibet (13 million views). [38] In May, the Acton Institute was suspended after it promoted videos about the imprisonment of Jimmy Lai and the Chinese government's crackdown on the pro-democracy camp in Hong Kong. [39] The suspension drew "deep concern" from lawmakers on the United States House Select Committee on Strategic Competition between the United States and the Chinese Communist Party. [40]
During the Israel–Hamas war, TikTok was accused of refusing to run advertisements from family members of Israelis taken hostage by Hamas. [41] TikTok was also accused by Malaysia's minister of communications, Fahmi Fadzil, of suppressing pro-Palestinian content. The company stated that it bans praise of Hamas and that it had removed more than 775,000 videos and 14,000 livestreams. [42] [43]
A December 2023 study by Rutgers University researchers working under the name Network Contagion Research Institute (NCRI) found a "strong possibility that content on TikTok is either amplified or suppressed based on its alignment with the interests of the Chinese government." [44] Commenting on the study, The New York Times stated, "[a]lready, there is evidence that China uses TikTok as a propaganda tool. Posts related to subjects that the Chinese government wants to suppress — like Hong Kong protests and Tibet — are strangely missing from the platform." [45] The researchers subsequently found that TikTok had removed the ability to analyze hashtags on sensitive topics. [46] TikTok said it restricted the number of hashtags that can be searched in its Creative Center because the tool had been "misused to draw inaccurate conclusions". [47] [48]
A historian from the Cato Institute stated that there were "basic errors" in the Rutgers University study and criticized the uncritical news coverage that followed, arguing that the study compared data from before TikTok existed to show that the app has fewer hashtags about historically sensitive topics, which distorted its findings. [49] [47]
In August 2024, Bloomberg reported that the Rutgers University NCRI had released a new report based on user journey data. [50] By searching for four keywords (Uyghur, Xinjiang, Tibet, and Tiananmen) on TikTok, YouTube, and Instagram, the researchers found that TikTok's algorithm displayed a higher percentage of positive, neutral, or irrelevant content relating to China's human rights abuses than either Instagram or YouTube. [50] The researchers also found that users who spent three hours or more daily on the app were significantly more positive about China's human rights record than non-users. TikTok dismissed NCRI's study, stating that it did not reflect the real user experience. [50]
In 2019, The Guardian reported that TikTok's efforts to provide locally sensitive moderation had resulted in the removal of content that could be perceived as positive towards LGBTQ+ people or LGBTQ+ rights (such as same-sex couples holding hands) in countries such as Turkey. [11]
In December 2019, TikTok admitted that it had aimed to "reduce bullying" in the comments of videos by artificially reducing the viral potential of videos its algorithm identified as being made by LGBTQ+ people. [51] That same month, the German website Netzpolitik.org reported that TikTok also artificially reduced the viral potential of videos its algorithm identified as being made by "fat people [and] people with facial disfigurement, autism, Down syndrome, [or] disabled people or people with some facial problems". Those affected may not have their videos shown outside their home country or surfaced on the "For You" page, TikTok's personalized algorithmic homepage feed. [51] According to The Verge, some lesbians on TikTok jokingly refer to themselves as "le dolla bean", a reference to the spelling "le$bian" used to avoid having videos removed. Technology historian Mar Hicks told The Verge that "it became this whole joke because things that have the word 'lesbian' in them were either getting flagged for the deletion or causing the users' accounts to get in trouble". [52]
In 2020, TikTok was accused of censoring transgender users after reports that their videos were being removed or muted. [53] The BBC reported that the LGBTQ+ charity Stonewall said such actions had "sent a damaging message to young trans people using the platform for support". TikTok issued a statement claiming that they "categorically do not remove any content based on expression of gender identity". [54]
In September 2020, the Australian Strategic Policy Institute reported that certain LGBTQ+ hashtags had been restricted in Bosnia, Russia, and Jordan. TikTok acknowledged restricting hashtags in certain countries, citing local laws for some restrictions and, for others, the hashtags' primary use in pornographic content. TikTok also said that some hashtags had been moderated by mistake and the issue subsequently fixed, and that other hashtags alleged to have been censored had never actually been used by video creators. [55]
In May 2021, American intersex activist Pidgeon Pagonis reported that the "intersex" hashtag had become unavailable on TikTok for the second time. In both instances, TikTok told The Verge that the tag had been removed by mistake and subsequently restored it; the removals led to public speculation about whether the hashtag was being censored. [52]
TikTok has since apologized and instituted a ban on anti-LGBTQ ideology, with exceptions for places such as China, the Middle East, and parts of Europe, where additional censorship laws may apply. [52] [56] [55]
Users protesting against specific acts of racism, or racism in general, have reported changes in the reach of their content, such as their posts appearing less frequently or not at all. [57]
On May 7, 2020, in honor of the upcoming birthday of Malcolm X on May 19, TikTok user Lex Scott encouraged viewers to protest TikTok's suppression of African-American creators by changing their profile pictures to the black power fist symbol, following black creators, and unfollowing creators who did not support the initiative. This was termed the #ImBlackMovement. Thousands of TikTok users followed suit, and the hashtag #BlackVoicesHeard reached over 6 million views by the morning of May 19. [58]
After the murder of George Floyd on May 25, 2020, sparked racial unrest in the United States and protests around the world, TikTok creators claimed that TikTok was deliberately suppressing videos that used the hashtags #BlackLivesMatter and #GeorgeFloyd, with such videos appearing to receive no views. TikTok released a statement apologizing, attributing the problem to a technical glitch that caused a display error, and saying that the hashtags had received over 2 billion views. [59] Hicks argued that LGBTQ+ people and people of color have found the guidelines enforced "wildly differently", meaning their content is suppressed or removed for supposed violations while their reports of harassment from other users go unaddressed: "Not only is it hurting their ability to speak and be seen on the app, but it's also allowing them to get attacked and have hate speech thrown their way." [52] He told CNN that he welcomed TikTok's public pledge of support to the Black community after the killing of George Floyd and that he had applied to the company because he felt its corporate values "really resonated with me." [60] The phrase "Black Lives Matter" and several related phrases were labeled as inappropriate content. [61]
In 2021, TikTok apologized for racism and vowed to do better after an app that called for Black creators to be treated more fairly, amid accusations of censorship and content suppression, was suspended; many Black creators say little has changed since. [62]
According to Hicks, creators on TikTok feel that they have to be overly cautious about what they post "because the rules change at any given moment [and] there's no transparency". [52] Hicks said that the sudden disappearance of tags, intentional or not, has "incredibly problematic effects and negative effects on communities that are already marginalized and erased". The opacity around content removal and moderation on TikTok is an ongoing frustration for the app's users. TikTok has community guidelines, but there is no public list of specific words and phrases that are banned, and it is not clear how much moderation is done algorithmically versus by actual people. [52]
China heavily regulates how TikTok's sister app Douyin is used by minors in the country, especially after 2018. [63] Under government pressure, ByteDance introduced parental controls and a "teenage mode" that shows only whitelisted content, such as knowledge sharing, and bans pranks, superstition, dance clubs, and pro-LGBT content. [a] [56]