Type of business | Subsidiary |
---|---|
Type of site | Online video platform |
Founded | February 14, 2005 |
Headquarters | 901 Cherry Avenue San Bruno, California, United States |
Area served | Worldwide (excluding blocked countries) |
Owner | Google LLC |
Founder(s) | Chad Hurley, Steve Chen, Jawed Karim |
Key people | Neal Mohan (CEO) |
Products | YouTube Premium, YouTube Music, YouTube TV, YouTube Kids, YouTube Shorts |
Revenue | US$31.5 billion (2023) [1] |
Parent | Google LLC (2006–present) |
URL | youtube.com (see list of localized domain names) |
Advertising | Google AdSense |
Registration | Optional |
Users | 2.7 billion MAU (January 2024) [2] |
Launched | December 15, 2005 |
Current status | Active |
Content license | Uploader holds copyright (standard license); Creative Commons can be selected. |
Written in | Python (core/API), [3] C (through CPython), C++, Java (through Guice platform), [4] [5] Go, [6] JavaScript (UI) |
YouTube is an American online video-sharing platform owned by Google. It was founded on February 14, 2005, by Steve Chen, Chad Hurley, and Jawed Karim, three former employees of PayPal. Headquartered in San Bruno, California, United States, it is the second-most visited website in the world, after Google Search. In January 2024, YouTube had more than 2.7 billion monthly active users, who collectively watched more than one billion hours of videos every day. [7] As of May 2019, videos were being uploaded to the platform at a rate of more than 500 hours of content per minute, [8] [9] and as of 2023, there were approximately 14 billion videos in total. [10]
On October 9, 2006, YouTube was purchased by Google for $1.65 billion (equivalent to $2.31 billion in 2023). [11] Google expanded YouTube's business model from generating revenue through advertisements alone to also offering paid content, such as movies and exclusive content produced by and for YouTube. It also offers YouTube Premium, a paid subscription option for watching content without ads. YouTube incorporated Google's AdSense program, generating more revenue for both YouTube and approved content creators. In 2023, YouTube's advertising revenue totaled $31.7 billion, a 2% increase from the $31.1 billion reported in 2022. [12] From Q4 2023 to Q3 2024, YouTube's combined revenue from advertising and subscriptions exceeded $50 billion. [13]
Since its purchase by Google, YouTube has expanded beyond the core website into mobile apps, network television, and the ability to link with other platforms. Video categories on YouTube include music videos, video clips, news, short and feature films, songs, documentaries, movie trailers, teasers, TV spots, live streams, vlogs, and more. Most content is generated by individuals, including collaborations between "YouTubers" and corporate sponsors. Established media, news, and entertainment corporations have also created YouTube channels and expanded their presence on the platform in order to reach greater audiences.
YouTube has had unprecedented social impact, influencing popular culture and internet trends and creating multimillionaire celebrities. Despite its growth and success, the platform has been criticized for facilitating the spread of misinformation and copyrighted content, routinely violating its users' privacy, engaging in excessive censorship, endangering children's safety and well-being, and inconsistently enforcing its platform guidelines.
YouTube was founded by Steve Chen, Chad Hurley, and Jawed Karim. The trio were early employees of PayPal, which left them enriched after the company was bought by eBay. [14] Hurley had studied design at the Indiana University of Pennsylvania, and Chen and Karim studied computer science together at the University of Illinois Urbana-Champaign. [15]
According to a story that has often been repeated in the media, Hurley and Chen developed the idea for YouTube during the early months of 2005, after they had experienced difficulty sharing videos that had been shot at a dinner party at Chen's apartment in San Francisco. Karim did not attend the party and denied that it had occurred, but Chen remarked that the idea that YouTube was founded after a dinner party "was probably very strengthened by marketing ideas around creating a story that was very digestible". [16]
Karim said the inspiration for YouTube came from the Super Bowl XXXVIII halftime show controversy, when Janet Jackson's breast was briefly exposed by Justin Timberlake during the halftime show. Karim could not easily find video clips of the incident and the 2004 Indian Ocean Tsunami online, which led to the idea of a video-sharing site. [17] [18] Hurley and Chen said that the original idea for YouTube was a video version of an online dating service and had been influenced by the website Hot or Not. [16] [19] They created posts on Craigslist asking attractive women to upload videos of themselves to YouTube in exchange for a $100 reward. [20] Difficulty in finding enough dating videos led to a change of plans, with the site's founders deciding to accept uploads of any video. [21]
YouTube began as a venture capital–funded technology startup. Between November 2005 and April 2006, the company raised money from various investors, with Sequoia Capital and Artis Capital Management being the largest two. [14] [22] YouTube's early headquarters were situated above a pizzeria and a Japanese restaurant in San Mateo, California. [23] In February 2005, the company activated www.youtube.com. [24] The first video was uploaded on April 23, 2005. Titled "Me at the zoo", it shows co-founder Jawed Karim at the San Diego Zoo and can still be viewed on the site. [25] [26] The same day, the company launched a public beta and by November, a Nike ad featuring Ronaldinho became the first video to reach one million total views. [27] [28] The site launched officially on December 15, 2005, by which time it was receiving 8 million views a day. [29] [30] Clips at the time were limited to 100 megabytes, as little as 30 seconds of footage. [31]
YouTube was not the first video-sharing site on the Internet; Vimeo was launched in November 2004, though that site remained a side project of its developers from CollegeHumor. [32] The week of YouTube's launch, NBCUniversal's Saturday Night Live ran the skit "Lazy Sunday" by The Lonely Island. Besides helping to bolster ratings and long-term viewership for Saturday Night Live, "Lazy Sunday"'s status as an early viral video helped establish YouTube as an important website. [33] Unofficial uploads of the skit to YouTube drew in more than five million collective views by February 2006 before NBCUniversal requested their removal two months later, citing copyright concerns. [34] Despite eventually being taken down, these duplicate uploads of the skit helped broaden YouTube's reach and led to the upload of more third-party content. [35] [36] The site grew rapidly; in July 2006, the company announced that more than 65,000 new videos were being uploaded every day and that the site was receiving 100 million video views per day. [37]
The choice of the name www.youtube.com led to problems for a similarly named website, www.utube.com. That site's owner, Universal Tube & Rollform Equipment, filed a lawsuit against YouTube in November 2006, after being regularly overloaded by people looking for YouTube. Universal Tube subsequently changed its website to www.utubeonline.com. [38] [39]
On October 9, 2006, Google announced that it had acquired YouTube for $1.65 billion in Google stock. [40] [41] The deal was finalized on November 13, 2006. [42] [43] Google's acquisition sparked renewed interest in video-sharing sites; IAC, which now owned Vimeo, focused on supporting content creators to distinguish itself from YouTube. [32] It was around this time that YouTube introduced the slogan "Broadcast Yourself". The company experienced rapid growth. The Daily Telegraph wrote that in 2007, YouTube consumed as much bandwidth as the entire Internet in 2000. [44] By 2010, the company had reached a market share of around 43% and more than 14 billion views of videos, according to comScore. [45] That year, the company simplified its interface to increase the time users would spend on the site. [46] In 2011, more than three billion videos were being watched each day, with 48 hours of new videos uploaded every minute. [47] [48] [49] However, most of these views came from a relatively small number of videos; according to a software engineer at that time, 30% of videos accounted for 99% of views on the site. [50] That year, the company again changed its interface and, at the same time, introduced a new logo with a darker shade of red. [51] [52] A subsequent interface change, designed to unify the experience across desktop, TV, and mobile, was rolled out in 2013. [53] By that point, more than 100 hours of video were being uploaded every minute, increasing to 300 hours by November 2014. [54] [55]
During this time, the company also went through some organizational changes. In October 2006, YouTube moved to a new office in San Bruno, California. [56] Hurley announced that he would be stepping down as chief executive officer of YouTube to take an advisory role and that Salar Kamangar would take over as head of the company in October 2010. [57]
In December 2009, YouTube partnered with Vevo. [58] In April 2010, Lady Gaga's "Bad Romance" became the most viewed video, becoming the first video to reach 200 million views on May 9, 2010. [59]
YouTube faced a major lawsuit by Viacom International in 2011 that nearly resulted in the discontinuation of the website. The lawsuit was filed as a result of alleged copyright infringement of Viacom's material by YouTube. However, the United States Court of Appeals for the Second Circuit ruled that YouTube was not liable, and thus YouTube won the case in 2012. [60]
Susan Wojcicki was appointed CEO of YouTube in February 2014. [61] In January 2016, YouTube expanded its headquarters in San Bruno by purchasing an office park for $215 million. The complex has 51,468 square metres (554,000 square feet) of space and can house up to 2,800 employees. [62] In August 2017, YouTube officially launched the "polymer" redesign of its user interface, based on the Material Design language, as the default, as well as a redesigned logo built around the service's play button emblem. [63]
Through this period, YouTube tried several new ways to generate revenue beyond advertisements. In 2013, YouTube launched a pilot program for content providers to offer premium, subscription-based channels. [64] [65] This effort was discontinued in January 2018 and relaunched in June, with US$4.99 channel subscriptions. [66] [67] These channel subscriptions complemented the existing Super Chat ability, launched in 2017, which allows viewers to donate between $1 and $500 to have their comment highlighted. [68] In 2014, YouTube announced a subscription service known as "Music Key", which bundled ad-free streaming of music content on YouTube with the existing Google Play Music service. [69] The service continued to evolve in 2015 when YouTube announced YouTube Red, a new premium service that would offer ad-free access to all content on the platform (succeeding the Music Key service released the previous year), premium original series, and films produced by YouTube personalities, as well as background playback of content on mobile devices. YouTube also released YouTube Music, a third app oriented towards streaming and discovering the music content hosted on the YouTube platform. [70] [71] [72]
The company also attempted to create products appealing to specific viewers. YouTube released a mobile app known as YouTube Kids in 2015, designed to provide an experience optimized for children. It features a simplified user interface, curated selections of channels featuring age-appropriate content, and parental control features. [73] Also in 2015, YouTube launched YouTube Gaming—a video gaming-oriented vertical and app for videos and live streaming, intended to compete with the Amazon.com-owned Twitch. [74]
The company was attacked on April 3, 2018, when a shooting occurred at YouTube's headquarters in San Bruno, California, which wounded four and resulted in the death of the shooter. [75]
By February 2017, one billion hours of YouTube videos were being watched every day, and 400 hours' worth of video was uploaded every minute. [7] [76] Two years later, the uploads had risen to more than 500 hours per minute. [8] During the COVID-19 pandemic, when most of the world was under stay-at-home orders, usage of services like YouTube significantly increased. One data firm [which?] estimated that YouTube was accounting for 15% of all internet traffic, twice its pre-pandemic level. [77] After EU officials requested that such services reduce their bandwidth usage to ensure medical entities had sufficient capacity to share information, YouTube and Netflix stated they would reduce streaming quality for at least thirty days, cutting the bandwidth use of their services by 25% to comply with the request. [78] YouTube later announced that it would continue with this move worldwide: "We continue to work closely with governments and network operators around the globe to do our part to minimize stress on the system during this unprecedented situation." [79]
Following a 2018 complaint alleging violations of the Children's Online Privacy Protection Act (COPPA), [80] the company was fined $170 million by the FTC for collecting personal information from minors under the age of 13. [81] YouTube was also ordered to create systems to increase children's privacy. [82] [83] Following criticisms of its implementation of those systems, YouTube started treating all videos designated as "made for kids" as subject to COPPA on January 6, 2020. [84] [85] In 2021, complementing the YouTube Kids app, the company introduced a supervised mode designed more for tweens. [86] Additionally, to compete with TikTok, YouTube released YouTube Shorts, a short-form video platform. [87]
During this period, YouTube entered disputes with other tech companies. For over a year, in 2018 and 2019, no YouTube app was available for Amazon Fire products. [88] In 2020, Roku removed the YouTube TV app from its streaming store after the two companies were unable to reach an agreement. [89]
After testing earlier in 2021, YouTube removed public display of dislike counts on videos in November 2021, stating that, based on its internal research, users often used the dislike feature as a form of cyberbullying and brigading. [90] While some users praised the move as a way to discourage trolls, others felt that hiding dislikes would make it harder for viewers to recognize clickbait or unhelpful videos and that other features already existed for creators to limit bullying. YouTube co-founder Jawed Karim called the update "a stupid idea" and said that the real reason behind the change was "not a good one, and not one that will be publicly disclosed." He felt that users' ability on a social platform to identify harmful content was essential, saying, "The process works, and there's a name for it: the wisdom of the crowds. The process breaks when the platform interferes with it. Then, the platform invariably declines." [91] [92] [93] Shortly after the announcement, software developer Dmitry Selivanov created Return YouTube Dislike, an open-source, third-party browser extension for Chrome and Firefox that allows users to see a video's number of dislikes. [94] In a letter published on January 25, 2022, then-YouTube CEO Susan Wojcicki acknowledged that removing public dislike counts was a controversial decision but reiterated that she stood by it, claiming that "it reduced dislike attacks." [95]
In 2022, YouTube launched an experiment where the company would show users who watched longer videos on TVs a long chain of short un-skippable adverts, intending to consolidate all ads into the beginning of a video. Following public outrage over the unprecedented amount of un-skippable ads, YouTube "ended" the experiment on September 19 of that year. [96] In October, YouTube announced that they would be rolling out customizable user handles in addition to channel names, which would also become channel URLs. [97]
On February 16, 2023, Wojcicki announced that she would step down as CEO, with Neal Mohan named as her successor. Wojcicki took on an advisory role for Google and parent company Alphabet. [98] Wojcicki died a year and a half later, on August 9, 2024. [99]
In late October 2023, YouTube began cracking down on the use of ad blockers on the platform. Users of ad blockers may be given a pop-up warning saying "Video player will be blocked after 3 videos". Users of ad blockers are shown a message asking them to allow ads or inviting them to subscribe to the ad-free YouTube Premium subscription plan. YouTube says that the use of ad blockers violates its terms of service. [100] [101]
In April 2024, YouTube announced it would be "strengthening our enforcement on third-party apps that violate YouTube's Terms of Service, specifically ad-blocking apps". [102]
YouTube has been led by a CEO since its founding in 2005, beginning with Chad Hurley, who led the company until 2010. After Google's acquisition of YouTube, the CEO role was retained. Salar Kamangar took over Hurley's position and kept the job until 2014. He was replaced by Susan Wojcicki, who later resigned in 2023. [98] The current CEO is Neal Mohan, who was appointed on February 16, 2023. [98]
YouTube offers different features depending on user verification; verifying an account via phone number or channel history increases feature availability and daily usage limits. Standard or basic features include uploading videos, creating playlists, and using YouTube Music, subject to daily activity limits; intermediate or additional features include longer videos (over 15 minutes), live streaming, custom thumbnails, and creating podcasts; advanced features include Content ID appeals, embedding live streams, applying for monetization, clickable links, adding chapters, and pinning comments on videos or posts. [103]
In January 2012, it was estimated that visitors to YouTube spent an average of 15 minutes a day on the site, in contrast to the four or five hours a day spent by a typical US citizen watching television. [104] In 2017, viewers on average watched YouTube on mobile devices for more than an hour every day. [105]
In December 2012, two billion views were removed from the view counts of Universal and Sony music videos on YouTube, prompting a claim by The Daily Dot that the views had been deleted due to a violation of the site's terms of service, which ban the use of automated processes to inflate view counts. This was disputed by Billboard, which said that the two billion views had been moved to Vevo, since the videos were no longer active on YouTube. [106] [107] On August 5, 2015, YouTube patched the formerly notorious behavior which caused a video's view count to freeze at "301" (later "301+") until the actual count was verified to prevent view count fraud. [108] YouTube view counts once again updated in real time. [109]
Since September 2019, subscriber counts have been abbreviated: only the three leading digits of a channel's subscriber count are shown publicly, limiting the function of third-party real-time trackers such as Social Blade. Exact counts remain available to channel operators inside YouTube Studio. [110]
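YouTube's exact rounding rules are not specified here; purely as a rough illustration of a "three leading digits" display, the sketch below shows one way such an abbreviation could be computed and formatted with a unit suffix. The helper name and its exact behavior are assumptions, not YouTube's actual code.

```python
import math

def abbreviate_count(count: int) -> str:
    """Illustrative only: keep three significant digits of a count and
    append a K/M/B suffix, roughly matching the public display described
    above. Not YouTube's actual formatting logic."""
    if count < 1000:
        return str(count)
    # Truncate to three significant digits.
    digits = int(math.log10(count)) + 1
    truncated = count // 10 ** (digits - 3) * 10 ** (digits - 3)
    for suffix, factor in (("B", 10**9), ("M", 10**6), ("K", 10**3)):
        if truncated >= factor:
            value = truncated / factor
            # Trim trailing zeros, e.g. "112.00K" -> "112K".
            return f"{value:.2f}".rstrip("0").rstrip(".") + suffix
    return str(truncated)

print(abbreviate_count(1_234_567))  # 1.23M
print(abbreviate_count(112_049))    # 112K
print(abbreviate_count(987))        # 987
```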
On November 11, 2021, after testing out this change in March of the same year, YouTube announced it would start hiding dislike counts on videos, making them invisible to viewers. The company stated the decision was in response to experiments which confirmed that smaller YouTube creators were more likely to be targeted in dislike brigading and harassment. Creators will still be able to see the number of likes and dislikes in the YouTube Studio dashboard tool, according to YouTube. [111] [112] [113]
YouTube hosts an estimated 14 billion videos, [10] with about 5% of them never having received a view and just over 85% having fewer than 1,000 views. [114]
YouTube has faced numerous challenges and criticisms in its attempts to deal with copyright, including the site's first viral video, "Lazy Sunday", which had to be taken down due to copyright concerns. [33] At the time of uploading a video, YouTube users are shown a message asking them not to violate copyright laws. [115] Despite this advice, many unauthorized clips of copyrighted material remain on YouTube. YouTube does not view videos before they are posted online, and it is left to copyright holders to issue a DMCA takedown notice pursuant to the terms of the Online Copyright Infringement Liability Limitation Act. Any successful complaint about copyright infringement results in a YouTube copyright strike. Three successful complaints for copyright infringement against a user account will result in the account and all of its uploaded videos being deleted. [116] [117] From 2007 to 2009, organizations including Viacom, Mediaset, and the English Premier League filed lawsuits against YouTube, claiming that it had done too little to prevent the uploading of copyrighted material. [118] [119] [120]
In August 2008, a US court ruled in Lenz v. Universal Music Corp. that copyright holders cannot order the removal of an online file without first determining whether the posting reflected fair use of the material. [121] YouTube's owner Google announced in November 2015 that they would help cover the legal cost in select cases where they believe fair use defenses apply. [122]
In the 2011 case of Smith v. Summit Entertainment LLC, professional singer Matt Smith sued Summit Entertainment for the wrongful use of copyright takedown notices on YouTube. [123] He asserted seven causes of action, and four were ruled in Smith's favor. [124] In April 2012, a court in Hamburg ruled that YouTube could be held responsible for copyrighted material posted by its users. [125] On November 1, 2016, the dispute with GEMA was resolved, with Google's Content ID being used to allow advertisements to be added to videos with content protected by GEMA. [126]
In April 2013, it was reported that Universal Music Group and YouTube have a contractual agreement that prevents content blocked on YouTube by a request from UMG from being restored, even if the uploader of the video files a DMCA counter-notice. [127] [128] As part of YouTube Music, Universal and YouTube signed an agreement in 2017, which was followed by separate agreements with other major labels, giving the company the right to advertising revenue when its music was played on YouTube. [129] By 2019, creators were having videos taken down or demonetized when Content ID identified even short segments of copyrighted music within a much longer video, with different levels of enforcement depending on the record label. [130] Experts noted that some of these clips likely qualified for fair use. [130]
In June 2007, YouTube began trials of a system for automatic detection of uploaded videos that infringe copyright. Google CEO Eric Schmidt regarded this system as necessary for resolving lawsuits such as the one from Viacom, which alleged that YouTube profited from content that it did not have the right to distribute. [131] The system, which was initially called "Video Identification" [132] [133] and later became known as Content ID, [134] creates an ID File for copyrighted audio and video material, and stores it in a database. When a video is uploaded, it is checked against the database, and the system flags the video as a copyright violation if a match is found. [135] When this occurs, the content owner has the choice of blocking the video to make it unviewable, tracking the viewing statistics of the video, or adding advertisements to the video.
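Content ID's internals are proprietary; purely as an illustration of the "fingerprint, store, and match" flow described above, the following sketch uses exact chunk hashes in place of real perceptual audio/video fingerprints. All class and function names, the chunking scheme, and the matching threshold are assumptions for the example, not YouTube's actual implementation.

```python
import hashlib

def fingerprint(samples: bytes, chunk_size: int = 4096) -> set:
    """Split raw media bytes into chunks and hash each one (a stand-in
    for a real perceptual fingerprint)."""
    return {
        hashlib.sha256(samples[i:i + chunk_size]).hexdigest()
        for i in range(0, len(samples), chunk_size)
    }

class ReferenceDatabase:
    """Stores 'ID files' (fingerprints) registered by content owners."""
    def __init__(self):
        self._references = {}

    def register(self, owner: str, samples: bytes) -> None:
        self._references[owner] = fingerprint(samples)

    def scan_upload(self, samples: bytes, threshold: float = 0.5):
        """Return owners whose reference material overlaps the upload
        enough to flag it; the owner would then block, track, or monetize."""
        upload = fingerprint(samples)
        matches = []
        for owner, ref in self._references.items():
            overlap = len(upload & ref) / max(len(ref), 1)
            if overlap >= threshold:
                matches.append((owner, overlap))
        return matches

# Usage: register a reference work, then scan an incoming upload.
db = ReferenceDatabase()
db.register("ExampleLabel", b"copyrighted-audio-bytes" * 1000)
print(db.scan_upload(b"copyrighted-audio-bytes" * 1000))  # [('ExampleLabel', 1.0)]
```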
An independent test in 2009 uploaded multiple versions of the same song to YouTube and concluded that while the system was "surprisingly resilient" in finding copyright violations in the audio tracks of videos, it was not infallible. [136] The use of Content ID to remove material automatically has led to controversy in some cases, as the videos have not been checked by a human for fair use. [137] If a YouTube user disagrees with a decision by Content ID, it is possible to fill in a form disputing the decision. [138]
Before 2016, videos were not monetized until the dispute was resolved. Since April 2016, videos continue to be monetized while the dispute is in progress, and the money goes to whoever won the dispute. [139] Should the uploader want to monetize the video again, they may remove the disputed audio in the "Video Manager". [140] YouTube has cited the effectiveness of Content ID as one of the reasons why the site's rules were modified in December 2010 to allow some users to upload videos of unlimited length. [141]
YouTube has a set of community guidelines aimed at reducing abuse of the site's features. The uploading of videos containing defamation, pornography, and material encouraging criminal conduct is forbidden by YouTube's "Community Guidelines". [142] [better source needed] Generally prohibited material includes sexually explicit content, videos of animal abuse, shock videos, content uploaded without the copyright holder's consent, hate speech, spam, and predatory behavior. [142] YouTube relies on its users to flag the content of videos as inappropriate, and a YouTube employee will view a flagged video to determine whether it violates the site's guidelines. [142] Despite the guidelines, YouTube has faced criticism over aspects of its operations, [143] including its recommendation algorithms perpetuating videos that promote conspiracy theories and falsehoods, [144] hosting videos ostensibly targeting children but containing violent or sexually suggestive content involving popular characters, [145] videos of minors attracting pedophilic activity in their comment sections, [146] and fluctuating policies on the types of content that are eligible to be monetized with advertising. [143]
YouTube contracts companies to hire content moderators, who view content flagged as potentially violating YouTube's content policies and determine whether it should be removed. In September 2020, a class-action suit was filed by a former content moderator who reported developing post-traumatic stress disorder (PTSD) after an 18-month period on the job. [147] [148] [149]
Controversial moderation decisions have included material relating to Holocaust denial, [150] the Hillsborough disaster, [151] Anthony Bourdain's death, [152] and the Notre-Dame fire. [153] In July 2008, the Culture and Media Committee of the House of Commons of the United Kingdom stated that it was "unimpressed" with YouTube's system for policing its videos, and argued that "proactive review of content should be standard practice for sites hosting user-generated content". [154]
In June 2022, Media Matters, a media watchdog group, reported that homophobic and transphobic content calling LGBT people "predators" and "groomers" was becoming more common on YouTube. [155] The report also referred to common accusations in YouTube videos that LGBT people are mentally ill. [155] The report stated the content appeared to be in violation of YouTube's hate speech policy. [155]
An August 2022 report by the Center for Countering Digital Hate, a British think tank, found that harassment against women was flourishing on YouTube. [156] In his 2022 book Like, Comment, Subscribe: Inside YouTube's Chaotic Rise to World Domination, Bloomberg reporter Mark Bergen said that many female content creators were dealing with harassment, bullying, and stalking. [156]
YouTube has been criticized for using an algorithm that gives great prominence to videos that promote conspiracy theories, falsehoods and incendiary fringe discourse. [157] [158] [159] [160] According to an investigation by The Wall Street Journal, "YouTube's recommendations often lead users to channels that feature conspiracy theories, partisan viewpoints and misleading videos, even when those users haven't shown interest in such content. When users show a political bias in what they choose to view, YouTube typically recommends videos that echo those biases, often with more-extreme viewpoints." [157] [161] After YouTube drew controversy for giving top billing to videos promoting falsehoods and conspiracy when people made breaking-news queries during the 2017 Las Vegas shooting, YouTube changed its algorithm to give greater prominence to mainstream media sources. [157] [162] [163] [164]
In 2017, it was revealed that advertisements were being placed on extremist videos, including videos by rape apologists, anti-Semites, and hate preachers who received ad payouts. [165] After firms started to stop advertising on YouTube in the wake of this reporting, YouTube apologized and said that it would give firms greater control over where ads got placed. [165]
University of North Carolina professor Zeynep Tufekci has referred to YouTube as "The Great Radicalizer", saying "YouTube may be one of the most powerful radicalizing instruments of the 21st century." [166] Jonathan Albright of the Tow Center for Digital Journalism at Columbia University described YouTube as a "conspiracy ecosystem". [159] [167]
Before 2019, YouTube took steps to remove specific videos or channels related to supremacist content that had violated its acceptable use policies but otherwise did not have site-wide policies against hate speech. [168]
In the wake of the March 2019 Christchurch mosque attacks, YouTube and other sites like Facebook and Twitter that allowed user-submitted content drew criticism for doing little to moderate and control the spread of hate speech, which was considered to be a factor in the rationale for the attacks. [169] [170] These platforms were pressured to remove such content, but in an interview with The New York Times, YouTube's then chief product officer Neal Mohan said that unlike content such as ISIS videos, which takes a particular format and is thus easy to detect through computer-aided algorithms, general hate speech was more difficult to recognize and handle, and thus could not readily be removed without human interaction. [171]
In May 2019, YouTube joined an initiative led by France and New Zealand with other countries and tech companies to develop tools to be used to block online hate speech and to develop regulations, to be implemented at the national level, to be levied against technology firms that failed to take steps to remove such speech, though the United States declined to participate. [172] [173] Subsequently, on June 5, 2019, YouTube announced a major change to its terms of service and further stated it would "remove content denying that well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary, took place." [168] [174]
In June 2020, YouTube was criticized for allowing white supremacist content on its platform for years after it announced it would be pledging $1 million to fight racial injustice. [175] Later that month, it banned several channels associated with white supremacy, including those of Stefan Molyneux, David Duke, and Richard B. Spencer, asserting these channels violated their policies on hate speech. [176]
Multiple research studies have investigated cases of misinformation on YouTube. In a July 2019 study based on ten YouTube searches using the Tor Browser related to climate and climate change, the majority of videos communicated views contrary to the scientific consensus on climate change. [177] A May 2023 study found that YouTube was monetizing and profiting from videos that included misinformation about climate change. [178] A 2019 BBC investigation of YouTube searches in ten different languages found that YouTube's algorithm promoted health misinformation, including fake cancer cures. [179] In Brazil, YouTube has been linked to pushing pseudoscientific misinformation on health matters, as well as elevated far-right fringe discourse and conspiracy theories. [180] In the Philippines, numerous channels disseminated misinformation related to the 2022 Philippine elections. [181] Additionally, research on the dissemination of Flat Earth beliefs in social media has shown that networks of YouTube channels form an echo chamber that polarizes audiences by appearing to confirm preexisting beliefs. [182]
In 2018, YouTube introduced a system that would automatically add information boxes to videos that its algorithms determined may present conspiracy theories and other fake news, filling the infobox with content from Encyclopædia Britannica and Wikipedia as a means to inform users to minimize misinformation propagation without impacting freedom of speech. [183] [184] In 2023, YouTube revealed its changes in handling content associated with eating disorders. This social media platform's Community Guidelines now prohibit content that could encourage emulation from at-risk users. [185]
In January 2019, YouTube said that it had introduced a new policy starting in the United States intended to stop recommending videos containing "content that could misinform users in harmful ways." YouTube gave flat earth theories, miracle cures, and 9/11 truther-isms as examples. [186] Efforts within YouTube engineering to stop recommending borderline extremist videos that fell just short of forbidden hate speech, and to track their popularity, were originally rejected because doing so could interfere with viewer engagement. [187] In July 2022, YouTube announced policies to combat misinformation surrounding abortion, such as videos with instructions to perform abortion methods that are considered unsafe and videos that contain misinformation about the safety of abortion. [188] Google and YouTube implemented policies in October 2021 to deny monetization or revenue to advertisers or content creators that promoted climate change denial. [189] In January 2024, the Center for Countering Digital Hate reported that climate change deniers were instead pushing other forms of climate change denial that have not yet been banned by YouTube. [190] [191]
After misinformation disseminated via YouTube claimed that 5G communications technology was responsible for the spread of coronavirus disease 2019, leading to multiple 5G towers in the United Kingdom being attacked by arsonists, YouTube removed all videos linking 5G and the coronavirus in this manner. [192]
In September 2021, YouTube extended this policy to cover videos disseminating misinformation related to any vaccine that had received approval from local health authorities or the World Health Organization, including long-approved vaccines against measles or hepatitis B. [193] [194] The platform proceeded to remove the accounts of anti-vaccine campaigners such as Robert F. Kennedy Jr. and Joseph Mercola. [194] YouTube has also extended this type of moderation to non-medical areas. In the weeks following the 2020 United States presidential election, the site added policies to remove or label videos promoting election fraud claims; [195] [196] however, it reversed this policy in June 2023, citing a need to allow users to "openly debate political ideas, even those that are controversial or based on disproven assumptions". [197] [198]
Leading into 2017, there was a significant increase in the number of videos related to children, driven both by the popularity of parents vlogging their families' activities and by established content creators moving away from content that was often criticized or demonetized toward family-friendly material. In 2017, YouTube reported that time spent watching family vloggers had increased by 90%. [199] [200] However, with the increase in videos featuring children, the site began to face several controversies related to child safety, including with the popular channels FamilyOFive and Fantastic Adventures. [201] [202] [203] [204] [205]
Later that year, YouTube came under criticism for showing inappropriate videos targeted at children and often featuring popular characters in violent, sexual or otherwise disturbing situations, many of which appeared on YouTube Kids and attracted millions of views. The term "Elsagate" was coined on the Internet and then used by various news outlets to refer to this controversy. [206] [207] [208] [209] Following the criticism, YouTube announced it was strengthening site security to protect children from unsuitable content and the company started to mass delete videos and channels that made improper use of family-friendly characters. As part of a broader concern regarding child safety on YouTube, the wave of deletions also targeted channels that showed children taking part in inappropriate or dangerous activities under the guidance of adults. [210] [211] [212] [213] [214] [215]
Even for content that appears to be aimed at children and to contain only child-friendly material, YouTube's system allows the uploaders of these videos to remain anonymous. Such concerns have been raised in the past, as YouTube has had to remove channels with children's content which, after becoming popular, suddenly began including inappropriate content masked as children's content. [216] The anonymity of such channels raises concerns because of the lack of knowledge about what purpose they are trying to serve. [217] The difficulty of identifying who operates these channels "adds to the lack of accountability", according to Josh Golin of the Campaign for a Commercial-Free Childhood, and educational consultant Renée Chernow-O'Leary found the videos were designed to entertain with no intent to educate, all leading critics and parents to be concerned about their children becoming too enraptured by the content from these channels. [216] Content creators who earnestly make child-friendly videos have found it difficult to compete with larger channels, being unable to produce content at the same rate and lacking the same means of promotion through YouTube's recommendation algorithms that larger animated channel networks have shared. [217]
In January 2019, YouTube officially banned videos containing "challenges that encourage acts that have an inherent risk of severe physical harm" (such as the Tide Pod Challenge) and videos featuring pranks that "make victims believe they're in physical danger" or cause emotional distress in children. [218]
In November 2017, it was revealed in the media that many videos featuring children—often uploaded by the minors themselves, and showing innocent content such as the children playing with toys or performing gymnastics—were attracting comments from pedophiles [219] [220] with predators finding the videos through private YouTube playlists or typing in certain keywords in Russian. [220] Other child-centric videos originally uploaded to YouTube began propagating on the dark web, and uploaded or embedded onto forums known to be used by pedophiles. [221]
As a result of the controversy, which added to the concern about "Elsagate", several major advertisers whose ads had been running against such videos froze spending on YouTube. [209] [222] In December 2018, The Times found more than 100 grooming cases in which children were manipulated into sexually implicit behavior (such as taking off clothes, adopting overtly sexual poses and touching other children inappropriately) by strangers. [223]
In February 2019, YouTube vlogger Matt Watson identified a "wormhole" that would cause the YouTube recommendation algorithm to draw users into this type of video content, and make all of that user's recommended content feature only these types of videos. [224] Most of these videos had comments from sexual predators noting timestamps of when the children were shown in compromising positions or otherwise making indecent remarks. [225] In the wake of the controversy, the service reported that they had deleted over 400 channels and tens of millions of comments, and reported the offending users to law enforcement and the National Center for Missing and Exploited Children. [226] [227] Despite these measures, several large advertisers pulled their advertising from YouTube. [225] [228]
Subsequently, YouTube began to demonetize and block advertising on the types of videos that have drawn these predatory comments. [229] YouTube also began to flag channels that predominantly feature children, and preemptively disable their comments sections. [230] [231]
A related attempt to algorithmically flag videos containing references to the string "CP" (an abbreviation of child pornography) resulted in some prominent false positives involving unrelated topics using the same abbreviation. YouTube apologized for the errors and reinstated the affected videos. [232]
In June 2019, The New York Times cited researchers who found that users who watched erotic videos could be recommended seemingly innocuous videos of children. [233]
In 2021, two accounts linked to RT Deutsch, the German channel of the Russian RT network, were also removed for breaching YouTube's policies relating to COVID-19. [193] Russia threatened to ban YouTube after the platform deleted the two German RT channels in September 2021. [234]
Shortly after the Russian invasion of Ukraine in 2022, YouTube removed all channels funded by the Russian state. [235] YouTube expanded the removal of Russian content from its site to include channels described as 'pro-Russian'. In June 2022, the War Gonzo channel run by Russian military blogger and journalist Semyon Pegov was deleted. [236]
In July 2023, YouTube removed the channel of British journalist Graham Phillips, active in covering the War in Donbas from 2014. [237]
In August 2023, a Moscow court fined Google 3 million rubles, around $35,000, for not deleting what it said was "fake news about the war in Ukraine". [238]
In October 2024, a Russian court fined YouTube's parent company Google a total of 2 undecillion rubles (equivalent to US$20 decillion) for restricting Russian state media channels on YouTube. [239] The fine is far greater than the world's total GDP, estimated at US$110 trillion by the International Monetary Fund. [240] News agency TASS reported that Google would be allowed to return to the Russian market only if it complies with the court's decision. [241] Kremlin spokesperson Dmitry Peskov called the court decision "symbolic" and warned that Google "should not be restricting the actions of our broadcasters on its platform." [242]
YouTube featured an April Fools prank on the site on April 1 of every year from 2008 to 2016. In 2008, all links to videos on the main page were redirected to Rick Astley's music video "Never Gonna Give You Up", a prank known as "rickrolling". [243] [244] The next year, when clicking on a video on the main page, the whole page turned upside down, which YouTube claimed was a "new layout". [245] In 2010, YouTube temporarily released a "TEXTp" mode which rendered video imagery into ASCII art letters "in order to reduce bandwidth costs by $1 per second." [246]
The next year, the site celebrated its "100th anniversary" with a range of sepia-toned silent, early 1900s-style films, including a parody of Keyboard Cat. [247] In 2012, clicking on the image of a DVD next to the site logo led to a video about a purported option to order every YouTube video for home delivery on DVD. [248]
In 2013, YouTube teamed up with satirical newspaper company The Onion to claim in an uploaded video that the video-sharing website was launched as a contest which had finally come to an end, and would shut down for ten years before being re-launched in 2023, featuring only the winning video. The video starred several YouTube celebrities, including Antoine Dodson. A video of two presenters announcing the nominated videos streamed live for 12 hours. [249] [250]
In 2014, YouTube announced that it was responsible for the creation of all viral video trends, and revealed previews of upcoming trends, such as "Clocking", "Kissing Dad", and "Glub Glub Water Dance". [251] The next year, YouTube added a music button to the video bar that played samples from "Sandstorm" by Darude. [252] In 2016, YouTube introduced an option to watch every video on the platform in 360-degree mode with Snoop Dogg. [253]
YouTube Premium (formerly YouTube Red) is YouTube's premium subscription service. It offers advertising-free streaming, access to original programming, and background and offline video playback on mobile devices. [254] YouTube Premium was originally announced on November 12, 2014, as "Music Key", a subscription music streaming service, and was intended to integrate with and replace the existing Google Play Music "All Access" service. [255] [256] [257] On October 28, 2015, the service was relaunched as YouTube Red, offering ad-free streaming of all videos and access to exclusive original content. [258] [259] [260] As of November 2016, the service had 1.5 million subscribers, with a further million on a free-trial basis. [261] As of June 2017, the first season of YouTube Originals had received 250 million views in total. [262]
YouTube Kids is an American children's video app developed by YouTube, a subsidiary of Google. The app was developed in response to parental and government scrutiny of the content available to children. The app provides a version of the service oriented towards children, with curated selections of content, parental control features, and filtering of videos deemed inappropriate for viewing by children under the age of 13, 8, or 5, depending on the age grouping chosen. First released on February 15, 2015, as an Android and iOS mobile app, the app has since been released for LG, Samsung, and Sony smart TVs, as well as for Android TV. On May 27, 2020, it became available on Apple TV. As of September 2019, the app is available in 69 countries, including Hong Kong and Macau, and one province. YouTube launched a web-based version of YouTube Kids on August 30, 2019.
On September 28, 2016, YouTube named Lyor Cohen, the co-founder of 300 Entertainment and former Warner Music Group executive, the Global Head of Music. [263]
In early 2018, Cohen began hinting at the possible launch of YouTube's new subscription music streaming service, a platform that would compete with other services such as Spotify and Apple Music. [264] On May 22, 2018, the music streaming platform named "YouTube Music" was launched. [265] [266]
YouTube Movies & TV is a video on demand service that offers movies and television shows for purchase or rental, depending on availability, along with a selection of movies (encompassing between 100 and 500 titles overall) that are free to stream, with interspersed ad breaks. YouTube began offering free-to-view movie titles to its users in November 2018; selections of new movies are added and others removed, unannounced each month. [267]
In March 2021, Google announced plans to gradually deprecate the Google Play Movies & TV app, and eventually migrate all users to the YouTube app's Movies & TV store to view, rent and purchase movies and TV shows (first affecting Roku, Samsung, LG, and Vizio smart TV users on July 15). [268] [269] Google Play Movies & TV formally shut down on January 17, 2024, with the web version of that platform migrated to YouTube as an expansion of the Movies & TV store to desktop users. (Other functions of Google Play Movies & TV were integrated into the Google TV service.) [270]
On November 1, 2022, YouTube launched Primetime Channels, a channel store platform offering third-party subscription streaming add-ons sold a la carte through the YouTube website and app, competing with similar subscription add-on stores operated by Apple, Prime Video and Roku. The add-ons can be purchased through the YouTube Movies & TV hub or through the official YouTube channels of the available services; subscribers of YouTube TV add-ons that are sold through Primetime Channels can also access their content via the YouTube app and website. A total of 34 streaming services (including Paramount+, Showtime, Starz, MGM+, AMC+ and ViX+) were initially available for purchase. [271] [272]
NFL Sunday Ticket, as part of a broader residential distribution deal with Google signed in December 2022 that also made it available to YouTube TV subscribers, was added to Primetime Channels as a standalone add-on on August 16, 2023. [273] [274] The ad-free tier of Max was added to Primetime Channels on December 12, 2023, coinciding with YouTube TV converting its separate HBO (for base plan subscribers) and HBO Max (for all subscribers) linear/VOD add-ons into a single combined Max offering. [275] [276] [note 1]
On February 28, 2017, in a press announcement held at YouTube Space Los Angeles, YouTube announced YouTube TV, an over-the-top MVPD-style subscription service that would be available for United States customers at a price of US$65 per month. Initially launching in five major markets (New York City, Los Angeles, Chicago, Philadelphia and San Francisco) on April 5, 2017, [277] [278] the service offers live streams of programming from the five major broadcast networks (ABC, CBS, The CW, Fox and NBC, along with selected MyNetworkTV affiliates and independent stations in certain markets), as well as approximately 60 cable channels owned by companies such as The Walt Disney Company, Paramount Global, Fox Corporation, NBCUniversal, Allen Media Group and Warner Bros. Discovery (including among others Bravo, USA Network, Syfy, Disney Channel, CNN, Cartoon Network, E!, Fox Sports 1, Freeform, FX and ESPN). [279] [280]
Subscribers can also receive premium cable channels (including HBO (via a combined Max add-on that includes in-app and log-in access to the service), Cinemax, Showtime, Starz and MGM+) and other subscription services (such as NFL Sunday Ticket, MLB.tv, NBA League Pass, Curiosity Stream and Fox Nation) as optional add-ons for an extra fee, and can access YouTube Premium original content. [279] [280] In September 2022, YouTube TV began allowing customers to purchase most of its premium add-ons (excluding certain services such as NBA League Pass and AMC+) without an existing subscription to its base package. [281]
In September 2016, YouTube Go was announced, [282] as an Android app created for making YouTube easier to access on mobile devices in emerging markets. It was distinct from the company's main Android app and allowed videos to be downloaded and shared with other users. It also allowed users to preview videos, share downloaded videos through Bluetooth, and offered more options for mobile data control and video resolution. [283]
In February 2017, YouTube Go was launched in India, and expanded in November 2017 to 14 other countries, including Nigeria, Indonesia, Thailand, Malaysia, Vietnam, the Philippines, Kenya, and South Africa. [284] [285] On February 1, 2018, it was rolled out in 130 countries worldwide, including Brazil, Mexico, Turkey, and Iraq. Before it shut down, the app was available to around 60% of the world's population. [286] [287] In May 2022, Google announced that they would be shutting down YouTube Go in August 2022. [288]
In September 2020, YouTube announced that it would be launching a beta version of a new platform of 15-second videos, similar to TikTok, called YouTube Shorts. [289] [290] The platform was first tested in India but, as of March 2021, has expanded to other countries, including the United States, with videos now able to be up to one minute long. [291] The platform is not a standalone app, but is integrated into the main YouTube app. Like TikTok, it gives users access to built-in creative tools, including the possibility of adding licensed music to their videos. [292] The platform had its global beta launch in July 2021. [293]
In 2018, YouTube started testing a new feature initially called "YouTube Reels". [294] The feature was nearly identical to Instagram Stories and Snapchat Stories. YouTube later renamed the feature "YouTube Stories". It was only available to creators who had more than 10,000 subscribers and could only be posted/seen in the YouTube mobile app. [295] On May 25, 2023, YouTube announced that they would be shutting down this feature on June 26, 2023. [296] [297]
In November 2016, YouTube released YouTube VR, a dedicated version with an interface for VR devices, for Google's Daydream mobile VR platform on Android. [298] In November 2018, YouTube VR was released on the Oculus Store for the Oculus Go headset. [298] YouTube VR has since been updated for compatibility with successive Quest devices and was ported to the Pico 4. [299]
YouTube VR allows for access to all YouTube-hosted videos, but particularly supports headset access for 360-degree and 180-degree video (both in 2D and stereoscopic 3D). Starting with the Oculus Quest, the app was updated for compatibility with mixed-reality passthrough modes on VR headsets. In April 2024, YouTube VR was updated to support 8K SDR video on Meta Quest 3. [300]
Private individuals [301] and large production corporations [302] have used YouTube to grow their audiences. Indie creators have built grassroots followings numbering in the thousands at very little cost or effort, while mass retail and radio promotion proved problematic. [301] Concurrently, old media celebrities moved into the website at the invitation of a YouTube management that witnessed early content creators accruing substantial followings and perceived audience sizes potentially larger than those attainable by television. [302] While YouTube's revenue-sharing "Partner Program" made it possible to earn a substantial living as a video producer—its top five hundred partners each earning more than $100,000 annually [303] and its ten highest-earning channels grossing from $2.5 million to $12 million [304]—in 2012 CMU's business editor characterized YouTube as "a free-to-use ... promotional platform for the music labels." [305] In 2013, Forbes' Katheryn Thayer asserted that digital-era artists' work must not only be of high quality, but must elicit reactions on the YouTube platform and social media. [306] Videos of the 2.5% of artists categorized as "mega", "mainstream" and "mid-sized" received 90.3% of the relevant views on YouTube and Vevo in that year. [307] By early 2013, Billboard had announced that it was factoring YouTube streaming data into calculation of the Billboard Hot 100 and related genre charts. [308]
Observing that face-to-face communication of the type that online videos convey has been "fine-tuned by millions of years of evolution", TED curator Chris Anderson referred to several YouTube contributors and asserted that "what Gutenberg did for writing, online video can now do for face-to-face communication." [309] Anderson asserted that it is not far-fetched to say that online video will dramatically accelerate scientific advance, and that video contributors may be about to launch "the biggest learning cycle in human history." [309] In education, for example, the Khan Academy grew from YouTube video tutoring sessions for founder Salman Khan's cousin into what Forbes' Michael Noer called "the largest school in the world," with technology poised to disrupt how people learn. [310] YouTube was awarded a 2008 George Foster Peabody Award, [311] the website being described as a Speakers' Corner that "both embodies and promotes democracy." [312] The Washington Post reported that a disproportionate share of YouTube's most subscribed channels feature minorities, contrasting with mainstream television in which the stars are largely white. [313] A Pew Research Center study reported the development of "visual journalism", in which citizen eyewitnesses and established news organizations share in content creation. [314] The study also concluded that YouTube was becoming an important platform by which people acquire news. [315]
YouTube has enabled people to more directly engage with government, such as in the CNN/YouTube presidential debates (2007) in which ordinary people submitted questions to U.S. presidential candidates via YouTube video, with a techPresident co-founder saying that Internet video was changing the political landscape. [316] Describing the Arab Spring (2010–2012), sociologist Philip N. Howard quoted an activist's succinct description that organizing the political unrest involved using "Facebook to schedule the protests, Twitter to coordinate, and YouTube to tell the world." [317] In 2012, more than a third of the U.S. Senate introduced a resolution condemning Joseph Kony 16 days after the "Kony 2012" video was posted to YouTube, with resolution co-sponsor Senator Lindsey Graham remarking that the video "will do more to lead to (Kony's) demise than all other action combined." [318]
Conversely, YouTube has also allowed governments to more easily engage with citizens: the White House's official YouTube channel was the seventh top news organization producer on YouTube in 2012, [321] and in 2013 a healthcare exchange commissioned Obama impersonator Iman Crosson's YouTube music video spoof to encourage young Americans to enroll in Affordable Care Act (Obamacare)-compliant health insurance. [322] In February 2014, U.S. President Obama held a meeting at the White House with leading YouTube content creators not only to promote awareness of Obamacare [323] but more generally to develop ways for government to better connect with the "YouTube Generation." [319] Whereas YouTube's inherent ability to allow presidents to directly connect with average citizens was noted, the YouTube content creators' new media savvy was perceived as necessary to better cope with the website's distracting content and fickle audience. [319]
Some YouTube videos have themselves had a direct effect on world events, such as Innocence of Muslims (2012) which spurred protests and related anti-American violence internationally. [324] TED curator Chris Anderson described a phenomenon by which geographically distributed individuals in a certain field share their independently developed skills in YouTube videos, thus challenging others to improve their own skills, and spurring invention and evolution in that field. [309] Journalist Virginia Heffernan stated in The New York Times that such videos have "surprising implications" for the dissemination of culture and even the future of classical music. [325]
A 2017 article in The New York Times Magazine posited that YouTube had become "the new talk radio" for the far right. [326] Almost a year before YouTube's January 2019 announcement that it would begin a "gradual change" of "reducing recommendations of borderline content and content that could misinform users in harmful ways", [327] Zeynep Tufekci had written in The New York Times that, "(g)iven its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century". [328] Under YouTube's changes to its recommendation engine, the most recommended channel evolved from conspiracy theorist Alex Jones (2016) to Fox News (2019). [329] According to a 2020 study, "An emerging journalistic consensus theorizes the central role played by the video 'recommendation engine', but we believe that this is premature. Instead, we propose the 'Supply and Demand' framework for analyzing politics on YouTube." [330] A 2022 study found that "despite widespread concerns that YouTube's algorithms send people down 'rabbit holes' with recommendations to extremist videos, little systematic evidence exists to support this conjecture", that "exposure to alternative and extremist channel videos on YouTube is heavily concentrated among a small group of people with high prior levels of gender and racial resentment", and that, "contrary to the 'rabbit holes' narrative, non-subscribers are rarely recommended videos from alternative and extremist channels and seldom follow such recommendations when offered." [331]
The Legion of Extraordinary Dancers [332] and the YouTube Symphony Orchestra [333] selected their membership based on individual video performances. [309] [333] Further, the cyber-collaboration charity video "We Are the World 25 for Haiti (YouTube edition)" was formed by mixing performances of 57 globally distributed singers into a single musical work, [334] with The Tokyo Times noting the "We Pray for You" YouTube cyber-collaboration video as an example of a trend to use crowdsourcing for charitable purposes. [335] The anti-bullying It Gets Better Project expanded from a single YouTube video directed at discouraged or suicidal LGBT teens, [336] which within two months drew video responses from hundreds of people, including U.S. President Barack Obama, Vice President Joe Biden, White House staff, and several cabinet secretaries. [337] Similarly, in response to fifteen-year-old Amanda Todd's video "My story: Struggling, bullying, suicide, self-harm", legislative action was undertaken almost immediately after her suicide to study the prevalence of bullying and to form a national anti-bullying strategy. [338] In May 2018, after the London Metropolitan Police claimed that drill music videos glamorizing violence gave rise to gang violence, YouTube deleted 30 videos. [339]
Prior to 2020, Google did not provide detailed figures for YouTube's running costs, and YouTube's revenues in 2007 were noted as "not material" in a regulatory filing. [340] In June 2008, a Forbes magazine article projected the 2008 revenue at $200 million, noting progress in advertising sales. [341] In 2012, YouTube's revenue from its ads program was estimated at $3.7 billion. [342] In 2013, that figure nearly doubled, with eMarketer estimating it would reach $5.6 billion, [342] [343] while others estimated $4.7 billion. [342] The vast majority of videos on YouTube are free to view and supported by advertising. [64] In May 2013, YouTube introduced a trial scheme of 53 subscription channels with prices ranging from $0.99 to $6.99 a month. [344] The move was seen as an attempt to compete with other providers of online subscription services such as Netflix, Amazon Prime, and Hulu. [64]
Google first published exact revenue numbers for YouTube in February 2020 as part of Alphabet's 2019 financial report. According to Google, YouTube generated US$15.1 billion in ad revenue in 2019, up from US$8.1 billion in 2017 and US$11.1 billion in 2018. YouTube's revenues made up nearly 10% of total Alphabet revenue in 2019. [345] [346] At the time, YouTube had approximately 20 million subscribers combined across YouTube Premium and YouTube Music, and 2 million subscribers to YouTube TV. [347]
YouTube earned $29.2 billion in ad revenue in 2022, up $398 million from the prior year. [348] In Q2 2024, ad revenue rose to $8.66 billion, a 13% increase year-over-year. [349]
YouTube entered into a marketing and advertising partnership with NBC in June 2006. [350] In March 2007, it struck a deal with the BBC for three channels with BBC content, one for news and two for entertainment. [351] In November 2008, YouTube reached an agreement with MGM, Lions Gate Entertainment, and CBS, allowing the companies to post full-length films and television episodes on the site, accompanied by advertisements in a section for U.S. viewers called "Shows". The move was intended to create competition with websites such as Hulu, which features material from NBC, Fox, and Disney. [352] [353] In November 2009, YouTube launched a version of "Shows" available to UK viewers, offering around 4,000 full-length shows from more than 60 partners. [354] In January 2010, YouTube introduced an online film rental service, [355] which, as of 2010, was available only to users in the United States, Canada, and the UK. [356] [357] [ needs update ] The service offers over 6,000 films. [358]
In March 2017, the government of the United Kingdom pulled its advertising campaigns from YouTube after reports that its ads had appeared on videos containing extremist content. The government demanded assurances that its advertising would "be delivered safely and appropriately". The Guardian newspaper, as well as other major British and U.S. brands, similarly suspended their advertising on YouTube in response to their advertising appearing near offensive content. Google stated that it had "begun an extensive review of our advertising policies and have made a public commitment to put in place changes that give brands more control over where their ads appear". [359] [360] In early April 2017, the YouTube channel h3h3Productions presented evidence claiming that a Wall Street Journal article had fabricated screenshots showing major brand advertising on an offensive video containing Johnny Rebel music overlaid on a Chief Keef music video, arguing that the video in question had not earned any ad revenue for its uploader. The claim was retracted after it was found that the ads had in fact been triggered by the use of copyrighted content in the video. [361] [362]
On April 6, 2017, YouTube announced that, to "ensure revenue only flows to creators who are playing by the rules", it would change its practices to require that a channel undergo a policy compliance review and have at least 10,000 lifetime views before it may join the Partner Program. [363]
In May 2007, YouTube launched its Partner Program (YPP), a system based on AdSense that allows video uploaders to share in the revenue produced by advertising on the site. [364] YouTube typically takes 45 percent of the advertising revenue from videos in the Partner Program, with 55 percent going to the uploader. [365] [366]
There are over two million members of the YouTube Partner Program. [367] According to TubeMogul, in 2013 a pre-roll advertisement on YouTube (one that is shown before the video starts) cost advertisers on average $7.60 per 1000 views. Usually, no more than half of the eligible videos have a pre-roll advertisement, due to a lack of interested advertisers. [368]
YouTube's policies restrict certain forms of content from being included in videos being monetized with advertising, including videos containing violence, strong language, sexual content, "controversial or sensitive subjects and events, including subjects related to war, political conflicts, natural disasters and tragedies, even if graphic imagery is not shown" (unless the content is "usually newsworthy or comedic and the creator's intent is to inform or entertain"), [369] and videos whose user comments contain "inappropriate" content. [370]
In 2013, YouTube introduced an option for channels with at least a thousand subscribers to require a paid subscription for viewers to watch their videos. [371] [372] In April 2017, YouTube set an eligibility requirement of 10,000 lifetime views for monetization. [373] On January 16, 2018, the eligibility requirement was changed to 4,000 hours of watch time within the past 12 months and 1,000 subscribers. [373] The move was seen as an attempt to ensure that monetized videos did not lead to controversy, but it was criticized for penalizing smaller YouTube channels. [374]
YouTube Play Buttons, a part of the YouTube Creator Rewards, are a recognition by YouTube of its most popular channels. [375] The trophies, made of nickel-plated copper-nickel alloy, gold-plated brass, silver-plated metal, ruby, and red-tinted crystal glass, are awarded to channels with at least one hundred thousand, one million, ten million, fifty million, and one hundred million subscribers, respectively. [376] [377]
YouTube's policies on "advertiser-friendly content" restrict what may be incorporated into videos being monetized; this includes strong violence, language, [378] sexual content, and "controversial or sensitive subjects and events, including subjects related to war, political conflicts, natural disasters and tragedies, even if graphic imagery is not shown", unless the content is "usually newsworthy or comedic and the creator's intent is to inform or entertain". [379] In September 2016, after introducing an enhanced notification system to inform users of these violations, YouTube's policies were criticized by prominent users, including Philip DeFranco and Vlogbrothers. DeFranco argued that not being able to earn advertising revenue on such videos was "censorship by a different name". A YouTube spokesperson stated that while the policy itself was not new, the service had "improved the notification and appeal process to ensure better communication to our creators". [380] [381] [382] Boing Boing reported in 2019 that LGBT keywords resulted in demonetization. [383]
As of November 2020 in the United States, and June 2021 worldwide, [384] YouTube reserves the right to monetize any video on the platform, even if its uploader is not a member of the YouTube Partner Program. This occurs on channels whose content is deemed "advertiser-friendly", and all revenue goes directly to Google without any share given to the uploader. [385]
The majority of YouTube's advertising revenue goes to the publishers and video producers who hold the rights to their videos; the company retains 45% of the ad revenue. [386] In 2010, it was reported that nearly a third of the videos with advertisements were uploaded without permission of the copyright holders. YouTube gives copyright holders the option to locate and remove their videos or to let them continue running and collect the revenue. [387] In May 2013, Nintendo began enforcing its copyright ownership and claiming the advertising revenue from video creators who posted screenshots of its games. [388] In February 2015, Nintendo agreed to share the revenue with the video creators through the Nintendo Creators Program. [389] [390] [391] On March 20, 2019, Nintendo announced on Twitter that it would end the Creators Program, and operations for the program ceased that same day. [392] [393]
YouTube has been censored, filtered, or banned for a variety of reasons. [394]
Access to specific videos is sometimes prevented due to copyright and intellectual property protection laws (e.g., in Germany), hate speech laws, and rules restricting access to videos judged inappropriate for youth, [395] a restriction YouTube itself applies through the YouTube Kids app and "restricted mode". [396] Businesses, schools, government agencies, and other institutions often block social media sites, including YouTube, due to bandwidth limitations [397] [398] and the site's potential for distraction. [394] [399]
As of 2018 [update] , public access to YouTube is blocked in many countries, including China, North Korea, Iran, Turkmenistan, [400] Uzbekistan, [401] [402] Tajikistan, Eritrea, Sudan and South Sudan. In some countries, YouTube is blocked for more limited periods, such as during periods of unrest, in the run-up to an election, or in response to upcoming political anniversaries. In cases where the entire site is banned because of one particular video, YouTube will often agree to remove or limit access to that video in order to restore service. [394]
Reports emerged that since October 2019, comments posted with Chinese characters insulting the Chinese Communist Party (共匪 "communist bandit" or 五毛 "50 Cent Party", referring to state-sponsored commentators) were being automatically deleted within 15 seconds. [403]
Google and its subsidiary companies, such as YouTube, have removed or omitted information from their services in order to comply with company policies, legal demands, and government censorship laws.