Notice and take down

Notice and take down is a process operated by online hosts in response to court orders or allegations that content is illegal, under which the content is removed by the host following notice. Notice and take down is widely operated in relation to copyright infringement, as well as for libel and other illegal content. In United States and European Union law, notice and takedown is mandated as part of limited liability, or safe harbour, provisions for online hosts (see the Digital Millennium Copyright Act 1998 and the Electronic Commerce Directive 2000). As a condition for limited liability, online hosts must expeditiously remove or disable access to content they host when they are notified of the alleged illegality. [1]

United States

The Online Copyright Infringement Liability Limitation Act, passed into law in 1998 as part of the Digital Millennium Copyright Act, provides safe harbour protection to "online service providers" for "online storage" in section 512(c). Section 512(c) applies to online service providers that store copyright-infringing material. In addition to the two general requirements that online service providers accommodate standard technical measures and terminate repeat infringers, section 512(c) also requires that the online service providers: 1) do not receive a financial benefit directly attributable to the infringing activity, 2) are not aware of the presence of infringing material or of any facts or circumstances that would make infringing material apparent, and 3) upon receiving notice from copyright owners or their agents, act expeditiously to remove the allegedly infringing material. [2]

An online service provider can be notified through the copyright owner's written notification of claimed infringement. Section 512(c) lists a number of requirements the notification must comply with, including: [3]

  - a physical or electronic signature of a person authorised to act on behalf of the copyright owner;
  - identification of the copyrighted work claimed to have been infringed;
  - identification of the allegedly infringing material and information reasonably sufficient to permit the service provider to locate it;
  - information reasonably sufficient to permit the service provider to contact the complaining party;
  - a statement of good-faith belief that the use is not authorised by the copyright owner, its agent, or the law; and
  - a statement that the information in the notification is accurate, and under penalty of perjury, that the complaining party is authorised to act on behalf of the copyright owner.

Provided the notification complies with the requirements of Section 512, the online service provider must expeditiously remove or disable access to the allegedly infringing material, otherwise the provider loses its safe harbour and is exposed to possible liability. [4]
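The notification requirements of section 512(c)(3) can be modeled as a simple completeness check. The following is a minimal sketch, not a legal tool; the field names are illustrative, not the statutory text:

```python
# Hypothetical sketch: checking that a takedown notice carries the six
# elements of 17 U.S.C. § 512(c)(3)(A). Field names are illustrative.
REQUIRED_ELEMENTS = [
    "signature",             # (i) physical or electronic signature
    "work_identified",       # (ii) identification of the copyrighted work
    "material_identified",   # (iii) identification of the infringing material
    "contact_info",          # (iv) complainant's contact information
    "good_faith_statement",  # (v) statement of good-faith belief
    "accuracy_statement",    # (vi) statement of accuracy, under penalty of perjury
]

def notice_is_complete(notice: dict) -> bool:
    """Return True if every § 512(c)(3)(A) element is present and non-empty."""
    return all(notice.get(k) for k in REQUIRED_ELEMENTS)

incomplete = {"signature": "A. Owner", "work_identified": "Song X"}
print(notice_is_complete(incomplete))  # False: four elements are missing
```

In practice a notification that "substantially" complies can still trigger obligations, so a strict field check like this is only a first-pass filter.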

The online service provider may additionally limit its liability for the removal of the material itself, as well as its liability for restoring the removed material, by complying with a counter notification process. [5] [6] In this process, the service provider must promptly inform the subscriber of the removal of the content. [7] If the subscriber then objects via a counter notification, the service provider must notify the party which filed the original notice. [8] If that party does not bring a lawsuit against the subscriber within 10 to 14 business days, the service provider must then restore the material to its location on its network. [9]

Like the original notification, the counter notification must include specific elements: [10]

  - the subscriber's physical or electronic signature;
  - identification of the removed material and its location before removal;
  - a statement under penalty of perjury that the subscriber has a good-faith belief the material was removed as a result of mistake or misidentification; and
  - the subscriber's name, address and telephone number, and consent to the jurisdiction of a federal district court. [11]

Implementing a counter notification process is not a requirement for the safe harbor protections. A service provider may decline to restore the allegedly infringing material, or to notify the subscriber at all, limiting the recourse available to the subscriber. [12]
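The restoration timeline under § 512(g)(2)(C) can be sketched in code. This is an illustrative calculation only: it counts business days as Monday to Friday and ignores public holidays, which the statute's "business days" would also exclude:

```python
from datetime import date, timedelta

def add_business_days(start: date, n: int) -> date:
    """Advance n business days (Mon-Fri), ignoring public holidays."""
    d = start
    while n > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 ... Friday=4
            n -= 1
    return d

def restoration_window(counter_notice_received: date) -> tuple[date, date]:
    """Per § 512(g)(2)(C), material is restored no less than 10 and no more
    than 14 business days after the counter notification is received,
    unless the complainant first notifies the provider of a lawsuit."""
    return (add_business_days(counter_notice_received, 10),
            add_business_days(counter_notice_received, 14))

earliest, latest = restoration_window(date(2024, 1, 2))  # a Tuesday
print(earliest, latest)  # 2024-01-16 2024-01-22
```

The window matters because it is also the period in which the original complainant must act to keep the material down.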

If a court determines that the copyright owner misrepresented the claim of copyright infringement, the copyright owner becomes liable for any damages that resulted to the online service provider from the improper removal of the material. [13] The online service provider is also required to appropriately respond to "repeat infringers", including termination of online accounts. [14] On this basis online service providers may insert clauses into user service agreements which allow them to terminate or disable user accounts following repeat infringement of copyright. [15] Some online service providers identify "repeat infringers" through repeated notice and takedown requests, while others require a determination by a court. [16]

European Union

The basis for notice and takedown procedures under EU law is article 14 of the Electronic Commerce Directive, adopted in 2000. Article 14 applies to content hosts in relation to all "illegal activity or information". Online hosts are not liable for illegal activity or information placed on their systems by users, so long as they do not have "actual knowledge" of the activity or information. Upon obtaining such knowledge, the online host must act expeditiously to remove or to disable access to the information. [16] The Directive does not set out notice and takedown procedures, but it envisaged the development of such processes, because online hosts who fail to act expeditiously upon notification lose limited liability protection. The Directive suggests that voluntary agreements between trade bodies and consumer associations could specify notice and takedown processes, and that such initiatives should be encouraged by member states. [17]

At the national level, most EU countries have no explicit rules on notice of infringement, the take-down process, or counter notice and put-back (statutory rules exist in smaller countries such as Hungary and Finland). Where explicit rules do not exist (e.g. in Germany), some aspects of notice requirements can be derived from common principles of law. [18] This lack of explicit rules results in less clarity and legal certainty than in legal regimes with statutory rules (e.g. the United States).

In October 2013, the European Court of Human Rights ruled in the Delfi AS v. Estonia case that the Estonian news website Delfi was liable for defamatory user comments on an article. The court stated that the company "should have expected offensive posts, and exercised an extra degree of caution so as to avoid being held liable for damage to an individual's reputation" and that its notice and take down comments moderation system was "insufficient for preventing harm being caused to third parties". [19] [20]

India

In India, takedown requests can be issued under Section 69A of the Information Technology Act, 2000. [21] [22]

Criticism

Notice and takedown has been criticised for over-blocking and for the removal of non-infringing content. In 2001 the Electronic Frontier Foundation launched a collaborative clearinghouse for notice and takedown requests, known as Chilling Effects. [23] Researchers have used the clearinghouse to study the use of cease-and-desist demands, primarily looking at DMCA 512 takedown notices, but also non-DMCA copyright issues and trademark claims. [24] [25] A 2005 study of the DMCA notice and take down process by Jennifer Urban and Laura Quilter of the Samuelson Law, Technology and Public Policy Clinic concluded that "some notices are sent in order to accomplish the paradigmatic goal of 512 – the inexpensive takedown of clearly infringing hosted content or links to infringing web sites". However, on the basis of data on such notices, the study concluded that the DMCA notice and take down process "is commonly used for other purposes: to create leverage in a competitive marketplace, to protect rights not given by copyright (or perhaps any other law), and to stifle criticism, commentary and fair use". [26] It would nonetheless be misleading to conclude that these problems do not arise under the E-Commerce Directive, which provides no statutory notice and take-down procedure: such chilling effects are a problem of provider liability as such. [27]

In 2007, numerous US-based online service providers hosting user-generated content implemented content recognition technology to screen uploaded content for possible copyright infringement. These content ID systems, such as the one operated by YouTube, fall outside the notice and takedown process mandated by the Digital Millennium Copyright Act. The Electronic Frontier Foundation, along with other civil society organisations, published principles on user-generated content calling for the protection of legitimate uses of copyright-protected works, prior notification of the uploader before removal or the placement of ads on the content, and use of the DMCA counter notice system, including reinstatement of content when a counter notice is filed and the copyright owner fails to bring a lawsuit. [16]

The Electronic Commerce Directive, unlike the Digital Millennium Copyright Act, did not define so-called notice and action procedures under article 14 of the Directive. Member states implemented diverging approaches on the duty to act expeditiously and on when an online host obtains "actual knowledge" of a notification. Inconsistent approaches also developed across the EU on whether online service providers, such as search engines or social media networks, fall within the article 14 definition of an online host. As a result, notice and takedown procedures are fragmented across EU member states and online hosts face considerable legal uncertainty. [28] The European Commission consulted on notice and action procedures under article 14 in 2010 and launched a new initiative in June 2012, observing that "Online intermediaries face high compliance costs and legal uncertainty because they typically have operations across Europe, but the basic rules of Article 14 are interpreted in different ways by different national courts (sometimes even within the same member state)." As part of the initiative the European Commission intended to clarify which online service providers fall within the article 14 definition of online hosts, and to assess whether different categories of illegal content require different notice and action approaches. [29] [30] The initiative appears to have come to a halt in 2013, for reasons that remain unclear. One may be the desire to avoid bad publicity, since notice and take down is associated with chilling effects on free speech, as described above. Another may be the following problem: the European Commission has made clear that it does not want to change the Electronic Commerce Directive, yet it seems impossible to provide legal certainty in the take-down process without a binding legal underpinning. [31]

Notice and stay down

The term notice and stay down refers to the concept of additionally requiring that a service, after it has received a request to take down a certain copyrighted work, must also prevent the same work from becoming available on the service again in the future. [32] [33] [34] Proposals for such concepts typically prescribe the implementation of automatic content recognition, similar to YouTube's "Content ID" system, that would proactively filter identified works and prevent them from being re-uploaded. Proposals for notice and stay down rules have been made in the United States by pro-copyright lobbyists, and the concept underpins Article 17 of the EU's Directive on Copyright in the Digital Single Market. [33] [35] [36] [34] [37]
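The stay-down idea can be illustrated with a toy filter that remembers the fingerprint of each taken-down work and rejects later uploads that match. This sketch uses an exact hash purely for illustration; real systems such as Content ID use perceptual fingerprints that also match transformed copies, which an exact hash does not:

```python
import hashlib

class StayDownFilter:
    """Toy illustration of 'stay down': once a work is taken down, its
    fingerprint is remembered and identical re-uploads are rejected."""

    def __init__(self) -> None:
        self.blocked: set[str] = set()

    def take_down(self, content: bytes) -> None:
        # Record an exact SHA-256 fingerprint of the removed work.
        self.blocked.add(hashlib.sha256(content).hexdigest())

    def allow_upload(self, content: bytes) -> bool:
        return hashlib.sha256(content).hexdigest() not in self.blocked

f = StayDownFilter()
f.take_down(b"infringing video bytes")
print(f.allow_upload(b"infringing video bytes"))  # False: re-upload blocked
print(f.allow_upload(b"different work"))          # True
```

The gap between this naive matcher and a real perceptual matcher is precisely where the policy debate sits: exact matching is trivially evaded, while fuzzy matching invites the false positives discussed below.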

The concept of notice and stay down has faced criticism; it has been noted that the only way to reliably enforce such an obligation would be through automatic filtering, which is subject to the possibility of false positives, and the inability to detect lawful uses of an affected work (such as fair use). The Electronic Frontier Foundation argued that requiring proactive monitoring of user content would place the burden of copyright enforcement on service providers (thus defeating the purpose of safe harbors), and would be too costly for newly-established companies (thus bolstering incumbents and stifling innovation). [32] [33]

The implementation of Article 17 adopted by the German parliament includes safe harbour provisions intended to prevent false positives in situations "presumably authorised by law" (such as fair dealing rights), including that filters should not be applied automatically if an upload's use of copyrighted material is "minor" (defined as 160 characters of text, 125 kilobytes of image data, or video clips up to 15 seconds), in combination with other content, and using less than 50% of the original work. However, copyright holders may still oppose such use and issue takedowns, and providers must still provide "appropriate remuneration" to the copyright holder. [38] [39]
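The German thresholds can be expressed as a simple check. This is a sketch of the quantitative conditions described above only (the actual statute contains further conditions and exceptions), and the parameter names are illustrative:

```python
# Sketch of the "minor use" thresholds in the German implementation of
# Article 17: up to 160 characters of text, 125 kilobytes of image data,
# or video clips up to 15 seconds, combined with other content, and using
# less than 50% of the original work. Parameter names are illustrative.
LIMITS = {"text_chars": 160, "image_kb": 125, "video_s": 15}

def is_minor_use(kind: str, amount: float, fraction_of_original: float,
                 combined_with_other_content: bool) -> bool:
    """Return True if an upload's use falls within the 'minor use' thresholds."""
    return (amount <= LIMITS[kind]
            and fraction_of_original < 0.5
            and combined_with_other_content)

print(is_minor_use("video_s", 12, 0.2, True))  # True: within all thresholds
print(is_minor_use("video_s", 30, 0.2, True))  # False: clip exceeds 15 seconds
```

Uses that pass such a check are "presumably authorised" and exempt from automatic filtering, but, as noted above, rights holders can still contest them after the fact.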

Related Research Articles

An Act to amend the Copyright Act was a proposed law to amend the Copyright Act initiated by the Government of Canada in the First Session of the Thirty-Eighth Parliament. Introduced by the Minister of Canadian Heritage and Minister responsible for Status of Women Liza Frulla and then Minister of Industry David Emerson as An Act to Amend the Copyright Act, it received its First Reading in the House of Commons of Canada on June 20, 2005. On November 29, 2005, the opposition to the government tabled a non-confidence motion which passed, dissolving Parliament and effectively killing the bill. The subsequent government tabled a similar bill called C-61.

Online Policy Group v. Diebold, Inc.

Online Policy Group v. Diebold, Inc., 337 F. Supp. 2d 1195, was a lawsuit involving an archive of Diebold's internal company e-mails and Diebold's contested copyright claims over them. The Electronic Frontier Foundation and the Stanford Cyberlaw Clinic provided pro bono legal support for the non-profit ISP and the Swarthmore College students, respectively.

Electronic Commerce Directive 2000 (Directive of the European Parliament)

The Electronic Commerce Directive in EU law sets up an Internal Market framework for online services. Its aim is to remove obstacles to cross-border online services in the EU internal market and provide legal certainty for businesses and consumers. It establishes harmonized rules on issues such as the transparency and information requirements for online service providers; commercial communications; and electronic contracts and limitations of liability of intermediary service providers. Finally, the Directive encourages the drawing up of voluntary codes of conduct and includes articles to enhance cooperation between Member States.

Online Copyright Infringement Liability Limitation Act (1998 U.S. federal law)

The Online Copyright Infringement Liability Limitation Act (OCILLA) is United States federal law that creates a conditional 'safe harbor' for online service providers (OSP), a group which includes Internet service providers (ISP) and other Internet intermediaries, by shielding them for their own acts of direct copyright infringement as well as shielding them from potential secondary liability for the infringing acts of others. OCILLA was passed as a part of the 1998 Digital Millennium Copyright Act (DMCA) and is sometimes referred to as the "Safe Harbor" provision or as "DMCA 512" because it added Section 512 to Title 17 of the United States Code. By exempting Internet intermediaries from copyright infringement liability provided they follow certain rules, OCILLA attempts to strike a balance between the competing interests of copyright owners and digital users.

Digital Millennium Copyright Act (United States copyright law)

The Digital Millennium Copyright Act (DMCA) is a 1998 United States copyright law that implements two 1996 treaties of the World Intellectual Property Organization (WIPO). It criminalizes production and dissemination of technology, devices, or services intended to circumvent measures that control access to copyrighted works. It also criminalizes the act of circumventing an access control, whether or not there is actual infringement of copyright itself. In addition, the DMCA heightens the penalties for copyright infringement on the Internet. Passed on October 12, 1998, by a unanimous vote in the United States Senate and signed into law by President Bill Clinton on October 28, 1998, the DMCA amended Title 17 of the United States Code to extend the reach of copyright, while limiting the liability of the providers of online services for copyright infringement by their users.

Perfect 10, Inc. v. CCBill, LLC

Perfect 10, Inc. v. CCBill LLC, 488 F.3d 1102, is a U.S. court case between a publisher of an adult entertainment magazine and webhosting, connectivity, and payment service companies. The plaintiff Perfect 10 asserted that defendants CCBill and CWIE violated copyright and trademark law, as well as state laws on right of publicity, unfair competition, and false and misleading advertising, by providing services to websites that posted images stolen from Perfect 10's magazine and website. Defendants sought to invoke statutory safe harbor exemptions from copyright infringement liability under the Digital Millennium Copyright Act, 17 U.S.C. § 512, and from liability for the state law unfair competition, false advertising and right of publicity claims under Section 230 of the Communications Decency Act, 47 U.S.C. § 230(c)(1).

CoStar Group, Inc. v. LoopNet, Inc.

CoStar Group, Inc. v. LoopNet, Inc., 373 F.3d 544, is a United States Court of Appeals for the Fourth Circuit decision about whether LoopNet should be held directly liable for CoStar Group’s copyrighted photographs posted by LoopNet’s subscribers on LoopNet’s website. The majority of the court ruled that since LoopNet was an Internet service provider ("ISP") that automatically and passively stored material at the direction of users, LoopNet did not copy the material in violation of the Copyright Act. The majority of the court also held that the screening process by a LoopNet employee before the images were stored and displayed did not alter the passivity of LoopNet. Judge Gregory dissented, stating that LoopNet had engaged in active, volitional conduct because of its screening process.

Lenz v. Universal Music Corp. (U.S. District Court copyright case)

Lenz v. Universal Music Corp., 801 F.3d 1126, is a decision by the United States Court of Appeals for the Ninth Circuit, holding that copyright owners must consider fair use defenses and good faith activities by alleged copyright infringers before issuing takedown notices for content posted on the Internet.

IO Group, Inc. v. Veoh Networks, Inc. (2008 US District Court case)

IO Group, Inc. v. Veoh Networks, Inc., 586 F. Supp. 2d 1132, is an American legal case involving an internet television network named Veoh that allowed users of its site to stream various films of the adult entertainment producer IO Group. The United States District Court for the Northern District of California ruled that Veoh qualified for the safe harbors provided by the Digital Millennium Copyright Act (DMCA), 17 U.S.C. § 512 (2006). According to commentators, this case could foreshadow the resolution of Viacom v. YouTube.

Hotfile (file hosting website)

Hotfile was a one-click file hosting website founded by Hotfile Corp in 2006 in Panama City, Panama. On December 4, 2013, Hotfile ceased all operations, the same day as signing a $4 million settlement with the Motion Picture Association of America (MPAA); the settlement had previously been misreported as $80 million.

Capitol Records, Inc. v. MP3Tunes, LLC (2011 US legal case)

Capitol Records, Inc. v. MP3tunes, LLC is a 2011 case from the United States District Court for the Southern District of New York concerning copyright infringement and the Digital Millennium Copyright Act (DMCA). In the case, EMI Music Group and fourteen other record companies claimed copyright infringement against MP3tunes, which provides online music storage lockers, and MP3tunes's founder, Michael Robertson. In a decision that has ramifications for the future of online locker services, the court held that MP3tunes qualifies for safe harbor protection under the DMCA. However, the court found MP3tunes to still be liable for contributory copyright infringement in this case due to its failure to remove infringing songs after receiving takedown notices. The court also held that Robertson is liable for songs he personally copied from unauthorized websites.

Flava Works Inc. v. Gunter (2012 US decision on copyright infringement)

Flava Works, Inc v. Gunter, 689 F.3d 754, is a decision by the United States Seventh Circuit Court of Appeals, authored by Judge Richard Posner, which held that Marques Gunter, the sole proprietor of the site myVidster.com, a social bookmarking website that enables its users to share videos posted elsewhere online through embedded frames, was not liable for its users' sharing and embedding of copyrighted videos. The court of appeals reversed the decision of the United States District Court for the Northern District of Illinois, which had granted a preliminary injunction against myVidster, citing sufficient knowledge of infringement on Gunter's part, while denying safe harbor defense under the Digital Millennium Copyright Act (DMCA). The Court held that Gunter was not directly liable because the copyrighted content was not stored on myVidster's servers, and was not contributorily liable because there was no evidence that conduct by myVidster increased the amount of infringement.

A Notice of Claimed Infringement or NOCI is a notice from the owner of copyrighted material to an online service provider. The notice identifies the copyrighted material, alleges unauthorized use, and demands expeditious removal. By complying with the demand, the online service provider is relieved of responsibility for the infringing activity of its users.

Ouellette v. Viacom International Inc. (US legal case)

Ouellette v. Viacom, No. 9:10-cv-00133; 2011 WL 1882780, found the safe harbor provision of the Digital Millennium Copyright Act (DMCA) did not create liability for service providers that take down non-infringing works. This case limited the claims that can be filed against service providers by establishing immunity for service providers' takedown of fair use material, at least from grounds under the DMCA. The court left open whether another "independent basis of liability" could serve as legal grounds for an inappropriate takedown.

Amaretto Ranch Breedables, LLC v. Ozimals, Inc.

Amaretto Ranch Breedables, LLC v. Ozimals, Inc. was a copyright case in the United States District Court for the Northern District of California involving a DMCA takedown notice dispute between companies that produce virtual animals on Second Life. Ozimals filed a DMCA takedown notice to Linden Research, the makers of Second Life, claiming that Amaretto's horse infringed on their bunnies and demanding their removal. Consequently, Amaretto responded with a counter-DMCA notice and applied to the court for a temporary restraining order to forbid Linden Research from removing their virtual horses. This was granted and held in effect as the case proceeded. Amaretto claimed in court that Ozimals' DMCA notice was copyright misuse and asked for a declaration that its horses did not infringe copyright. Ozimals counterclaimed for copyright infringement. The court eventually dismissed both claims.

Columbia Pictures Industries, Inc. v. Fung

Columbia Pictures Industries, Inc. v. Fung 710 F.3d 1020 No. 10-55946, was a United States Court of Appeals for the Ninth Circuit case in which seven film studios including Columbia Pictures Industries, Inc., Disney and Twentieth Century Fox sued Gary Fung, the owner of isoHunt Web Technologies, Inc., for contributory infringement of their copyrighted works. The panel affirmed in part and vacated in part the decision of United States District Court for the Central District of California that the services and websites offered by isoHunt Web Technologies allowed third parties to download infringing copies of Columbia's works. Ultimately, Fung had "red flag knowledge" of the infringing activity on his systems, and therefore IsoHunt was held ineligible for the Digital Millennium Copyright Act § 512(c) safe harbor.

UMG Recordings, Inc. v. Shelter Capital Partners LLC (United States Court of Appeals for the Ninth Circuit case)

UMG Recordings, Inc. v. Shelter Capital Partners LLC, 667 F.3d 1022 No. 09-55902, was a United States Court of Appeals for the Ninth Circuit case in which UMG sued video-sharing website Veoh, alleging that Veoh committed copyright infringement by hosting user-uploaded videos copyrighted by UMG. The Ninth Circuit upheld the decision of the United States District Court for the Central District of California that Veoh is protected under the Digital Millennium Copyright Act's safe harbor provisions. It was established that service providers are "entitled to broad protection against copyright infringement liability so long as they diligently remove infringing material upon notice of infringement".

Wolk v. Kodak Imaging Network, Inc.

Wolk v. Kodak Imaging Network, Inc., 840 F. Supp. 2d 724, was a United States district court case in which the visual artist Sheila Wolk brought suit against Kodak Imaging Network, Inc., Eastman Kodak Company, and Photobucket.com, Inc. for copyright infringement. Users uploaded Wolk's work to Photobucket, a user-generated content provider, which had a revenue sharing agreement with Kodak that permitted users to use Kodak Gallery to commercially print (photofinish) images from Photobucket's site—including unauthorized copies of Wolk's artwork.

Capitol Records, LLC v. Vimeo, LLC (2013 US District Court case)

Capitol Records, LLC v. Vimeo, LLC, 972 F. Supp. 2d 500, 972 F. Supp. 2d 537, was a 2013 copyright infringement case out of the United States District Court for the Southern District of New York. The decision resolved cross-motions for summary judgment filed by a video-sharing service (Vimeo) and a pair of record labels. Vimeo sought a ruling that, as a matter of law, it was entitled to safe harbor protection under the Digital Millennium Copyright Act (DMCA) as to a series of copyrighted videos that were uploaded to its platform; the record labels sought the opposite ruling.

Contributory copyright infringement is a way of imposing secondary liability for infringement of a copyright. It is a means by which a person may be held liable for copyright infringement even though he or she did not directly engage in the infringing activity. It is one of the two forms of secondary liability, the other being vicarious liability. Contributory infringement is understood to be a form of infringement in which a person does not directly violate a copyright but induces or authorizes another person to directly infringe it.

References

  1. The Role of Internet Intermediaries in Advancing Public Policy Objectives. OECD Publishing. 4 October 2011. p. 144. ISBN 9789264115637.
  2. 17 U.S.C. § 512(c)
  3. 17 U.S.C. § 512(c)(3)(A)(i)-(vi)
  4. 17 U.S.C. § 512(c)(1)(C)
  5. 17 U.S.C. § 512(g)(1)
  6. 17 U.S.C. § 512(g)(4)
  7. 17 U.S.C. § 512(g)(2)(A)
  8. 17 U.S.C. § 512(g)(2)
  9. 17 U.S.C. § 512(g)(2)(C)
  10. 17 U.S.C. § 512(g)(3)
  11. 17 U.S.C. § 512(g)(3)(A)-(D)
  12. Bridy, Annemarie and Keller, Daphne (31 March 2016). U.S. Copyright Office Section 512 Study: Comments in Response to Notice of Inquiry. p. 29.
  13. 17 U.S.C. § 512(f)
  14. Reid, Amanda (2019). "Considering Fair Use: DMCA's Take Down & Repeat Infringers Policies". Communication Law & Policy. 24: 101–141. doi:10.1080/10811680.2018.1551036. SSRN 3348562.
  15. Reid, Amanda (2021). "Readability, Accessibility & Clarity: An Analysis of DMCA Repeat Infringer Policies". Jurimetrics. 61: 405–441. SSRN 3921231.
  16. The Role of Internet Intermediaries in Advancing Public Policy Objectives. OECD Publishing. 4 October 2011. p. 146. ISBN 9789264115637.
  17. E-Copyright Law Handbook. Aspen Publishers Online. 2002. pp. 13–53. ISBN 9780735529441.
  18. Holznagel, Daniel (2013). Notice and Take-Down-Verfahren als Teil der Providerhaftung [Notice and Takedown Procedures as a Part of Provider Liability]. Mohr Siebeck. pp. 75–83, 125–220. ISBN 978-3-16-152667-1.
  19. "European Court strikes serious blow to free speech online". Article 19, 14 October 2013.
  20. Sunyer, John (23 May 2014). "The threat facing online comments". Financial Times.
  21. Mandavia, Megha (2 October 2019). "India sent most takedown requests to social media companies: Research". The Economic Times.
  22. "How India's Data Requests From Tech Giants Have Skyrocketed Over The Years". Inc42 Media. 12 November 2019.
  23. Gallagher, David (22 April 2002). "New Economy; A copyright dispute with the Church of Scientology is forcing Google to do some creative linking". New York Times. Retrieved 2011-04-07.
  24. Urban, Jennifer and Quilter, Laura (March 2006). "Efficient Process or 'Chilling Effects'? Takedown Notices Under Section 512 of the Digital Millennium Copyright Act". Santa Clara Computer & High Technology Law Journal.
  25. "Will Fair Use Survive? Free Expression in the Age of Copyright Control" (2005) (PDF). Free Expression Policy Project. Archived 2005-12-08 at the Wayback Machine.
  26. Rimmer, Matthew (2007). Digital Copyright and the Consumer Revolution: Hands Off My iPod. Edward Elgar Publishing. p. 191. ISBN 978-1-84542-948-5.
  27. Holznagel, Daniel (2014). "Melde- und Abhilfeverfahren zur Beanstandung rechtswidrig gehosteter Inhalte" [Notice and Remedy Procedures for Objecting to Unlawfully Hosted Content]. GRUR Int (2/2014), C.H. Beck. pp. 105–113.
  28. "German News Article Removed From Search Results After DMCA Complaint". EDRi. 20 June 2012. Retrieved 27 August 2012.
  29. Gothard, Peter (7 June 2012). "European Commission to more closely define online content 'host'". computing.co.uk. Retrieved 27 August 2012.
  30. "European Commission consults on notice and take-down procedures for online content". Practical Law Company. 4 June 2012. Retrieved 27 August 2012.
  31. Holznagel, Daniel (2014). "Melde- und Abhilfeverfahren zur Beanstandung rechtswidrig gehosteter Inhalte" [Notice and Remedy Procedures for Objecting to Unlawfully Hosted Content]. GRUR Int (2014), C.H. Beck. pp. 105–113.
  32. Sprigman, Chris and Lemley, Mark (21 June 2016). "Why notice-and-takedown is a bit of copyright law worth saving". Los Angeles Times. Retrieved 2018-06-23.
  33. Harmon, Elliot (2016-01-21). ""Notice-and-Stay-Down" Is Really "Filter-Everything"". Electronic Frontier Foundation. Retrieved 2018-06-22.
  34. "The Rebranding Of SOPA: Now Called 'Notice And Staydown'". Techdirt. Retrieved 2018-06-22.
  35. Romero-Moreno, Felipe (2018-05-29). "'Notice and staydown' and social media: amending Article 13 of the Proposed Directive on Copyright". International Review of Law, Computers & Technology. 33 (2): 187–210. doi:10.1080/13600869.2018.1475906. hdl:2299/21370. ISSN 1360-0869.
  36. Dredge, Stuart (2016-03-24). "British music labels demand 'notice and stay down' piracy policy from Google". The Guardian. Retrieved 2018-06-22.
  37. Romero-Moreno, Felipe (17 March 2020). "'Upload filters' and human rights: implementing Article 17 of the Directive on Copyright in the Digital Single Market". International Review of Law, Computers & Technology. 34 (2): 153–182. doi:10.1080/13600869.2020.1733760. hdl:2299/20431. ISSN 1360-0869.
  38. "German 'Upload Filter' Law Sets Standards to Prevent Overblocking". TorrentFreak. Retrieved 2021-05-26.
  39. Schmon, Christoph (2021-02-26). "From Creativity to Exclusivity: The German Government's Bad Deal for Article 17". Electronic Frontier Foundation. Retrieved 2021-05-26.