In May 2013, the German Federal Court of Justice held that Google's predictions within the autocomplete function of its web search engine can violate the right of personality. [1] The right of personality ensures that a person's (or even a company's [2] ) personality (reputation) is respected and can be freely developed. [3] In principle, only the individual decides how he or she wants to present himself or herself to third parties and the public. [4]
A stock corporation, which sold food supplements and cosmetics online, and its chairman filed an action for an injunction and financial compensation against Google based on a violation of their right of personality. [6] Google runs a web search engine under the domain "www.google.de" (among others), which allows Internet users to search for information online and access third party content through a list of search results.
In 2009, Google implemented a so-called "autocomplete" function which, while the user types a search term into the search box, displays word combinations as predictions for the user's search. These predictions are based on an algorithm which evaluates how often other users have searched for specific terms. When users typed the full name of the chairman into the search engine in May 2010, the autocomplete function showed the predictions "Betrug" (fraud) and "Scientology". The claimants stated that the chairman had no connection to Scientology and was under no investigation for fraud. Furthermore, they argued that no search result showed a connection between the chairman and fraud or Scientology. They therefore saw these predictions as a violation of their right of personality.
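The kind of algorithm described above, which ranks predictions by how often other users have searched for terms beginning with the typed prefix, can be pictured with a minimal sketch. All names and data here are hypothetical illustrations; Google's actual system is far more complex.

```python
# Minimal sketch of a frequency-based autocomplete: predictions are ranked
# by how often other users have searched for queries starting with the
# typed prefix. Purely illustrative; not Google's implementation.
from collections import Counter


class Autocomplete:
    def __init__(self):
        self.query_counts = Counter()  # how often each full query was searched

    def record_search(self, query: str) -> None:
        self.query_counts[query.lower()] += 1

    def predict(self, prefix: str, limit: int = 3) -> list:
        prefix = prefix.lower()
        matches = [(q, n) for q, n in self.query_counts.items()
                   if q.startswith(prefix)]
        matches.sort(key=lambda item: -item[1])  # most frequent first
        return [q for q, _ in matches[:limit]]


ac = Autocomplete()
for q in ["berlin weather", "berlin weather", "berlin wall", "munich weather"]:
    ac.record_search(q)
print(ac.predict("berlin"))  # → ['berlin weather', 'berlin wall']
```

Because the ranking is driven purely by the search behaviour of other users, defamatory word combinations can surface automatically once enough people search for them, which is precisely what gave rise to the dispute.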
The Regional Court Cologne decided in favour of Google and dismissed the case as unfounded. [7] The Higher Regional Court Cologne upheld this judgement. [8] The claimants then appealed to the German Federal Court of Justice.
The German Federal Court of Justice set aside the judgement of the Higher Regional Court Cologne and referred the case back to that court. [10]
The Federal Court of Justice held that the predictions had to be regarded as Google's own content and that a search engine provider, while not obliged to monitor software-generated predictions in advance, is responsible once it becomes aware that a prediction violates a person's right of personality.
In April 2014, the Higher Regional Court Cologne then decided in favour of the claimants insofar as they objected to the additional term "Scientology", which Google initially refused to remove. [20] Financial compensation was not awarded because Google removed the entry later (about one and a half years after the objection) and thereby limited the infringement. [21] Because Google removed the additional term "Betrug" (fraud) immediately after the claimants' first objection, this part of the claim was unfounded. [22]
Some legal scholars argued that the judgement established a reasonable balance between the protection of the right of personality (through Google's obligation to remove and prevent infringing predictions after a notice), Google's interest in still providing the autocomplete function (without having to monitor all predictions), and Internet users' interest in benefiting from the improved search. [23]
The court's decision that a search engine provider has no obligation to monitor software-generated predictions in advance and is only responsible once it becomes aware of the violation corresponds with the court's previous judgements [24] on the "Stoererhaftung" (interferer's liability) of a host provider for content that third parties posted on the host provider's website. [25] However, because those previous judgements concerned liability for third-party content, others argued that basing the autocomplete judgement on Google being an interferer ("Störer") within the "Stoererhaftung" – and not a perpetrator – contradicts the court's statement that the predictions have to be seen as Google's own content. [26]
Moreover, the judgement raises the question of how a trade-off between Google's freedom of expression and commercial freedom and another person's right of personality would be resolved in other scenarios. [27] Depending on the specific circumstances, it could be more difficult to assess whether a prediction is false, or true but nevertheless less worthy of protection than the right of personality (e.g. where an investigation for a crime – such as fraud – has already been opened, or where a person is actually the victim of a crime). [28]
Another interesting issue is the extent to which Google is capable of legally evaluating and processing notifications by alleged victims of an infringement. [29] The current legal situation could be an incentive for Google simply to remove any prediction after a complaint in order to avoid liability. [30]
This judgement was not the only time a possible defamation by Google's autocomplete function was discussed in a courtroom. In Germany, Bettina Wulff, the wife of the former President of the Federal Republic of Germany Christian Wulff, filed an action for an injunction against Google at the Regional Court Hamburg regarding 43 predictions, based on a violation of her right of personality. [31] The word combinations included the words "Escort" (escort) and "Prostituierte" (prostitute). [32] However, in January 2015, Google deleted these predictions and the parties settled the lawsuit. [33] By taking legal action against Google, Bettina Wulff probably also caused a so-called "Streisand effect", because many people learned of the predictions for the first time through the media attention the lawsuit created. [34]
In France, in 2010, the Superior Court of Paris ordered Google to cease suggesting certain predictions, including "rapist", "satanist", "rape", and "prison", to Internet users who searched for a man's name. [35] The man, convicted of "corruption of a minor" at the time, was still appealing his conviction. [36] In Italy, a businessman filed a defamation suit because of the terms "truffatore" (conman) and "truffa" (fraud) that were added to his name by the autocomplete function. [37] The Milan court ordered Google to remove these predictions in 2011. [38] Furthermore, in 2012, the Supreme Court of Victoria in Melbourne, Australia held Google liable for defamation for wrongly linking a private person to crimes of which he was in fact a victim, and awarded $200,000 in damages. [39] [40] Moreover, in 2013, the Tokyo District Court in Japan also ordered Google to modify its predictions and pay 300,000 yen (about $3,100) in damages to a man who was linked to crimes he did not commit. [41]
However, Google's autocomplete function has not only been the subject of defamation suits. In another case, French human rights organisations (including SOS Racisme) sued Google for adding the word "juif" (Jewish) to the names of celebrities within its predictions. [42] The human rights organisations argued that Google provided "ethnic files" by suggesting these predictions, which is forbidden in France. [43] The parties settled in 2012 without revealing the details of the settlement. [44]
Today, Google provides an online form that allows Internet users to report an (allegedly) infringing prediction within the autocomplete function. [45]
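The notice-based removal process described above, in which a provider suppresses a prediction once it has been notified of an infringement, can be pictured as a simple filter applied before predictions are shown. The class and method names below are illustrative assumptions, not Google's implementation.

```python
# Illustrative sketch (not Google's implementation) of notice-based removal:
# once the provider has been notified that a prediction is infringing and
# has found the notice justified, the prediction is added to a blocklist
# and suppressed from future suggestions.
class PredictionFilter:
    def __init__(self):
        self.blocklist = set()

    def report_infringement(self, prediction: str) -> None:
        # Called after a notice has been reviewed and found justified.
        self.blocklist.add(prediction.lower())

    def filter(self, predictions: list) -> list:
        # Drop any prediction that has been blocked, case-insensitively.
        return [p for p in predictions if p.lower() not in self.blocklist]


f = PredictionFilter()
f.report_infringement("john doe fraud")
print(f.filter(["john doe fraud", "john doe company"]))  # → ['john doe company']
```

A design like this reflects the court's balance: no prior monitoring of the generated predictions, but reliable suppression once a justified complaint has been processed.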
The relevance of this judgement goes beyond the autocomplete function itself, because it can be seen as a precedent on the question whether algorithms can make defamatory statements. [46] With artificial intelligence [47] and robots becoming more and more widespread in our society, future scenarios in which liability for their actions has to be discussed seem likely. [48]