SafeSearch

SafeSearch is a feature of Google Search and Google Images, and later Bing, that acts as an automated filter for pornography and other potentially offensive or inappropriate content. [1] [2]

On November 11, 2009, Google introduced the ability for users with Google Accounts to lock the SafeSearch level in Google's web and image searches. Once configured, a password is required to change the setting. [1]

On December 12, 2012, Google removed the option to turn off the filter entirely, requiring users to enter more specific search queries to access adult content. [3] [4] [5]

Local area network administrators and ISPs can enforce SafeSearch by adding a DNS record, as sketched below. This is often done on school networks to prevent students from accessing pornographic content. [6] [7]
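Google's published guidance for network administrators describes the mechanism as answering lookups for www.google.com with the address of forcesafesearch.google.com, typically via a CNAME record on the network's resolver. The following is a minimal, illustrative Python sketch (standard library only) of how an administrator might check whether such an override is in effect on the local network; the function name is made up for this example, and treating a shared address as proof of enforcement is a simplifying assumption.

    import socket

    # Hostnames from Google's guidance for administrators: answering lookups
    # for www.google.com with the address of forcesafesearch.google.com
    # forces SafeSearch for every client that uses the network's resolver.
    FORCED_HOST = "forcesafesearch.google.com"
    SEARCH_HOST = "www.google.com"

    def safesearch_override_present() -> bool:
        """Rough check: does www.google.com resolve to an address shared
        with the forced-SafeSearch host on this network's resolver?"""
        try:
            forced_ips = set(socket.gethostbyname_ex(FORCED_HOST)[2])
            search_ips = set(socket.gethostbyname_ex(SEARCH_HOST)[2])
        except socket.gaierror:
            return False  # lookup failed, so nothing can be concluded
        return bool(forced_ips & search_ips)

    if __name__ == "__main__":
        print("SafeSearch DNS override detected:", safesearch_override_present())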

Users themselves can turn this setting on to filter out any inappropriate content. [8]

Effectiveness

A report by Harvard Law School's Berkman Center for Internet & Society stated that SafeSearch excluded many innocuous websites from search-result listings, including ones created by the White House, IBM, the American Library Association and Liz Claiborne. [9] On the other hand, many pornographic images slip through the filter even when "innocent" search terms are entered. Blacklisting certain search terms is hindered by homographs (e.g., "beaver"), [10] blacklisting specific URLs is undermined by porn sites' frequently changing addresses, and software that tags images containing large amounts of flesh tone as pornographic is problematic because skin tones vary widely and pictures of babies tend to contain a great deal of flesh tone. [11] Google's ability to filter porn has been an important factor in its relationship with the People's Republic of China. [12]
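To make the flesh-tone weakness concrete, the following is a purely illustrative Python sketch of the kind of naive skin-tone heuristic such software relies on; the specific RGB rule, threshold and function names are assumptions made for this example and are not drawn from the cited sources or from any published Google classifier.

    def looks_like_skin(r: int, g: int, b: int) -> bool:
        # One widely cited rule of thumb for (light) skin tones in RGB space;
        # purely illustrative, not any search engine's actual classifier.
        return r > 95 and g > 40 and b > 20 and r > g and r > b and abs(r - g) > 15

    def skin_fraction(pixels) -> float:
        """Fraction of (r, g, b) pixels matching the rule above."""
        pixels = list(pixels)
        if not pixels:
            return 0.0
        return sum(looks_like_skin(*p) for p in pixels) / len(pixels)

    # A filter that flags any image whose skin_fraction exceeds some threshold
    # (say 0.4) will flag many baby photos while missing images whose subjects
    # fall outside the rule's narrow colour range -- the weakness noted above.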

Related Research Articles

An Internet filter is software that restricts or controls the content an Internet user is able to access, especially when used to restrict material delivered over the Internet via the Web, email, or other means. Such restrictions can be applied at various levels: a government can attempt to apply them nationwide, or they can, for example, be applied by an Internet service provider to its clients, by an employer to its personnel, by a school to its students, by a library to its visitors, by a parent to a child's computer, or by an individual user to their own computer. The motive is often to prevent access to content which the computer's owner(s) or other authorities may consider objectionable. When imposed without the consent of the user, content control can be characterised as a form of Internet censorship. Some filtering software includes time-control functions that allow parents to set the amount of time a child may spend accessing the Internet, playing games, or doing other computer activities.

A whitelist or allowlist is a list or register of entities that are being provided a particular privilege, service, mobility, access or recognition. Entities on the list will be accepted, approved and/or recognized. Whitelisting is the reverse of blacklisting, the practice of identifying entities that are denied, unrecognised, or ostracised.

The Australian Communications and Media Authority (ACMA) is an Australian government statutory authority within the Communications portfolio. ACMA was formed on 1 July 2005 with the merger of the Australian Broadcasting Authority and the Australian Communications Authority.

Internet censorship in Australia is enforced both through the country's criminal law and through measures voluntarily adopted by Internet service providers. The Australian Communications and Media Authority (ACMA) has the power to enforce content restrictions on Internet content hosted within Australia and to maintain a blocklist of overseas websites, which is then provided for use in filtering software. The restrictions focus primarily on child pornography, sexual violence, and other illegal activities, and the list is compiled through a consumer complaints process.

<span class="mw-page-title-main">Bing Videos</span> Microsofts Video Search via Bing

Bing Videos is a video search service and part of Microsoft's Bing search engine. The service enables users to search and view videos across various websites. Bing Videos was officially released on September 26, 2007 as Live Search Video, and rebranded as Bing Videos on June 1, 2009.

Google and its subsidiary companies, such as YouTube, have removed or omitted information from their services in order to comply with company policies, legal demands, and government censorship laws.

<span class="mw-page-title-main">Google Images</span> Image search engine by Google Inc.

Google Images is a search engine owned by Google that allows users to search the World Wide Web for images. It was introduced on July 12, 2001, due to demand for pictures of the green Versace dress that Jennifer Lopez wore in February 2000. In 2011, reverse image search functionality was added.

Microsoft family features is a free set of features available on Windows 10 PC and Mobile, bundled with the Windows 10 Home edition of the operating system. On July 17, 2020, Microsoft also released Microsoft Family Safety on Google Play and the App Store (iOS). Starting with Windows 10, a Microsoft account is required to use the Microsoft family features. A parent can manage settings for a child if both of their Microsoft accounts are in the same family; when parents turn on settings for their child, those settings are applied to every device the child logs into with that Microsoft account.

<span class="mw-page-title-main">Internet censorship</span> Legal control of the internet

Internet censorship is the legal control or suppression of what can be accessed, published, or viewed on the Internet. Censorship is most often applied to specific internet domains but exceptionally may extend to all Internet resources located outside the jurisdiction of the censoring state. Internet censorship may also put restrictions on what information can be made internet accessible. Organizations providing internet access – such as schools and libraries – may choose to preclude access to material that they consider undesirable, offensive, age-inappropriate or even illegal, and regard this as ethical behavior rather than censorship. Individuals and organizations may engage in self-censorship of material they publish, for moral, religious, or business reasons, to conform to societal norms, political views, due to intimidation, or out of fear of legal or other consequences.

Internet censorship in the United Kingdom is conducted under a variety of laws, judicial processes, administrative regulations and voluntary arrangements. It is achieved by blocking access to sites as well as the use of laws that criminalise publication or possession of certain types of material. These include English defamation law, the Copyright law of the United Kingdom, regulations against incitement to terrorism and child pornography.

<span class="mw-page-title-main">Lapsiporno.info</span> Finnish website

Lapsiporno.info is a Finnish website opposed to Internet censorship. The website was founded and is maintained by software developer, researcher and Internet activist Matti Nikki, who previously attracted international attention by analyzing Sony BMG's digital rights management rootkit, which the company's products automatically installed on users' computers. The website focuses on Internet censorship in Finland, its effectiveness, and the issues and problems related to it.

<span class="mw-page-title-main">Tumblr</span> Microblogging and social networking website

Tumblr is a microblogging and social networking website founded by David Karp in 2007 and currently owned by American company Automattic. The service allows users to post multimedia and other content to a short-form blog.

<span class="mw-page-title-main">Xtube</span> Pornographic video hosting and social networking site

Xtube was a Canadian pornographic video hosting service and social networking site based in Toronto, Ontario. It was established in 2006 and was notable for being the first adult community site to allow users to upload and share adult videos. Xtube was not a producer of pornography; instead, it provided a platform for content uploaded by users. User-submitted content included pornographic videos, webcam models, pornographic photographs, and erotic literature, and the site incorporated social networking features.

<span class="mw-page-title-main">Internet Watch Foundation</span> Registered charity in Cambridge, England

The Internet Watch Foundation (IWF) is a global registered charity based in Cambridge, England. It states that its remit is "to minimise the availability of online sexual abuse content, specifically child sexual abuse images and videos hosted anywhere in the world and non-photographic child sexual abuse images hosted in the UK." Content inciting racial hatred was removed from the IWF's remit after a police website was set up for the purpose in April 2011. The IWF used to also take reports of criminally obscene adult content hosted in the UK. This was removed from the IWF's remit in 2017. As part of its function, the IWF says that it will "supply partners with an accurate and current URL list to enable blocking of child sexual abuse content". It has "an excellent and responsive national Hotline reporting service" for receiving reports from the public. In addition to receiving referrals from the public, its agents also proactively search the open web and deep web to identify child sexual abuse images and videos. It can then ask service providers to take down the websites containing the images or to block them if they fall outside UK jurisdiction.

Amateur pornography is a category of pornography that features models, actors or non-professionals performing without pay, or actors for whom this material is not their only paid modeling work. Reality pornography is professionally made pornography that seeks to emulate the style of amateur pornography. Amateur pornography has been called one of the most profitable and long-lasting genres of pornography.

<span class="mw-page-title-main">Internet Watch Foundation and Wikipedia</span> Blacklist of Wikipedia in the UK

On 5 December 2008, the Internet Watch Foundation (IWF), a British watchdog group, blacklisted content on the English Wikipedia related to Scorpions' 1976 studio album Virgin Killer, due to the presence of its controversial cover artwork, which depicts a young girl posing nude with a faux shattered-glass effect obscuring her genitalia. The image was deemed to be "potentially illegal content" under English law, which forbids the possession or creation of indecent photographs of children. The IWF's blacklist is used in web filtering systems such as Cleanfeed.

<span class="mw-page-title-main">Internet pornography</span> Any pornography that is accessible over the Internet

Internet pornography is any pornography that is accessible over the Internet; primarily via websites, FTP connections, peer-to-peer file sharing, or Usenet newsgroups. The greater accessibility of the World Wide Web from the late 1990s led to an incremental growth of Internet pornography, the use of which among adolescents and adults has since become increasingly popular.

Enough Is Enough is an American non-profit organization whose stated purpose is to make the Internet safer for families and children. It carries out lobbying efforts in Washington, D.C., and played a role in the passage of the Communications Decency Act of 1996, the Child Online Protection Act of 1998, and the Children's Internet Protection Act of 2000. The group is based in the Commonwealth of Virginia. They sometimes refer to themselves acronymically as EIE.

The precise number of websites blocked in the United Kingdom is unknown. Blocking techniques vary from one Internet service provider (ISP) to another, with some sites or specific URLs blocked by some ISPs and not others. Websites and services are blocked using a combination of data feeds from private content-control technology companies, government agencies, NGOs and court orders, in conjunction with the service administrators, who may or may not have the power to unblock, additionally block, appeal or recategorise blocked content.

The child abuse image content list is a list of URLs and image hashes provided by the Internet Watch Foundation to its partners to enable the blocking of child pornography and criminally obscene adult content in the UK and by major international technology companies.

References

  1. Humphries, Matthew (November 12, 2009). "Google lets you lock SafeSearch with Strict mode". Geek.com. Archived from the original on April 5, 2019. Retrieved April 5, 2019.
  2. Schwartz, Barry (December 12, 2012). "Google Updates SafeSearch Filter In Image Search". Search Engine Land. Archived from the original on July 29, 2017. Retrieved April 5, 2019.
  3. Newton, Casey (December 12, 2012). "Google tweaks image search to make porn harder to find". CNET News. Archived from the original on August 27, 2021. Retrieved February 3, 2013.
  4. Matthew Panzarino (December 12, 2012). "Google tweaks image search algorithm and SafeSearch option to show less explicit content". TNW. Archived from the original on December 7, 2021. Retrieved February 3, 2013.
  5. Josh Wolford (December 16, 2012). "Google No Longer Allows You to Disable SafeSearch, and That Makes Google Search Worse". Web Pro News. Archived from the original on September 14, 2017. Retrieved February 3, 2013.
  6. "Lock SafeSearch for devices & networks you manage - Google Search Help". Google Help. Archived from the original on December 7, 2021. Retrieved 2021-01-30.
  7. "SafeSearch on: Indonesia IT Ministry instructs all ISPs to restrict pornography from search engines by tomorrow | Coconuts". Archived from the original on July 30, 2022. Retrieved July 15, 2022.
  8. "Your SafeSearch Setting". Google News . Retrieved 2024-05-29.
  9. Benjamin Edelman (April 14, 2003). "Empirical Analysis of Google SafeSearch". Harvard University. Archived from the original on March 4, 2013. Retrieved February 3, 2013.
  10. "Canada's The Beaver magazine renamed to end porn mix-up". AFP. January 12, 2010. Archived from the original on March 5, 2014. Retrieved February 3, 2013.
  11. Paul Festa (July 2, 2001). "Porn sneaks past search filters". CNET News. Archived from the original on July 1, 2015. Retrieved February 3, 2013.
  12. Fletcher, Owen (September 7, 2009). "Google porn filter gained China's thumbs-up". Network World. Archived from the original on October 26, 2021. Retrieved July 7, 2014.