Content-control software, commonly referred to as an Internet filter, is software that restricts or controls the content an Internet user is able to access, especially when utilised to restrict material delivered over the Internet via the Web, e-mail, or other means. Content-control software determines what content will be available and what will be blocked.
Such restrictions can be applied at various levels: a government can attempt to apply them nationwide (see Internet censorship), or they can, for example, be applied by an ISP to its clients, by an employer to its personnel, by a school to its students, by a library to its visitors, by a parent to a child's computer, or by an individual user to their own computer.
The motive is often to prevent access to content which the computer's owner(s) or other authorities may consider objectionable. When imposed without the consent of the user, content control can be characterised as a form of internet censorship. Some content-control software includes time control functions that empower parents to set the amount of time a child may spend accessing the Internet, playing games, or performing other computer activities.
In some countries, such software is ubiquitous. In Cuba, if a computer user at a government-controlled Internet cafe types certain words, the word processor or browser is automatically closed, and a "state security" warning is given.
The term "content control" is used on occasion by CNN, Playboy magazine, the San Francisco Chronicle, and The New York Times. However, several other terms, including "content filtering software", "secure web gateways", "censorware", "content security and control", "web filtering software", "content-censoring software", and "content-blocking software", are often used. "Nannyware" has also been used in both product marketing and by the media. Industry research company Gartner uses "secure web gateway" (SWG) to describe the market segment.
Companies that make products that selectively block Web sites do not refer to these products as censorware, and prefer terms such as "Internet filter" or "URL Filter"; in the specialized case of software specifically designed to allow parents to monitor and restrict the access of their children, "parental control software" is also used. Some products log all sites that a user accesses and rates them based on content type for reporting to an "accountability partner" of the person's choosing, and the term accountability software is used. Internet filters, parental control software, and/or accountability software may also be combined into one product.
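As a rough illustration of the reporting side of accountability software, the sketch below tallies visited domains by content category for an accountability report; the domain-to-category table is purely hypothetical, and real products rely on large curated databases:

```python
from collections import Counter

# Hypothetical category lookup; real products use large, curated databases.
CATEGORY_BY_DOMAIN = {
    "news.example.com": "news",
    "casino.example.net": "gambling",
    "videos.example.org": "entertainment",
}

def build_report(visited_domains):
    """Tally visited domains by content category for an accountability report."""
    counts = Counter(
        CATEGORY_BY_DOMAIN.get(domain, "uncategorized")
        for domain in visited_domains
    )
    return dict(counts)
```

A report like this would then be forwarded to the user's chosen accountability partner rather than used to block anything directly.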
Those critical of such software, however, use the term "censorware" freely: consider the Censorware Project, for example. The use of the term "censorware" in editorials criticizing makers of such software is widespread and covers many different varieties and applications: Xeni Jardin used the term in a 9 March 2006 editorial in The New York Times when discussing the use of American-made filtering software to suppress content in China; in the same month a high school student used the term to discuss the deployment of such software in his school district.
In general, outside of editorial pages as described above, traditional newspapers do not use the term "censorware" in their reporting, preferring instead to use less overtly controversial terms such as "content filter", "content control", or "web filtering"; The New York Times and the Wall Street Journal both appear to follow this practice. On the other hand, Web-based newspapers such as CNET use the term in both editorial and journalistic contexts, for example "Windows Live to Get Censorware."
Filters can be implemented in many different ways: by software on a personal computer, or via network infrastructure that provides Internet access, such as proxy servers, DNS servers, or firewalls. No single solution provides complete coverage, so most companies deploy a mix of technologies to achieve content control in line with their policies.
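One common building block at the network level is a domain blocklist consulted by a DNS server or gateway. The sketch below, with invented blocklist entries, shows the usual rule of also blocking subdomains of a listed domain:

```python
# Illustrative entries; real deployments use vendor-supplied lists.
BLOCKED_DOMAINS = {"ads.example.com", "tracker.example.net"}

def is_blocked(hostname):
    """Block a hostname if it, or any parent domain, is on the blocklist."""
    labels = hostname.lower().split(".")
    # Generate "sub.ads.example.com", "ads.example.com", "example.com", "com".
    candidates = {".".join(labels[i:]) for i in range(len(labels))}
    return bool(candidates & BLOCKED_DOMAINS)
```

A DNS-based filter would apply a check like this before resolving a query; a firewall or proxy could apply the same logic to the destination of each connection.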
Parents who do not permit their children to access content that conflicts with their personal beliefs often make use of Internet service providers (ISPs) that block pornographic material, or controversial religious, political, or news-related content, en route. Content-filtering software can, however, also be used to block malware and other content that is or contains hostile, intrusive, or annoying material, including adware, spam, computer viruses, worms, trojan horses, and spyware.
Most content control software is marketed to organizations or parents. It is, however, also marketed on occasion to facilitate self-censorship, for example by people struggling with addictions to online pornography, gambling, chat rooms, etc. Self-censorship software may also be utilised by some in order to avoid viewing content they consider immoral, inappropriate, or simply distracting. A number of accountability software products are marketed as self-censorship or accountability software. These are often promoted by religious media and at religious gatherings.
A filter that is overly zealous, or that mislabels content not intended to be censored, can result in overblocking (over-censoring). Overblocking can filter out material that should be acceptable under the filtering policy in effect; for example, health-related information may unintentionally be filtered along with pornographic material because of the Scunthorpe problem. Filter administrators may prefer to err on the side of caution, accepting overblocking to prevent any risk of access to sites that they determine to be undesirable. Content-control software was reported as blocking access to Beaver College before its name change to Arcadia University; another example was the filtering of the Horniman Museum. Overblocking may also encourage users to bypass the filter entirely.
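The Scunthorpe problem arises from naive substring matching. The sketch below contrasts a substring filter, which wrongly flags the innocuous place name "Middlesex" when "sex" is on the blocklist, with a word-boundary filter that avoids that particular false positive (though it still misfires in other ways):

```python
import re

def naive_filter(text, blocked_words):
    """Overblocks: flags any substring match, even inside innocent words."""
    return any(word in text.lower() for word in blocked_words)

def boundary_filter(text, blocked_words):
    """Matches whole words only, avoiding one class of false positives."""
    return any(re.search(rf"\b{re.escape(word)}\b", text.lower())
               for word in blocked_words)
```

Real filters combine many such heuristics, and each refinement trades one failure mode (overblocking) for another (underblocking).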
Filters can also underblock (under-censor) content whenever new information is uploaded to the Internet, if the parties responsible for maintaining the filters do not update them quickly and accurately, and a blacklisting rather than a whitelisting filtering policy is in place.
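The blacklist/whitelist trade-off can be stated in a few lines: a blacklist allows anything not yet listed, risking underblocking of new sites, while a whitelist blocks anything not yet listed, risking overblocking. A minimal sketch with invented domains:

```python
BLACKLIST = {"bad.example.com"}                      # known-bad sites
WHITELIST = {"school.example.edu", "library.example.org"}  # known-good sites

def blacklist_allows(domain):
    # New, not-yet-listed sites slip through until the list is updated.
    return domain not in BLACKLIST

def whitelist_allows(domain):
    # New sites are blocked by default, trading underblocking for overblocking.
    return domain in WHITELIST
```

This is why blacklist-based filters underblock fresh content while whitelist-based filters are usually reserved for tightly controlled environments.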
Many would not be satisfied with a government filtering viewpoints on moral or political issues, arguing that such filtering could become a vehicle for propaganda. Many would also find it unacceptable for an ISP, whether by law or by its own choice, to deploy such software without allowing users to disable the filtering for their own connections. In the United States, the First Amendment to the United States Constitution has been cited in calls to criminalise forced internet censorship. (See section below.)
Without adequate governmental supervision, content-filtering software could enable private companies to censor as they please (see Religious or political censorship, below). Government use or encouragement of content-control software is a component of Internet censorship (not to be confused with Internet surveillance, in which content is monitored but not necessarily restricted). The governments of countries such as the People's Republic of China and Cuba are current examples of countries in which this ethically controversial activity is alleged to have taken place.
In 1998, a United States federal district court in Virginia ruled (Loudoun v. Board of Trustees of the Loudoun County Library) that the imposition of mandatory filtering in a public library violates the First Amendment.
In 1996 the US Congress passed the Communications Decency Act, banning indecency on the Internet. Civil liberties groups challenged the law under the First Amendment, and in 1997 the Supreme Court ruled in their favor. Part of the civil liberties argument, especially from groups like the Electronic Frontier Foundation, was that parents who wanted to block sites could use their own content-filtering software, making government involvement unnecessary.
In the late 1990s, groups such as the Censorware Project began reverse-engineering content-control software and decrypting the blacklists to determine what kinds of sites the software blocked. They discovered that such tools routinely blocked unobjectionable sites while also failing to block intended targets (see Over-zealous filtering, below). This work led to legal action alleging violation of the "Cyber Patrol" license agreement.
Some content-control software companies responded by claiming that their filtering criteria were backed by intensive manual checking. The companies' opponents argued, on the other hand, that performing the necessary checking would require resources greater than the companies possessed and that therefore their claims were not valid.
The Motion Picture Association successfully obtained a UK ruling requiring ISPs to use content-control software to prevent copyright infringement by their subscribers.
Many types of content-control software have been shown to block sites based on the religious and political leanings of the company owners. Examples include blocking several religious sites (including the Web site of the Vatican), many political sites, and homosexuality-related sites. X-Stop was shown to block sites such as the Quaker web site, the National Journal of Sexual Orientation Law, The Heritage Foundation, and parts of The Ethical Spectacle. CYBERsitter blocks sites such as that of the National Organization for Women. Nancy Willard, an academic researcher and attorney, pointed out that many U.S. public schools and libraries use the same filtering software that many Christian organizations use. Cyber Patrol, a product developed by The Anti-Defamation League and Mattel's The Learning Company, has been found to block not only political sites it deems to be engaging in "hate speech" but also human rights web sites, such as Amnesty International's web page about Israel, and gay-rights web sites, such as glaad.org.
Content labeling may be considered another form of content-control software. In 1994, the Internet Content Rating Association (ICRA) — now part of the Family Online Safety Institute — developed a content rating system for online content providers. Using an online questionnaire a webmaster describes the nature of their web content. A small file is generated that contains a condensed, computer readable digest of this description that can then be used by content filtering software to block or allow that site.
ICRA labels come in a variety of formats. These include the World Wide Web Consortium's Resource Description Framework (RDF) as well as Platform for Internet Content Selection (PICS) labels used by Microsoft's Internet Explorer Content Advisor.
ICRA labels are an example of self-labeling. Similarly, in 2006 the Association of Sites Advocating Child Protection (ASACP) initiated the Restricted to Adults self-labeling initiative. ASACP members were concerned that various forms of legislation being proposed in the United States would have the effect of forcing adult companies to label their content. The RTA label, unlike ICRA labels, does not require a webmaster to fill out a questionnaire or sign up to use it. Like the ICRA label, the RTA label is free. Both labels are recognized by a wide variety of content-control software.
The Voluntary Content Rating (VCR) system was devised by Solid Oak Software for their CYBERsitter filtering software, as an alternative to the PICS system, which some critics deemed too complex. It employs HTML metadata tags embedded within web page documents to specify the type of content contained in the document. Only two levels are specified, mature and adult, making the specification extremely simple.
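A filter consuming labels of this kind must read the rating out of a page's metadata. The sketch below does so with Python's standard `html.parser`; note that the meta tag name `voluntary-content-rating` is an illustrative assumption, not taken from the original Solid Oak specification:

```python
from html.parser import HTMLParser

class RatingExtractor(HTMLParser):
    """Pull a VCR-style rating out of a page's meta tags.

    The tag name "voluntary-content-rating" is hypothetical; consult the
    original specification for the exact attribute names.
    """
    def __init__(self):
        super().__init__()
        self.rating = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "voluntary-content-rating":
            self.rating = attrs.get("content")

def page_rating(html):
    """Return the page's self-declared rating, or None if it is unlabeled."""
    extractor = RatingExtractor()
    extractor.feed(html)
    return extractor.rating
```

With only two levels ("mature" and "adult"), a filter can decide with a single lookup whether a labeled page should be shown to a given user.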
The use of Internet filters or content-control software varies widely in public libraries in the United States, since Internet use policies are established by the local library board. Many libraries adopted Internet filters after Congress conditioned the receipt of universal service discounts on the use of Internet filters through the Children's Internet Protection Act (CIPA). Other libraries do not install content control software, believing that acceptable use policies and educational efforts address the issue of children accessing age-inappropriate content while preserving adult users' right to freely access information. Some libraries use Internet filters on computers used by children only. Some libraries that employ content-control software allow the software to be deactivated on a case-by-case basis on application to a librarian; libraries that are subject to CIPA are required to have a policy that allows adults to request that the filter be disabled without having to explain the reason for their request.
Many legal scholars believe that a number of legal cases, in particular Reno v. American Civil Liberties Union, established that the use of content-control software in libraries is a violation of the First Amendment. However, the June 2003 case United States v. American Library Association found the Children's Internet Protection Act (CIPA) constitutional as a condition placed on the receipt of federal funding, stating that First Amendment concerns were dispelled by the law's provision that allowed adult library users to have the filtering software disabled without having to explain the reasons for their request. The plurality decision left open a future "as-applied" constitutional challenge, however.
In November 2006, a lawsuit was filed against the North Central Regional Library District (NCRL) in Washington State for its policy of refusing to disable restrictions at the request of adult patrons, but CIPA was not challenged in that matter. In May 2010, the Washington State Supreme Court issued an opinion after being asked to certify a question referred by the United States District Court for the Eastern District of Washington: "Whether a public library, consistent with Article I, § 5 of the Washington Constitution, may filter Internet access for all patrons without disabling Web sites containing constitutionally-protected speech upon the request of an adult library patron." The court ruled that NCRL's internet filtering policy did not violate Article I, Section 5 of the Washington State Constitution, saying: "It appears to us that NCRL's filtering policy is reasonable and accords with its mission and these policies and is viewpoint neutral. It appears that no article I, section 5 content-based violation exists in this case. NCRL's essential mission is to promote reading and lifelong learning. As NCRL maintains, it is reasonable to impose restrictions on Internet access in order to maintain an environment that is conducive to study and contemplative thought." The case returned to federal court.
In March 2007, Virginia passed a law similar to CIPA that requires public libraries receiving state funds to use content-control software. Like CIPA, the law requires libraries to disable filters for an adult library user when requested to do so by the user.
The Australian Internet Safety Advisory Body has information about "practical advice on Internet safety, parental control and filters for the protection of children, students and families" that also includes public libraries.
NetAlert, the software made available free of charge by the Australian government, was allegedly cracked by a 16-year-old student, Tom Wood, less than a week after its release in August 2007. Wood supposedly bypassed the $84 million filter in about half an hour to highlight problems with the government's approach to Internet content filtering.
The Australian Government has introduced legislation requiring ISPs to "restrict access to age restricted content (commercial MA15+ content and R18+ content) either hosted in Australia or provided from Australia". The scheme, known as Cleanfeed, was due to commence on 20 January 2008.
Cleanfeed is a proposed mandatory ISP-level content filtration system. It was proposed by the Beazley-led Australian Labor Party opposition in a 2006 press release, with the intention of protecting children who were vulnerable due to claimed parental computer illiteracy. It was announced on 31 December 2007 as a policy to be implemented by the Rudd ALP government, and initial tests in Tasmania produced a 2008 report. Cleanfeed is funded in the current budget and is moving towards an Expression of Interest for live testing with ISPs in 2008. Public opposition and criticism have emerged, led by the EFA and gaining irregular mainstream media attention, with a majority of Australians reportedly "strongly against" its implementation. Cleanfeed is a responsibility of Senator Conroy's portfolio. Criticisms include its expense, its inaccuracy (it will be impossible to ensure that only illegal sites are blocked), and the fact that it will be compulsory, which can be seen as an intrusion on free speech rights. Another major criticism is that although the filter is claimed to stop certain materials, the underground rings dealing in such materials will not be affected. The filter might also provide a false sense of security for parents, who might supervise children less while they use the Internet, achieving the exact opposite effect.
In Denmark the stated policy is to "prevent inappropriate Internet sites from being accessed from children's libraries across Denmark". "It is important that every library in the country has the opportunity to protect children against pornographic material when they are using library computers. It is a main priority for me as Culture Minister to make sure children can surf the net safely at libraries," stated Brian Mikkelsen in a press release of the Danish Ministry of Culture.
Many libraries in the UK, such as the British Library and local authority public libraries, apply filters to Internet access. According to research conducted by the Radical Librarians Collective, at least 98% of public libraries apply filters, including categories such as "LGBT interest", "abortion" and "questionable". Some public libraries block payday loan websites.
Content filtering in general can "be bypassed entirely by tech-savvy individuals." Blocking content on a device "[will not]...guarantee that users won't eventually be able to find a way around the filter."
Some software may be bypassed successfully by using alternative protocols such as FTP, telnet, or HTTPS, by conducting searches in a different language, or by using a proxy server or a circumventor such as Psiphon. Cached web pages returned by Google or other search engines may also bypass some controls. Web syndication services may provide alternate paths for content. Some of the more poorly designed programs can be shut down by killing their processes: for example, in Microsoft Windows through the Windows Task Manager, or in Mac OS X using Force Quit or Activity Monitor. Numerous workarounds, and counters to workarounds from content-control software creators, exist. Google services are often blocked by filters, but these blocks may often be bypassed by using https:// in place of http://, since content-filtering software cannot interpret content sent over secure connections (in this case SSL/TLS).
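The HTTPS loophole can be modelled simply: a network filter inspecting plaintext HTTP can read the full URL, but for an encrypted connection it typically sees only the hostname (via DNS or the TLS SNI field), so keyword rules keyed on the URL path stop matching. A simplified sketch under that assumption:

```python
def visible_to_filter(url):
    """Return the part of a request a network filter can read.

    Simplified model: for plain HTTP the full URL is visible; for HTTPS
    only the hostname (e.g. via DNS or the TLS SNI field) is exposed.
    """
    scheme, rest = url.split("://", 1)
    host = rest.split("/", 1)[0]
    return url if scheme == "http" else host

def keyword_blocked(url, keywords):
    """Apply a naive keyword rule to whatever the filter can see."""
    visible = visible_to_filter(url)
    return any(k in visible for k in keywords)
```

This is why hostname-level blocking (or on-device filtering, which sees traffic before encryption) is the usual countermeasure to the HTTPS workaround.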
Many content filters have an option that allows authorized people to bypass the filter. This is especially useful in environments where the computer is being supervised and the filter is aggressively blocking Web sites that need to be accessed.
An encrypted VPN can be used as means of bypassing content control software, especially if the content control software is installed on an Internet gateway or firewall.
There are many other ways to bypass a content-control filter, including using translation sites, establishing a remote connection to another computer that has no filter installed, and altering the browser's proxy settings.
Sometimes, antivirus software with web protection may stop the content-control filter.
Some ISPs offer parental control options. Some offer security software which includes parental controls. Mac OS X v10.4 offers parental controls for several applications (Mail, Finder, iChat, Safari & Dictionary). Microsoft's Windows Vista operating system also includes content-control software.
Content filtering technology exists in two major forms: application gateways and packet inspection. For HTTP access, the application gateway is called a web proxy, or just a proxy. Such web proxies can inspect both the initial request and the returned web page using arbitrarily complex rules, and will not return any part of the page to the requester until a decision is made. In addition, they can make substitutions in whole or for any part of the returned result. Packet inspection filters do not initially interfere with the connection to the server, but inspect the data in the connection as it goes past; at some point the filter may decide that the connection is to be filtered, and will then disconnect it by injecting a TCP reset or similar forged packet. The two techniques can be used together: the packet filter monitors a link until it sees an HTTP connection starting to an IP address that hosts content needing filtering, then redirects the connection to the web proxy, which can perform detailed filtering on the website without all unfiltered connections having to pass through the proxy. This combination is quite popular because it can significantly reduce the cost of the system.
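The hybrid arrangement described above can be sketched as a two-stage decision: a cheap packet-level check diverts only suspect traffic to the proxy, which then applies detailed rules. All addresses and paths below are invented for illustration:

```python
WATCHED_IPS = {"203.0.113.10"}   # IPs known to host some filtered content
BLOCKED_PATHS = {"/forbidden"}   # paths the proxy refuses to serve

def route_connection(dest_ip):
    """Stage 1: the packet filter only diverts traffic to watched IPs."""
    return "proxy" if dest_ip in WATCHED_IPS else "direct"

def proxy_decision(path):
    """Stage 2: the proxy inspects the full request before answering."""
    return "block" if path in BLOCKED_PATHS else "allow"

def handle_request(dest_ip, path):
    if route_connection(dest_ip) == "direct":
        return "allow"  # never reaches the proxy; no per-request inspection
    return proxy_decision(path)
```

Because most traffic takes the "direct" route, the expensive per-request inspection is reserved for the small fraction of connections to watched addresses.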
Gateway-based content control software may be more difficult to bypass than desktop software as the user does not have physical access to the filtering device. However, many of the techniques in the Bypassing filters section still work.
As computing has become ubiquitous in schools so has a need to protect students from inappropriate content across the web, while also allowing students to use content-rich educational sites that can enhance the learning experience. Rather than simply blocking off large portions of the Internet, many schools are utilizing customizable web filtering systems that provide them with greater control over which sites are allowed and which are blocked.
The precise number of websites blocked in the United Kingdom is unknown. Blocking techniques vary from one Internet service provider (ISP) to another with some sites or specific URLs blocked by some ISPs and not others. Websites and services are blocked using a combination of data feeds from private content-control technology companies, government agencies, NGOs, court orders in conjunction with the service administrators who may or may not have the power to unblock, additionally block, appeal or recategorise blocked content.
The child abuse image content URL list is a list of URLs and image hashes provided by the Internet Watch Foundation to its partners to enable the blocking of child pornography & criminally obscene adult content in the UK and by major international technology companies.
Internet censorship in Switzerland is regulated by the Federal Supreme Court of Switzerland on a case-by-case basis. Internet services provided by ISPs registered with BAKOM are subject to a "voluntary recommendation" by the Federal Supreme Court of Switzerland, which has required the blocking of certain websites since 18 December 2007. As of October 2015, this may soon change, with additional topics such as online gambling now in focus.