Internet censorship in the United Kingdom is conducted under a variety of laws, judicial processes, administrative regulations and voluntary arrangements. It is achieved by blocking access to sites as well as the use of laws that criminalise publication or possession of certain types of material. These include English defamation law, the Copyright law of the United Kingdom, [1] regulations against incitement to terrorism [2] and child pornography.
British citizens have a negative right to freedom of expression under the common law. [3] Since 2000 UK courts have been required to interpret domestic legislation, so far as possible, compatibly with the European Convention on Human Rights and the guarantee of freedom of expression it contains in Article 10. This was achieved under the Human Rights Act 1998, which requires courts, if necessary, to strain the meaning of domestic law to make it compatible with Convention rights, and which places a duty on public authorities to act compatibly with those rights. Where a compatible interpretation is not possible, certain courts can issue a declaration of incompatibility; the incompatible domestic legislation nevertheless remains in force, and it is for Parliament to decide whether to amend it to bring it into line with the Convention. Moreover, the Convention itself contains a broad sweep of exceptions.
The law provides for freedom of speech and press, and prohibits arbitrary interference with privacy, family, home, or correspondence, and the government routinely respects these rights and prohibitions. An independent press, an effective judiciary, and a functioning democratic political system combine to ensure freedom of speech and press. Individuals and groups routinely use the Internet, including e-mail, to express a wide range of views. [4]
Since the mid-2000s there has been a gradual shift toward increased surveillance and police measures in the UK. National security concerns, terrorism and crime, and issues regarding child protection have resulted in the state introducing extensive surveillance measures over online communications as well as filtering and tracking practices. In some cases these are encouraged or required by the state and used by state agencies. In others they are voluntarily implemented by private operators (e.g., internet service providers). [5]
The country was listed among the "Enemies of the Internet" in 2014 by Reporters Without Borders, [6] a category of countries with the highest level of internet censorship and surveillance that "mark themselves out not just for their capacity to censor news and information online but also for their almost systematic repression of Internet users". [7] Other major economies listed in this category include China, Iran, Pakistan, Russia and Saudi Arabia. [8]
In 2017 the Communications Select Committee set up an inquiry as to whether to, and how to, further regulate the Internet in the UK. [9]
Internet customers in the UK are prohibited from accessing a range of web sites by default, because they have their Internet access filtered by their ISPs. The filtering programme has applied to new ISP customers since the end of 2013, and has been extended to existing users on a rolling basis. A voluntary code of practice agreed by all four major ISPs [10] means that customers have to 'opt out' of the ISP filtering to gain access to the blocked content. [11] However, the complex nature of the active monitoring systems means that users cannot usually opt out of the monitoring and re-routing of their data traffic, something which may render their data security vulnerable. The range of content blocked by ISPs can be varied over time. [12] Categories blocked across the major ISPs include: Dating, Drugs, Alcohol and Tobacco, File sharing, Gambling, Games, Pornography, Nudity, Social networking, Suicide and Self-harm, Weapons and violence, Obscenity, Criminal Skills, Hate, Media Streaming, Fashion and Beauty, Gore, Cyberbullying, Hacking and Web-blocking circumvention tools.
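The mechanics of a default-on, opt-out filter can be illustrated with a short sketch. The following Python fragment is a simplified illustration only, not any UK ISP's actual system; the category names, the site-to-category mapping and the function name are invented for the example.

```python
# Simplified illustration of "default on" category filtering with a
# per-customer opt-out. Not any ISP's real implementation: the category
# names, example hostnames and function names are hypothetical.

BLOCKED_CATEGORIES = {"pornography", "gambling", "weapons", "hacking"}

# Hypothetical lookup from hostname to a single content category.
SITE_CATEGORIES = {
    "casino.example": "gambling",
    "news.example": "news",
}

def is_filtered(hostname: str, customer_opted_out: bool) -> bool:
    """Return True if the request should be blocked for this customer.

    Filtering applies by default; a customer who has opted out is never
    blocked by this check.
    """
    if customer_opted_out:
        return False
    category = SITE_CATEGORIES.get(hostname, "uncategorised")
    return category in BLOCKED_CATEGORIES

print(is_filtered("casino.example", customer_opted_out=False))  # True
print(is_filtered("casino.example", customer_opted_out=True))   # False
```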
The idea for default filtering originated from manifesto commitments concerning "the commercialisation and sexualisation of childhood" given by the parties forming the Cameron–Clegg coalition government in 2010. [13] This was followed by a review (the Bailey Review) [14] and a consultation by the UK Council for Child Internet Safety (UKCCIS). [15] Campaigning by Claire Perry MP and the Daily Mail newspaper resulted in significant public support for the idea of Internet filtering for the purposes of child protection. [16] By 2013 there had already been considerable adoption of in-home filtering, with 43% of homes with children aged 5–15 having filters installed on their family computer. Nevertheless, Prime Minister David Cameron made it clear in July 2013 that his aim was to ensure that by the end of 2013 all ISPs would have a filtering system in place. [17] As a result, three of the Big 4 major ISPs (TalkTalk, Sky and BT [18] ) began applying default filtering to new customers in 2013 [19] with the fourth major ISP, Virgin, doing so in February 2014. [20] Default filtering of existing customers was implemented by all four major ISPs during 2014 with the aim of ensuring that the system applied to 95% of all households by the end of the year. [21]
TalkTalk already had content-control software available to comply with government requirements. Their HomeSafe internet filtering system was introduced in May 2011 as an opt-in product and was used for default filtering of new customers from March 2012. HomeSafe was praised by Cameron and is controlled and operated by the Chinese company Huawei. [22] After initial resistance, [23] other ISPs had to commission new filtering systems to fulfil Government demands. Some smaller ISPs expressed their reluctance to take part in filtering, citing concerns over costs and civil liberties, [24] but the government stated: "We expect the smaller ISPs to follow the lead being set by the larger providers". [25] Cameron said ISPs should choose their own preferred technical solution, but would be monitored to ensure filtering was done correctly. Nevertheless, the ISP Andrews & Arnold does not censor any of its Internet connections; all its broadband packages guarantee 12 months' notice should it start to censor any of its traffic. [24]
In July 2014 Ofcom released a report into filter implementation and effectiveness across the fixed-line ISPs. At that point the Big 4 major fixed-line ISPs comprised 93% [26] of the broadband market. They were all mandating that filters be enabled by default for new customers, but overall take-up figures were low: BT at 5%, Sky at 8% and Virgin at 4%. The figure was higher for TalkTalk (36%), as there had already been significant take-up of its system during the preceding three years. [27] The industry average was 13%. [28] In January 2015 Sky went further, blocking all material deemed unsuitable for children under the age of 13 for any of its five million customers who had not already opted out. [29] In the same month TalkTalk announced that customers who had not chosen whether to activate the company's filtering system would have to opt out if they wished it to be turned off. [30] In January 2016 Sky began sending all new and existing customers an email asking if they want to turn the filter on. Those customers who ignore the email have the filter turned on automatically. [31]
The initial legal status of ISP web blocking was voluntary, although there were a number of attempts to introduce legislation to move it onto a mandatory footing. David Cameron first announced such legislation in July 2013 [32] but default filtering was rejected at the September 2013 conference of the Liberal Democrats (the Coalition Government's minor partner) [33] and no Government legislation to this effect occurred during the 2010-15 Parliament.
Prior to the 2015 United Kingdom general election both the opposition Labour Party and the governing Conservative Party said that, if elected, they would legislate on the issue. Labour said that it would introduce mandatory filters based on BBFC ratings if it believed that voluntary filtering by ISPs had failed.[ citation needed ] The Conservatives said that they would give an independent regulator such as ATVOD the legal power to compel internet service providers to block sites which failed to include effective age verification. [34] The Digital Economy Act 2017 placed the requirement for ISP filtering into law and introduced a requirement for ISPs to block pornographic sites with inadequate age verification. [35]
Proposals to create a single digital market for European Union (EU) member states include rules for net neutrality. These rules require that all internet traffic be treated equally, without blocking or slowing down certain data. Net neutrality guidelines were announced in August 2016 by the Body of European Regulators of Electronic Communications. [36] It was thought that the rules might restrict the legality of ISP filtering after 2016. [37] In May 2014 the government suggested it would veto European net neutrality legislation due to its conflict with web blocking programmes. [38] In May 2015, a leaked Council of the European Union document on the topic of net neutrality suggested users would have to opt into blocks, rather than opt out as per the current UK government's plans. John Carr of the UK Council for Child Internet Safety said of the proposals: "a major plank of the UK's approach to online child protection will be destroyed at a stroke". [39] However, any requirement that the UK government adhere to EU rules on net neutrality may no longer apply since the United Kingdom left the European Union.
Wide-scale inadvertent "overblocking" has been observed since ISP default filtering was introduced at the end of 2013. Legitimate sites are regularly blocked by the filters of some UK ISPs and mobile operators. [40] In December 2013 the UK Council for Child Internet Safety met with ISPs, charities, representatives from government, the BBFC and mobile phone operators to seek ways to reduce the blocking of educational advice for young people. In January 2014 UKCCIS began constructing a whitelist of the charity-run educational sites for children that had been overblocked. The intention was to provide the list to ISPs to allow unblocking. [41]
Examples of overblocked categories reported include: [42]
The identification of overblocked sites is made particularly difficult by the fact that ISPs do not provide checking tools to allow website owners to determine whether their site is being blocked. [45] In July 2014 the Open Rights Group launched an independent checking tool blocked.org.uk, a revamp of their mobile blocking site to report details of blocking on different fixed line ISPs and mobile providers. The tool revealed that 19% of 100,000 popularly visited websites were being blocked (with significant variation between ISPs) although the percentage of sites hosting legal pornographic material is thought to be around 4%. [46] [47]
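A checking tool of this kind can work by fetching a URL from within the network of the ISP being tested and looking for the signature of a block page. The sketch below is a guess at the general approach, not the Open Rights Group's actual blocked.org.uk code; the block-page markers are invented, and a real probe would need to run behind each ISP's connection to give meaningful results for that ISP.

```python
# Rough sketch of a blocked-site probe: fetch a URL and classify the
# response. Not the blocked.org.uk implementation; the markers below are
# invented, and results only reflect the network the probe runs on.

import urllib.error
import urllib.request

BLOCK_PAGE_MARKERS = ("site blocked", "parental controls", "access denied")

def probe(url: str, timeout: float = 10.0) -> str:
    """Classify a URL as 'ok', 'possibly blocked' or 'error' on this network."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            body = response.read(4096).decode("utf-8", errors="replace").lower()
    except urllib.error.HTTPError as exc:
        # 451 ("Unavailable For Legal Reasons") and 403 often indicate a block page.
        return "possibly blocked" if exc.code in (403, 451) else "error"
    except urllib.error.URLError:
        return "error"
    if any(marker in body for marker in BLOCK_PAGE_MARKERS):
        return "possibly blocked"
    return "ok"

print(probe("http://example.com/"))
```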
In 2019 an in-depth investigation into overblocking by the Open Rights Group and digital privacy site Top10VPN.com found that thousands of websites were being incorrectly blocked. These ranged from relatively harmless examples, such as sites for wedding planning and photography businesses, to more damaging and dangerous mistakes, such as the official websites of charities, schools and mental health support services. [48]
Significant underblocking has also been discovered, with ISPs failing to block up to 7% of adult sites tested. [49] A study commissioned by the European Commission's Safer Internet Programme which tested parental control tools showed that underblocking for adult content ranged from 5% to 35%. [50]
Proponents of internet filtering primarily refer to the need to combat the early sexualisation of children. The government believes that "broadband providers should consider automatically blocking sex sites, with individuals being required to opt in to receive them, rather than opt out and use the available computer parental controls." [51] [52] In 2010 communications minister Ed Vaizey was quoted as saying, "This is a very serious matter. I think it is very important that it's the ISPs that come up with solutions to protect children."
The Washington Post described the UK's ISP filtering systems as creating "some of the strictest curbs on pornography in the Western world". [53] There is no public scrutiny of the filtering lists. This creates the potential for them to be expanded to stifle dissent for political ends, as has happened in some other countries. The British Prime Minister of the time David Cameron stated that Internet users will have the option to turn the filters off, but no legislation exists to ensure that option will remain available. [54]
In March 2014, president Diane Duke of the United States-based Free Speech Coalition argued against the censorship rules at a London conference sponsored by Virgin Media. The discussion was titled "Switched on Families: Does the Online World Make Good Things Happen?". The panel included government representatives such as Member of Parliament Claire Perry, members of the press, and supporters of an open Internet, such as representatives from the UK Council for Child Internet Safety, the Family Online Safety Institute, and Big Brother Watch. [55] A report on the meeting was printed in The Guardian on 5 March 2014. [56] Duke was quoted as saying, "The filters Prime Minister Cameron supports block sexual health sites, they block domestic violence sites, they block gay and lesbian sites, they block information about eating disorders and a lot of information to which it's crucial young people have access. Rather than protect children from things like bullying and online predators, these filters leave children in the dark."
The Open Rights Group has been highly critical of the blocking programmes, especially mobile blocking and ISP default blocking. New Statesman magazine observed that overblocking means “the most vulnerable people in society are the most likely to be cut off from the help they need”. [57]
UK mobile phone operators began filtering Internet content in 2004 [58] when Ofcom published a "UK code of practice for the self-regulation of new forms of content on mobiles". [59] This provided a means of classifying mobile Internet content to enable consistency in filtering. All major UK operators now voluntarily filter content by default, and when users try to access blocked content they are redirected to a warning page. This tells them that the site they have tried to reach has an 'over 18' status and that a filtering mechanism has restricted their access. Categories that are listed as blocked include: adult / sexually explicit, chat, criminal skills, drugs, alcohol and tobacco, gambling, hacking, hate, personal and dating, violence, and weapons. [60] Users who are adults may have the block lifted on request. [60]
Guidelines published by the Independent Mobile Classification Body were used by mobile operators to classify sites until the British Board of Film Classification took over responsibility in 2013. [61] Classification determines whether content is suitable for customers under 18 years old. [62] The default assumption is that a user is under 18.
The following content types are blocked for under-18s: [62]
Significant overblocking of Internet sites by mobile operators is reported, including the blocking of political satire, feminism and gay content. [63] Research by the Open Rights Group highlighted the widespread nature of unjustified site blocking. [64] In 2011 the group set up Blocked.org.uk, a website allowing the reporting of sites and services that are 'blocked' on their mobile network. [65] [66] The website received hundreds of reports [67] of the blocking of sites covering blogs, business, internet privacy and internet forums across multiple networks. The Open Rights Group also demonstrated that correcting the erroneous blocking of innocent sites can be difficult. No UK mobile operator provides an on-line tool for identifying blocked websites. The O2 Website status checker [68] [69] was available until the end of 2013 but was suspended in December [70] after it had been widely used to determine the extent of overblocking by O2. [71] Not only were civil liberties and computing sites being blocked, [72] but so were the websites of Childline, the NSPCC and the police. An additional opt-in whitelist service aimed at users under 12 years is provided by O2. The service only allows access to websites on a list of categories deemed suitable for that age group. [73]
The vast majority of the Internet access provided by Wi-Fi systems in public places in the UK is filtered with many sites being blocked. The filtering is done voluntarily by the six largest providers of public Wi-Fi: Arqiva, BT, Sky, Nomad Digital, Virgin and O2, who together are responsible for 90% of public Wi-Fi. [74] The filtering was introduced as a result of an agreement put in place in November 2013 between the Government and the Wi-Fi providers. Pressure from the Government and the UK Council for Child Internet Safety [10] had already led Virgin and O2 to install filtering on the Wi-Fi systems on the London Underground [75] and McDonald's restaurants, [76] but half of all public Wi-Fi networks remained unfiltered in September 2013. [77]
"Overblocking" is a problem reported with public Wi-Fi filters. Research in September 2013 indicated that poorly programmed filters blocked sites when a prohibited tag appeared coincidentally within an unrelated word. Religious sites were blocked by nearly half of public Wi-Fi filters and sex education sites were blocked by one third. [78] In November 2013, there were complaints about the blocking of Gay websites that were not related to sex or nudity on the public Wi-Fi provided by train operating companies. The filtering was done by third party organisations and these were criticised for being both unidentified and unaccountable. Such blocking may breach the Equality Act 2010. The government arranged for the UK Council for Child Internet Safety to investigate whether filters were blocking advice to young people in areas such as sex education. [79]
Many libraries in the UK such as the British Library [80] and local authority public libraries [81] apply filters to Internet access. According to research conducted by the Radical Librarians Collective, at least 98% of public libraries apply filters, including categories such as "LGBT interest", "abortion" and "questionable". [82] Some public libraries block payday loan websites [83] and Lambeth Council has called for other public Wi-Fi providers to block these sites too. [84]
The majority of schools and colleges use filters to block access to sites which contain adult material, gambling and sites which contain malware. YouTube, Facebook and Twitter are often filtered by schools. Some universities also block access to sites containing a variety of material. [85] Many students often use proxy servers to bypass this. [86] Schools often censor pupils' Internet access in order to offer some protection against various perceived threats such as cyber-bullying and the perceived risk of grooming by paedophiles; as well as to maintain pupil attention during IT lessons. Examples of overblocking exist in the school context. For instance, in February 2014 the website of the Yes Scotland pro-independence campaign was blocked in a Glasgow school while the rival Better Together pro-union website was not blocked. [87]
The main focus of political censorship in UK law is concerned with the prevention of political violence. Hence incitement to ethnic or racial hatred is a criminal offence in the UK and those who create racist websites are liable to prosecution. Incitement to hatred against religions is an offence in England and Wales under the Racial and Religious Hatred Act 2006. Holocaust denial is not an offence per se unless it contravenes other laws. Other legal exceptions to the principle of freedom of speech include the following:
In September 2014 Home Secretary Theresa May proposed the introduction of Extremism Disruption Orders. These would allow judges to ban people who are deemed extremists (but who "do not break laws") from broadcasting, protesting in designated places or posting messages on social media. [99]
There are a number of legal exceptions to freedom of speech in the United Kingdom that concern pornography. These include obscenity [100] and indecency, including corruption of public morals and outraging public decency. [101] The UK has a markedly different tradition of pornography regulation from that found in other Western countries. It was almost the only liberal democracy not to have legalised hardcore pornography during the 1960s and 1970s. Pre-existing laws, such as the Obscene Publications Act 1959, continued to make its sale illegal through the 1980s and 1990s. Additionally new laws were introduced to extend existing prohibitions. The Video Recordings Act 1984 required the BBFC to censor all video works before release. As a result, the UK became one of the few representative government countries where the sale of explicit pornography on video (and later DVD) was illegal (thus opening the market to unlicensed pornography shops which technically operated in defiance of the haphazardly enforced laws). [102]
The appearance of the Internet during the 1990s introduced unregulated access to hardcore pornography in the UK for the first time. The existing legal and regulatory framework came to be seen as insufficient and in the 21st century a number of measures have been introduced, including web blocking and additional criminal legislation. Nevertheless, the Obscene Publications Act is still in force, and it makes it illegal for websites that can be accessed from the UK without age restriction to contain certain types of adult content. [103]
The first attempts to regulate pornography on the Internet concerned child pornography. Legislation in the form of the Protection of Children Act 1978 already existed making it illegal to take, make, distribute, show or possess an indecent photograph or pseudo-photograph of someone under the age of 18. The R v Bowden case in 2000 established that downloading indecent images of children from the Internet constituted the offence of making, since doing so causes a copy of the image to exist which previously did not exist. [104]
Initial steps to restrict pornography on the Internet were taken by the UK police. In the 1990s they began to take a pro-active regulatory role with respect to the Internet, using existing legislation and working on a self-tasking basis. In August 1996, the Metropolitan Police Clubs & Vice Unit sent an open letter to the Internet Service Providers Association (ISPA) supplying them with a list of 132 Usenet discussion groups that they believed to contain pornographic images or explicit text and requesting that they ban access to them. [105] The list mainly included newsgroups which carried child pornography. Ian Taylor, the Conservative Science and Industry Minister, warned ISPs that the police would act against any company which provided their users with "pornographic or violent material". [106] Taylor went on to make it clear that there would be calls for legislation to regulate all aspects of the Internet unless service providers were seen to wholeheartedly embrace "responsible self-regulation". Following this, a tabloid-style exposé of ISP Demon Internet appeared in the Observer newspaper, which alleged that Clive Feather (a director of Demon) "provides paedophiles with access to thousands of photographs of children being sexually abused". [107]
During the summer and autumn of 1996 the UK police made it known that they were planning to raid an ISP with the aim of launching a test case regarding the publication of obscene material over the Internet. The action of the UK police has been described as amounting to censorship without public or Parliamentary debate. It has been pointed out that the list supplied to ISPs by the police in August included a number of legitimate discussion groups concerned with legal sexual subjects. These contained textual material without pictures that would not be expected to infringe UK obscenity laws. [108]
The direct result of the 1996 campaign of threats and pressure was the setting up of the Internet Watch Foundation (IWF), an independent body to which the public could report potentially criminal Internet content, both child pornography and other forms of criminally obscene material. These reports would be passed on to ISPs and the Police as a ‘notice and takedown’ service for the removal of potentially illegal content hosted in the UK. It was intended that this arrangement would protect the internet industry from any criminal liability. The IWF was also intended to support the development of a website rating system. [109] [110] Demon Internet was a driving force behind the IWF's creation, and one of its directors, Clive Feather, became the IWF's first chairman. [111]
After 3 years of operation, the IWF was reviewed for the DTI and the Home Office by consultants KPMG and Denton Hall. Their report was delivered in October 1999 and resulted in a number of changes being made to the role and structure of the organisation, and it was relaunched in early 2000, endorsed by the government and the DTI, which played a "facilitating role in its creation", according to a DTI spokesman. [111]
At the time, Patricia Hewitt, then Minister for E-Commerce, said: "The Internet Watch Foundation plays a vital role in combating criminal material on the Net." To counter accusations that the IWF was biased in favour of the ISPs, a new independent chairman was appointed, Roger Darlington, former head of research at the Communication Workers Union. [111]
Between 2004 and 2006, BT Group introduced its Cleanfeed content blocking system [112] to implement 'section 97A' [113] orders. BT spokesman Jon Carter described Cleanfeed's function as "to block access to illegal Web sites that are listed by the Internet Watch Foundation", describing it as essentially a server hosting a filter that checked requested URLs against the IWF list and returned an error message of "Web site not found" for positive matches. [114] [115] [116] Cleanfeed is a silent content filtering system, which means that Internet users cannot ascertain whether they are being regulated by Cleanfeed, experiencing connection failures, or whether the page really does not exist. The proportion of Internet service providers using Cleanfeed had reached 80% by the beginning of 2006 [112] and rose to 95% by the middle of 2008. [117] In February 2009, the Government said that it was looking at ways to cover the final 5%. [118]
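The behaviour Carter describes, an exact-URL lookup against a confidential list answered with a generic error for matches, can be sketched as follows. This is an illustration of the described behaviour only, not BT's Cleanfeed code; the blocklist entry and function name are placeholders.

```python
# Sketch of the check described above: an exact URL match against a
# confidential list, answered with a generic "not found" error.
# Not BT's Cleanfeed code; the list entry is a placeholder.

BLOCKLIST = {
    "blocked.example/some-page",  # placeholder entry
}

def handle_request(hostname: str, path: str) -> tuple[int, str]:
    """Return an (HTTP status, body) pair for a proxied request."""
    url = f"{hostname}{path}"
    if url in BLOCKLIST:
        # A "silent" filter: the user sees an ordinary error page rather
        # than an explicit notice that the site has been blocked.
        return 404, "Web site not found"
    return 200, f"(fetch and return the real content of http://{url})"

print(handle_request("blocked.example", "/some-page"))   # (404, 'Web site not found')
print(handle_request("allowed.example", "/other-page"))  # (200, ...)
```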
According to a small-sample survey conducted in 2008 by Nikolaos Koumartzis, an MA researcher at London College of Communication, the vast majority of UK based Internet users (90.21%) were unaware of the existence of Cleanfeed software. Moreover, nearly two thirds of the participants did not trust British Telecommunications or the IWF to be responsible for a silent censorship system in the UK. [119] A majority would prefer to see a message stating that a given site was blocked and to have access to a form for unblocking a given site.
Cleanfeed originally targeted only alleged child sexual abuse content identified by the Internet Watch Foundation. However, no safeguards exist to stop the secret list of blocked sites being extended to include sites unrelated to child pornography. This has led to criticism of Cleanfeed's lack of transparency, which gives it considerable potential for broad censorship. Further, Cleanfeed has been used to block access to copyright-infringing websites after a court order in 2011 required BT to block access to NewzBin2. [120] This has led some to describe Cleanfeed as the most perfectly invisible censorship mechanism ever invented and to liken its powers of censorship to those currently employed by China. [121] There are risks that increasing Internet regulation will lead the Internet to be even more restricted in the future. [122] [123]
On 5 December 2008 the IWF system blacklisted a Wikipedia article on the Scorpions album Virgin Killer. A statement by the organisation's spokesman alleged that the album cover, displayed in the article, contained "a potentially illegal indecent image of a child under the age of 18". [124] Users of major ISPs, including Virgin Media, Be/O2/Telefónica, EasyNet/UK Online, Demon and Opal, were unable to access the content, despite the album cover being available unfiltered on other major sites including Amazon.co.uk, [124] and available for sale in the UK. [125] The system also started proxying users, who accessed any Wikipedia article, via a minimal number of servers, which resulted in site administrators having to block them from editing Wikipedia or creating accounts. [126] [127] On 9 December, the IWF removed the article from its blacklist, stating: "IWF's overriding objective is to minimise the availability of indecent images of children on the Internet, however, on this occasion our efforts have had the opposite effect." [128]
Google Search includes a SafeSearch filter which restricts the content returned by a search. In December 2012 the option to turn the filter off entirely was removed. [129]
In July 2013 Prime Minister David Cameron called on Internet search engines to "blacklist" certain search terms, so that they would bring up no results. Microsoft quickly responded by introducing a blacklist provided by the Child Exploitation and Online Protection Centre (CEOP). A 'pop-up' warning appears on the UK version of its search engine Bing when searches contravene the blacklist. [130] In November 2013 Google announced that 100,000 "blacklisted" search terms would no longer give any results, while 13,000 would produce a warning message. Child protection experts, including a former head of the CEOP, have warned that these measures will not help to protect children because most child pornography on the Internet is on hidden networks inaccessible through these search engines. [131]
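A two-tier search-term blacklist of the kind described, with some terms returning no results and others triggering a warning, could be modelled as below. The sketch is illustrative only: the term sets are placeholders (the real CEOP-derived lists are not public) and the function is not any search engine's actual code.

```python
# Illustrative two-tier search-term blacklist: some queries return no
# results, others return results alongside a warning. The term sets are
# placeholders; real blacklists are confidential.

NO_RESULT_TERMS = {"blacklisted term a"}
WARNING_TERMS = {"blacklisted term b"}

def search(query: str, index_lookup) -> dict:
    """Run a query against index_lookup, applying the blacklist first."""
    q = query.lower().strip()
    if q in NO_RESULT_TERMS:
        return {"results": [], "warning": False}
    if q in WARNING_TERMS:
        return {"results": index_lookup(q), "warning": True}
    return {"results": index_lookup(q), "warning": False}

# Example usage with a stand-in index:
print(search("blacklisted term a", index_lookup=lambda q: ["result 1"]))
```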
In 2009 the UK Ministry of Justice claimed that legislation was needed to reduce the availability of hardcore paedophilic cartoon pornography on the internet. [132] The decision was made to make possession of cartoon pornography depicting minors illegal in the UK. The Coroners and Justice Act 2009 (sections 62–68), which came into force on 6 April 2010, [133] created an offence in England, Wales and Northern Ireland of possession of a prohibited image of a child. [134] The maximum penalty is three years imprisonment and listing on the sex offender registry. [135]
A prohibited cartoon image is defined as one which involves a minor in situations which are pornographic and "grossly offensive, disgusting or otherwise of an obscene character". The Act makes it illegal to own any picture depicting under-18s participating in sexual activities, or depictions of sexual activity in the presence of someone under 18 years old. The definition of a "child" in the Act includes depictions of 16- and 17-year-olds who are over the age of consent in the UK, as well as any adults where the "predominant impression conveyed" is of a person under the age of 18. "The law has been condemned by a coalition of graphic artists, publishers, and MPs, fearing it will criminalise graphic novels such as Lost Girls and Watchmen." [132]
Calls for violent adult pornography sites to be shut down began in 2003, after the murder of Jane Longhurst by Graham Coutts, a man who said he had an obsession with Internet pornography. [136] Jane Longhurst's mother and sister also campaigned to tighten laws regarding pornography on the Internet. In response the government announced plans to crack down on sites depicting rape, strangulation, torture and necrophilia. [137] [138] [139] However, in August 2005 the Government announced that instead of targeting production or publication, it planned to criminalise private possession of what the Government now termed "extreme pornography". [140] [141] This was defined as real or simulated examples of certain types of sexual violence as well as necrophilia and bestiality. The passing of the Criminal Justice and Immigration Act 2008 resulted in the possession of "extreme pornographic images" becoming illegal in England and Wales as of January 2009. [142]
The law has been criticised for criminalising images where no crime took place in their creation. [143] Additionally, the law's placing of liability on consumers rather than producers has been criticised for creating a power imbalance between the individual and the state. There has never been a legal challenge to the law in the UK as the cost of doing so would be beyond most individuals. [144] In 2011, there were over 1300 prosecutions under the law, compared to the Government estimate of 30 cases a year. [145] [146]
In 2004 in Scotland, a committee of Members of the Scottish Parliament backed a call to ban adult pornography as the Equal Opportunities Committee supported a petition claiming links between porn and sexual crimes and violence against women and children. [147] A spokeswoman said "While we have no plans to legislate we will, of course, continue to monitor the situation." In 2007, MSPs looked again at criminalising adult pornography, in response to a call from Scottish Women Against Pornography for pornography to be classified as a hate crime against women. This was opposed by Feminists Against Censorship. [148] [149] In September 2008, Scotland announced its own plans to criminalise possession of what it termed "extreme" adult pornography, but extending the law further, including depictions of rape imagery. [150] These plans became law with the Criminal Justice and Licensing (Scotland) Act 2010.
In July 2013, David Cameron proposed that pornography which depicts rape (including simulations involving consenting adults) should become illegal in England and Wales bringing the law in line with that of Scotland. [151] These plans became law with the Criminal Justice and Courts Act 2015.
In January 2019, the Crown Prosecution Service amended their advice regarding prosecutions under obscenity laws of depictions of acts that are themselves legal to perform, stating that they "do not propose to bring charges based on material that depicts consensual and legal activity between adults, where no serious harm is caused and the likely audience is over the age of 18". [152]
The Audiovisual Media Services Regulations 2014 require that the online streaming of videos (known as Video On Demand or VOD) in the UK conforms to the BBFC R18 certificate regulations which had previously only restricted those sold in licensed sex shops. [153] The regulations were first announced in July 2013 by David Cameron. [151]
The UK regulator of VOD is Ofcom, which replaced ATVOD as the regulator from the beginning of 2016. [154] During its tenure as regulator ATVOD regularly instructed UK websites to comply with its rules and failure to do so resulted in Ofcom issuing a fine or shutting down a website. [103] [155] It is a criminal offence not to restrict access to adult VOD content to those aged over 18, by means such as requiring the user to provide credit card details. [156]
In March 2014 ATVOD proposed new legislation that would introduce a licensing system for all UK adult content providers. The verification of customers' ages would be a condition of granting a license. Furthermore, there would be a legal requirement on financial institutions to block the customer payments of unlicensed adult websites. [157]
An amendment to the Criminal Justice and Courts Act 2015 creates a specific offence in England and Wales of distributing a private sexual image of someone without their consent and with the intention of causing them distress (commonly called "revenge porn"). The maximum custodial sentence is two years. The law received Royal Assent and came into effect in February 2015. [158]
Pressure for a change in the law came from reports in April 2014 by UK charities including The National Stalking Helpline, Women's Aid, and the UK Safer Internet Centre that the use of revenge porn websites had increased. [159] Women's Aid Charity Chief Executive Polly Neate stated, "To be meaningful, any attempt to tackle revenge porn must also take account of all other kinds of psychological abuse and controlling behaviour, and revenge porn is just another form of coercive control. That control is central to domestic violence, which is why we're campaigning for all psychological abuse and coercive control to be criminalised". In July, Minister of Justice Chris Grayling announced plans to "take appropriate action" to address revenge porn in Britain. [159] A House of Lords Committee, in a report on social media crime, subsequently called for clarification from the DPP as to when revenge porn becomes a crime. [160] [161]
R v Walker, sometimes called the "Girls (Scream) Aloud Obscenity Trial", was the first prosecution for written material under Section 2(1) of the Obscene Publications Act in nearly two decades. [162] It involved the prosecution of Darryn Walker for posting a story entitled "Girls (Scream) Aloud" on an internet erotic story site in 2008. The story was a fictional written account describing the kidnap, rape and murder of pop group Girls Aloud. [163] It was reported to the IWF who passed the information on to Scotland Yard’s Obscene Publications Unit. During the trial the prosecution claimed that the story could be "easily accessed" by young fans of Girls Aloud. However, the defence demonstrated that it could only be located by those specifically searching for such material. As a result, the case was abandoned and the defendant cleared of all charges. [164] [165]
In October 2013 a press exposé resulted in a number of on-line e-book retailers removing adult fiction titles including descriptions of rape, incest or bestiality from their download catalogues. [166]
With the passing of the Digital Economy Act 2017, the United Kingdom became the first country to pass a law containing a legal mandate on the provision of an Internet age verification system. Under the act, websites that publish pornography on a commercial basis would have been required to implement a "robust" age verification system. [167] [168] The British Board of Film Classification (BBFC) was charged with enforcing this legislation. [169] [170] [171] After a series of setbacks, the planned scheme was eventually abandoned in 2019. [172]
Social media in the United Kingdom are subject to a number of laws which restrict the range of comments that users can make.
Section 1 of the Malicious Communications Act 1988 criminalises sending another person any article which is indecent or grossly offensive with an intent to cause distress or anxiety (a provision which has been used to prohibit speech of a racist or anti-religious nature). [173] [174]
Section 127 of the Communications Act 2003 makes it an offence to send a message that is grossly offensive or of an indecent, obscene or menacing character over a public electronic communications network. [175] The section replaced section 43 of the Telecommunications Act 1984 and is drafted as widely as its predecessor. [176] The section has controversially been widely used to prosecute users of social media. [177] On 19 December 2012, to strike a balance between freedom of speech and criminality, the Director of Public Prosecutions issued interim guidelines, clarifying when social messaging is eligible for criminal prosecution under UK law. Revisions to the interim guidelines were issued on 20 June 2013 following a public consultation [178] and have been updated since then.
The fact that existing libel laws apply to Internet publishing was established by the Keith-Smith v Williams case of 2006, but the time limit of one year after publication for libel suits does not apply to Internet publishing, because each instance of material being accessed on the Internet is defined as a new publication. As a result, many newspapers and journals do not publish controversial material in their on-line archives due to a fear of potential libel suits. [179] In addition, individuals without the financial means to defend themselves against libel suits can also be reluctant to publish controversial material on-line. With older forms of publishing the media companies themselves had legal responsibility for posts, but with social media such as Twitter it is the users and not their online hosts who have legal responsibility. [180]
Individuals who are defamed online may also not have the financial means to seek legal redress. The UK Ministry of Justice drew up plans in 2008 to give such individuals access to low-cost legal recourse, but these proposals were never implemented. [181] Instead the Defamation Act 2013 (which came into force on 1 January 2014 [182] ) reformed libel law to allow new defences and introduce a requirement for claimants to show that they have suffered serious harm. [183] The intention behind the reform was to make it harder to bring libel suits in Britain. [184]
Exceptions to freedom of speech include prior restraint, restrictions on court reporting including names of victims and evidence and prejudicing or interfering with court proceedings, [185] [186] prohibition of post-trial interviews with jurors, [186] and scandalising the court by criticising or murmuring judges. [186] [187]
The use of social media to comment on a legal case can constitute contempt of court, resulting in the fining or imprisonment of the social media user. This can happen if a trial is seriously prejudiced as a result of a comment, such as a breach of jury confidentiality, resulting in the need for a retrial. [188] It can also happen if the identity of an individual is publicly revealed when their identity is protected by a court. For instance, victims of rape and serious sexual offences are entitled as a matter of law to lifelong anonymity in the media under the Sexual Offences (Amendment) Act 1992, even if their name has been given in court. [189]
There have been a number of instances of users of social media being prosecuted for contempt of court. In 2012 the R v Evans and McDonald rape trial generated more than 6,000 tweets, with some people naming the victim on Twitter and other social media websites. Nine people were prosecuted. [190] In February 2013, the Attorney General's Office instituted contempt of court proceedings against three men who used Twitter and Facebook to publish photographs which allegedly showed the two murderers of the toddler James Bulger as adults. This use of social media breached a worldwide injunction that prevented publication of anything that could identify the pair. [191]
In December 2013 the Attorney General's Office set up a Twitter account to provide advice to individuals using social media. The advice is intended to help individuals avoid committing contempt of court when commenting on legal cases. The professional news media routinely receive such advice. [192]
On 11 August 2011, following the widespread riots in England, British Prime Minister David Cameron said that Theresa May, the Home Secretary, would meet with executives of the Web companies Facebook and Twitter, as well as Research In Motion, maker of the BlackBerry smartphone, to discuss possible measures to prevent troublemakers from using social media and other digital communications tools. [193] During a special debate on the riots, Cameron told Parliament:
Everyone watching these horrific actions will be struck by how they were organised via social media. Free flow of information can be used for good. But it can also be used for ill. And when people are using social media for violence we need to stop them. So we are working with the police, the intelligence services and industry to look at whether it would be right to stop people communicating via these Web sites and services when we know they are plotting violence, disorder and criminality.
Critics[ who? ] said that the British government was considering policies similar to those it has criticised in totalitarian and one-party states. [194] [ better source needed ] And in the immediate aftermath of the 2011 England riots, Iran, often criticised by the West for restricting the Internet and curbing free speech, offered to "send a human rights delegation to Britain to study human rights violations in the country". [195]
On 25 August 2011 British officials and representatives of Twitter, Facebook and BlackBerry met privately to discuss voluntary ways to limit or restrict the use of social media to combat crime and periods of civil unrest. [196] The government was seeking ways to crack down on networks being used for criminal behavior, but was not seeking any additional powers and had no intention of restricting Internet services. [197] It was not clear what new measures, if any, would be taken as a result of the meeting.
The practice of file sharing constitutes a breach of the Copyright, Designs and Patents Act 1988 if it is performed without the permission of a copyright holder. Courts in the UK routinely issue injunctions restricting access to file sharing information published on the Internet. The British Phonographic Industry represents the interests of British record companies and along with the British Video Association encourages UK governments to regulate and legislate to reduce copyright infringement. As a result, the Digital Economy Act was passed in 2010. Further legislation has been suggested, such as the 2014 proposal for a general law to prevent search engines from returning file-sharing websites as search results. [198]
The Digital Economy Act 2010 is the only Internet-specific legislation regarding copyright in the UK. Progress on the implementation of the Act was slow, [199] [200] and in the end, its measures were never passed by Parliament.
The Act had proposed a Code to be drafted by Ofcom and implemented by Parliament, containing provisions restricting the downloading of copyrighted material from the Internet. Under the Act, warning letters would have been sent to Internet users suspected of downloading copyright-infringing material (provided their ISP had more than 400,000 customers), and a customer receiving three such letters in one year would have been recorded by their service provider and could have been subject to a civil claim by the copyright holder under the Copyright, Designs and Patents Act 1988 (the copyright holder having first sought the subscriber's identity using a court order). After these provisions had been in force for a year, additional rules could then have been applied, requiring ISPs to reduce the download speed of repeat offenders and in some cases disconnect their Internet supply. The Act originally allowed the Secretary of State to order the blocking of websites which provided material that infringed copyright, although this section was dropped following the successful use of court orders to block websites. Commentators debate the practicality of such controls and the ability of the UK government to exact control. [201]
It is an established procedure in the UK for rights-holders to use 'section 97A' [202] court orders to require ISPs to block copyright-infringing sites. [203] For instance, court orders obtained by the BPI in October 2013 resulted in the blocking of 21 file-sharing sites including FilesTube and Torrentz. [204] There is a private agreement in principle between leading ISPs and rights holders, made with encouragement from government, to quickly restrict access to websites when presented with court orders. [205] The court orders are not made public [206] and "overblocking" is sometimes reported, such as the accidental blocking of the Radio Times, Crystal Palace F.C., Taylor Swift and over 100 other websites in August 2013. [207] [208]
The practice originated as a result of a court order applied against an incidence of copyright infringement that was taken out by the Motion Picture Association in December 2010 at the request of Hollywood studios. The Association applied for an injunction to block access to NewzBin2, a site which provided a search service for UseNet content, indexing downloads of copyrighted content including movies and other material shared without permission. The application was lodged against BT, the largest Internet service provider in the United Kingdom with around six million customers. It required BT to use Cleanfeed to block its customers' access to the site. [120] In July 2011 the High Court of Justice granted the injunction [209] [210] and in October 2011 BT was ordered to block access to the website within fourteen days, [211] the first ruling of its kind under UK copyright law. [212] The precedent set was described by the Open Rights Group as "dangerous". [213]
BT did not appeal against the ruling and put the required block in place on 2 November 2011. Subsequent attempts to access the site from a BT IP address were met with the message "Error – site blocked". [214] Newzbin released client software to circumvent the BT blocking, [215] using encryption and the Tor network. [216] Newzbin claimed that over 90% of its active UK users had downloaded its workaround software, making the BT block ineffective. However, further court orders resulted in Sky blocking access to Newzbin in December 2011 [217] and Virgin Media blocking access to the site in August 2012. [218] On 28 November 2012 Newzbin announced the closure of its indexing service.
Meanwhile, in May 2012 the High Court ordered the blocking of The Pirate Bay by UK ISPs to prevent further copyright infringing movie and music downloads from the website. [219] [220] The blocks were said to be quickly bypassed and a spokesman for The Pirate Party said public interest in the service following the ban had boosted traffic to the party's website. [221] In December 2012, the British Phonographic Industry (BPI) threatened legal action [222] against The Pirate Party after the party refused demands sent at the end of November to remove their proxy to The Pirate Bay. [223]
In September 2013 an Ofcom survey revealed that 2% of Internet users are responsible for 74% of all copyright-infringing downloads in the UK, and that 29% of all downloads are of content which violates copyright. [224]
In October 2014 the first blocking order concerning trademark-infringing consumer goods was made against the major UK ISPs, following an application by Richemont, Cartier International and Montblanc, requiring the ISPs to block several domains. [225]