Counter-Terrorism Internet Referral Unit

The Counter-Terrorism Internet Referral Unit (CTIRU) was set up in 2010 by the Association of Chief Police Officers (ACPO), and is run by the Metropolitan Police, to remove unlawful terrorist material from the Internet, with a specific focus on UK-based material. CTIRU works with internet platforms to identify content that breaches their terms of service and requests that they remove it voluntarily. CTIRU also compiles a list of URLs for material hosted outside the UK, which is blocked on networks of the public estate.

As of December 2017, CTIRU was linked to the removal of 300,000 pieces of "illegal terrorist material" from the internet.[1]

Scope

The December 2013 report of the Prime Minister's Extremism Taskforce said that it would "work with internet companies to restrict access to terrorist material online which is hosted overseas but illegal under UK law" and "work with the internet industry to help them in their continuing efforts to identify extremist content to include in family-friendly filters".[2] This would likely involve lobbying ISPs to add the CTIRU list to their filters without the need for additional legislation.

CTIRU holds responsibility for implementing aspects of the Counter-Terrorism and Security Act 2015 and is the custodian of the CTIRU list, a continuously updated list of proscribed websites that are considered under the act to be illegal to access or attempt to access. The list details URLs that, for one reason or another, cannot or will not be removed by ISPs or search engines. It is one of the strategies employed by the Government as part of its drive to implement the "Prevent" duty. As of September 2016, all schools, childcare facilities, and organisations in the UK that provide care or facilities for children under the age of 18 have a statutory duty to ensure their systems cannot be used to access any of these websites, by using firewall technology or service providers that are members of the Internet Watch Foundation (IWF), and by ensuring that the technology prevents access to sites featured on the CTIRU list.
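In practice, the blocking duty described above amounts to checking each requested URL against the list before the request is allowed through. A minimal sketch of how such blocklist filtering might work is shown below; the list entries here are entirely hypothetical, since the real CTIRU list is not published:

```python
# Hypothetical sketch of blocklist-based URL filtering, of the kind a
# school firewall or filtering service might apply to a list such as
# CTIRU's. The list format and all entries below are illustrative
# assumptions, not real CTIRU data.
from urllib.parse import urlparse

BLOCKED_URLS = {
    "http://example.org/banned-page",  # exact-URL entry (hypothetical)
}
BLOCKED_HOSTS = {
    "blocked.example.net",             # whole-host entry (hypothetical)
}

def is_blocked(url: str) -> bool:
    """Return True if the URL matches the blocklist by exact URL or by host."""
    host = urlparse(url).hostname or ""
    return url in BLOCKED_URLS or host in BLOCKED_HOSTS

# A filter would refuse to proxy any request for which is_blocked() is True.
```

Real deployments typically perform this check inside the ISP's or institution's filtering appliance rather than in application code, but the matching logic is of this general shape.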

CTIRU assesses the legality of material before referring it to platforms to consider for removal:

"I underline the fact that any online activity by the three groups under consideration, including Facebook pages and Twitter accounts, has been referred to CTIRU. If it is assessed as illegal — there is a legal test that has to be met — CTIRU will flag it directly to Facebook and Twitter for removal."[3]

CTIRU appears to assess content against UK terrorism legislation:

"All referrals are assessed by CTIRU against UK terrorism legislation (Terrorism Act 2000 and 2006). Those that breach this legislation are referred to industry for removal. If industry agrees that it breaches their terms and conditions, they remove it voluntarily."[4]

Such a notice and assessment would give a platform "actual knowledge" of criminality, meaning that it could no longer rely on the defence available under the E-Commerce Directive for an intermediary "hosting" content without awareness of it. Therefore, although a CTIRU notice requesting the removal of content may ask for compliance on a voluntary basis, it may also mean that the platform can be treated as a "publisher" of the content for the purposes of future legal action if the notice is not complied with.


References

  1. "Islam: Tenets Question". theyworkforyou.com. 7 December 2017. Retrieved 19 April 2018.
  2. "Tackling extremism in the UK: report by the Extremism Taskforce". GOV.UK. 4 December 2013. Retrieved 16 November 2015.
  3. "House of Commons Hansard Debates for 02 Apr 2014 (pt 0003)". Parliament.uk. 2 April 2014. Retrieved 19 April 2018.
  4. "Counter-terrorism: Written question - 30893". Parliament.uk. 14 March 2016. Retrieved 19 April 2018.