| European Union regulation | |
|---|---|
| Title | Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse |
| Text with EEA relevance | |
| Journal reference | |
| Preparative texts | |
| Commission proposal | COM/2022/209 final |
| Status | Proposed |
The Regulation to Prevent and Combat Child Sexual Abuse (Child Sexual Abuse Regulation, or CSAR) is a European Union regulation proposed by the European Commissioner for Home Affairs, Ylva Johansson, on 11 May 2022. The stated aim of the legislation is to prevent child sexual abuse online through a number of measures, including the establishment of a framework that would make the detection and reporting of child sexual abuse material (CSAM) by digital platforms (known by its critics as "Chat Control") a legal requirement within the European Union. [1] [2]
The ePrivacy Directive is an EU directive concerning digital privacy. In 2021, the EU passed a temporary derogation to it (called Chat Control 1.0 by critics) which allowed email and communication providers to search messages for the presence of CSAM. [3] [4] The measure was voluntary and did not affect end-to-end encrypted messages. The purpose of CSAR (called Chat Control 2.0 by critics) is to make such scanning mandatory for service providers and to bypass end-to-end encryption. [3]
Supporters of the regulation include dozens of campaign groups, [5] activists and MEPs, along with departments within the European Commission and European Parliament themselves. Opponents include civil society organisations and privacy rights activists. [6]
The European Commission's Directorate-General for Migration and Home Affairs argues that voluntary actions by online service providers to detect online child sexual abuse are insufficient. They emphasize that some service providers are less involved in combating such abuse, leading to gaps where abuse can go undetected. Moreover, they highlight that companies can change their policies, making it challenging for authorities to prevent and combat child sexual abuse effectively. The EU currently relies on other countries, primarily the United States, to launch investigations into abuse occurring within the EU, resulting in delays and inefficiencies. [7]
Several bodies within the EU claim the establishment of a centralized organization, the EU Centre on Child Sexual Abuse, would create a single point of contact for receiving reports of child sexual abuse. [7] [1] It is claimed this centralization would streamline the process by eliminating the need to send reports to multiple entities and would enable more efficient allocation of resources for investigation and response. [7]
Proponents also argue for the need to improve the transparency of the process of finding, reporting, and removing online child sexual abuse material. They claim that there is currently limited oversight of voluntary efforts in this regard. The EU Centre would collect data for transparency reports, provide clear information about the use of tools, and support audits of data and processes. It aims to prevent the unintended removal of legitimate content and address concerns about potential abuse or misuse of search tools. [7]
Another aspect highlighted by supporters is the necessity for improved cooperation between online service providers, civil society organizations, and public authorities. The EU Centre is envisioned as a facilitator, enhancing communication efficiency between service providers and EU countries. By minimizing the risk of data leaks, the Centre aims to ensure the secure exchange of sensitive information. This cooperation is crucial for sharing best practices, information, and research across different countries, thereby strengthening prevention efforts and victim support. [7]
Groups opposed to the proposal often highlight that it would impose mandatory chat control on all private digital communications, and commonly refer to the proposed legislation as "Chat Control". [8] [9] [10] Civil society organisations and activists have argued that the proposal is incompatible with fundamental rights and infringes on the right to privacy. [11] [12] The proposal has also been criticised as technically infeasible. In Ireland, only 20.3% of the reports received by the Irish police turned out to be actual exploitation material; of a total of 4,192 reports received, 471 (more than 10%) were false positives. [13]
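A minimal Python sketch restating the cited Irish figures as percentages; it uses only the numbers reported above and assumes nothing beyond them:

```python
# Figures for Ireland as cited above ([13]); this only restates them as shares.
total_reports = 4_192       # reports received by the Irish police
false_positives = 471       # reports that turned out to be false positives
confirmed_share = 0.203     # share confirmed as actual exploitation material

print(f"false-positive share:     {false_positives / total_reports:.1%}")  # ~11.2%
print(f"confirmed-material share: {confirmed_share:.1%}")                  # 20.3%
```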
The European Parliament commissioned an additional impact assessment of the proposed regulation, which was presented in the Committee on Civil Liberties, Justice and Home Affairs. [14] The Parliament's study heavily critiqued the Commission's proposal. According to the study, there are currently no technological solutions that can detect child sexual abuse material without a high error rate, which would affect all messages, files and data on a given platform. [15] The study also concluded that the proposal would undermine end-to-end encryption and the security of digital communications, and it highlighted that the proposed regulation would make teenagers "feel uncomfortable when consensually shared images could be classified as CSAM". [15]
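The error-rate concern can be illustrated with a simple base-rate calculation. The sketch below is not drawn from the Parliament's study; the message volume, false-positive rate and prevalence are illustrative assumptions, chosen only to show why even a seemingly low per-message error rate yields very large numbers of wrongly flagged messages once every message on a large platform is scanned.

```python
# Illustrative base-rate estimate: all numbers below are assumptions, not figures
# from the Parliament's study. Even a detector with a very low per-message
# false-positive rate flags an enormous number of legitimate messages when it is
# applied to every message on a large platform.
messages_per_day = 10_000_000_000   # assumed daily message volume of a large platform
false_positive_rate = 0.001         # assumed 0.1% chance of wrongly flagging an innocent message
prevalence = 1e-6                   # assumed share of messages that actually contain CSAM

false_alarms = messages_per_day * (1 - prevalence) * false_positive_rate
true_hits = messages_per_day * prevalence  # optimistically assumes the detector misses nothing

precision = true_hits / (true_hits + false_alarms)
print(f"wrongly flagged messages per day: {false_alarms:,.0f}")   # ~10,000,000
print(f"genuine detections per day:       {true_hits:,.0f}")      # 10,000
print(f"share of flags that are genuine:  {precision:.2%}")       # ~0.10%
```

Under these assumed rates, only about one in a thousand flagged messages would actually contain abusive material, which is the kind of error profile the study warns would affect all communications on a platform.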
The Council of the European Union's Legal Service also criticised the impact of the Commission's proposal on the right to privacy. The Council's legal opinion emphasized that screening the interpersonal communications of all citizens affects the fundamental right to respect for private life as well as the right to the protection of personal data. [16] The Council's legal experts also referenced the case law of the EU Court of Justice, which has ruled against generalised data retention. [17]
The European Data Protection Supervisor (EDPS) together with the European Data Protection Board (EDPB) stated, in a joint opinion, that "the Proposal could become the basis for de facto generalized and indiscriminate scanning of the content of virtually all types of electronic communications", which could have chilling effects on sharing legal content. [18]
In March 2023, a revised version of the proposal was introduced; Germany's Digital Affairs Committee noted that it drew strong opposition from several groups. The new scheme, referred to as "Chat Control 2.0", proposed to implement scanning of encrypted communications. [19] In April 2023, the European Parliament confirmed that it had received messages calling on MEPs to vote against the European Commission's chat control proposal. [20] Citizens expressed concerns that the new legislation would breach data protection and privacy rights.
EU Commissioner Ylva Johansson has also been heavily criticised over the process by which the proposal was drafted and promoted. A transnational investigation by European media outlets revealed the close involvement of foreign technology and law enforcement lobbyists in the preparation of the proposal. [21] This was also highlighted by digital rights organisations, which Johansson declined to meet on three occasions. [22] Johansson was further criticised for the use of micro-targeting techniques to promote her controversial draft proposal, which violated the EU's data protection and privacy rules. [23]
On 14 November 2023, the European Parliament's Committee on Civil Liberties, Justice and Home Affairs (LIBE) voted to remove indiscriminate chat control and to allow only the targeted surveillance of specific individuals and groups under reasonable suspicion. Members of the European Parliament also voted in favour of protecting encrypted communications. [24]
In February 2024, the European Court of Human Rights ruled, in an unrelated case, that a requirement to weaken end-to-end encryption "cannot be regarded as necessary in a democratic society". The ruling underscored the European Parliament's decision to protect encrypted communications. [25]
In May 2024, Patrick Breyer reported that moves were again being made to restore indiscriminate message scanning to the legislation, under the name of "upload moderation". [26]
On 21 June 2024, it was reported that a vote on the legislation had been temporarily withdrawn by the Council of the EU, in a move believed to have been the result of pushback by critics of the proposal, including software vendors. [27] [28]
Internet privacy involves the right or mandate of personal privacy concerning the storage, re-purposing, provision to third parties, and display of information pertaining to oneself via the Internet. Internet privacy is a subset of data privacy. Privacy concerns have been articulated from the beginnings of large-scale computer sharing and especially relate to mass surveillance.
Ylva Julia Margareta Johansson is a Swedish politician who has been serving as European Commissioner for Home Affairs and Sweden's European Commissioner in the von der Leyen Commission since 1 December 2019.
Information privacy, data privacy or data protection laws provide a legal framework for how the data of natural persons may be obtained, used and stored. Laws around the world describe the rights of natural persons to control who is using their data. This usually includes the right to obtain details of which data is stored and for what purpose, and to request its deletion when that purpose no longer applies.
European Digital Rights (EDRi) is an international advocacy group headquartered in Brussels, Belgium. EDRi is a network collective of non-profit organizations (NGOs), experts, advocates and academics working to defend and advance digital rights across the continent. As of October 2022, EDRi is made up of more than 40 NGOs, as well as experts, advocates and academics from across Europe.
The Four Horsemen of the Infocalypse refers to those who use the Internet to facilitate crime or (pejoratively) to rhetorical approaches evoking such criminals.
Axel Voss is a German lawyer and politician of the Christian Democratic Union of Germany who has been serving as a Member of the European Parliament since 2009 and became coordinator of the European People's Party group in the Committee on Legal Affairs in 2017. His parliamentary work focuses on digital and legal topics.
PhotoDNA is a proprietary image-identification and content filtering technology widely used by online service providers.
eIDAS is an EU regulation with the stated purpose of governing "electronic identification and trust services for electronic transactions". It passed in 2014 and its provisions came into effect between 2016 and 2018.
The ePrivacy Regulation (ePR) is a proposal for the regulation of various privacy-related topics, mostly in relation to electronic communications within the European Union. Its full name is "Regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC." It would repeal the Privacy and Electronic Communications Directive 2002 and would be lex specialis to the General Data Protection Regulation. It would particularise and complement the latter in respect of privacy-related topics. Key fields of the proposed regulation are the confidentiality of communications, privacy controls through electronic consent and browsers, and cookies.
The migration and asylum policy of the European Union is within the area of freedom, security and justice, established to develop and harmonise principles and measures used by member countries of the European Union to regulate migration processes and to manage issues concerning asylum and refugee status in the European Union.
Deepfake pornography, or simply fake pornography, is a type of synthetic pornography that is created via altering already-existing photographs or video by applying deepfake technology to the images of the participants. The use of deepfake porn has sparked controversy because it involves the making and sharing of realistic videos featuring non-consenting individuals, typically female celebrities, and is sometimes used for revenge porn. Efforts are being made to combat these ethical concerns through legislation and technology-based solutions.
Regulation of artificial intelligence is the development of public sector policies and laws for promoting and regulating artificial intelligence (AI). It is part of the broader regulation of algorithms. The regulatory and policy landscape for AI is an emerging issue in jurisdictions worldwide, including for international organizations without direct enforcement power like the IEEE or the OECD.
The Digital Services Act (DSA) is an EU regulation adopted in 2022 that addresses illegal content, transparent advertising and disinformation. It updates the Electronic Commerce Directive 2000 in EU law, and was proposed alongside the Digital Markets Act (DMA).
The Digital Markets Act (DMA) is an EU regulation that aims to make the digital economy fairer and more contestable. The regulation entered into force on 1 November 2022 and became applicable, for the most part, on 2 May 2023.
Alin Cristian Mituța is a Romanian politician of REPER who has been serving as a Member of the European Parliament since 2020.
The Data Governance Act (DGA) is a European Union regulation that aims to create a framework to facilitate data sharing. The proposal was first announced within the 2020 European strategy for data and was officially presented by Margrethe Vestager on 25 November 2020. The DGA covers the data of public bodies, private companies and citizens. Its main aims are to safely enable the sharing of sensitive data held by public bodies and to regulate data sharing by private actors. On 30 November 2021, the European Parliament and the Council reached an agreement on the wording of the DGA, and formal approval by those bodies was completed on 30 May 2022.
The Data Act is a European Union regulation which aims to facilitate and promote the exchange and use of data within the European Economic Area.
The Artificial Intelligence Act is a European Union regulation concerning artificial intelligence (AI). It establishes a common regulatory and legal framework for AI within the European Union (EU). It came into force on 1 August 2024, with provisions coming into operation gradually over the following 6 to 36 months.
The Trade and Technology Council (TTC) is a transatlantic political body which serves as a diplomatic forum to coordinate technology and trade policy between the United States and European Union. It is composed of ten working groups, each focusing on specific policy areas. The formation of the TTC was first announced by US President Joe Biden and European Commission President Ursula von der Leyen on 15 June 2021. The early agenda focused primarily on US-EU cooperation in technology, strategic sectors, market access, trade, democratic values and rule of law in the digital world, supply chain resilience, the global trade order and the EU's developing regulatory agenda, such as the Digital Services Act, the Data Act and cloud rules. The TTC was established under the leadership of five co-chairs – European Commission Executive Vice-President Margrethe Vestager, European Commission Executive Vice-President Valdis Dombrovskis, US Secretary of State Antony Blinken, US Secretary of Commerce Gina Raimondo, and US Trade Representative Katherine Tai.
The EU–US Data Privacy Framework is a European Union–United States data transfer framework that was agreed to in 2022 and declared adequate by the European Commission in 2023. Previous such regimes—the EU–US Privacy Shield (2016–2020) and the International Safe Harbor Privacy Principles (2000–2015)—were declared invalid by the European Court of Justice in part due to concerns that personal data leaving EU borders is subject to sweeping US government surveillance. The EU-US Data Privacy Framework is intended to address these concerns.