EU Parliament’s Research Service confirms: Chat control violates fundamental rights
Today, the European Parliament’s Research Service (EPRS) presented a new study on the legality of the proposed Child Sexual Abuse / Chat Control Regulation to the European Parliament’s lead Committee on Home Affairs (LIBE). The legal experts conclude that “when weighing the fundamental rights affected by the measures of the CSA proposal, it can be established that the CSA proposal would violate Articles 7 and 8 of the Charter of Fundamental Rights with regard to users. This violation of the prohibition of general data retention and the prohibition of general surveillance obligations cannot be justified.” And also: “A detection order on the content of interpersonal data either on the device or the server will compromise the essence of the right to privacy under Article 7 CFR in the form of confidentiality of telecommunications. It constitutes a form of access on a generalised basis, pursuant to Schrems, where it involves an analysis of all communications going through the server.”
The experts made clear that an “increase in the number of reported contents does not necessarily lead to a corresponding increase in investigations and prosecutions leading to better protection of children. As long as the capacity of law enforcement agencies is limited to its current size, an increase in reports will make effective prosecution of depictions of abuse more difficult.”
In addition, the study finds: “It is undisputed that children need to be protected from becoming victims of child abuse and depictions of abuse online… but they also need to be able to enjoy the protection of fundamental rights as a basis for their development and transition into adulthood.” It warns: “With regards to adult users with no malicious intentions, chilling effects are likely to occur.”
In order to align the proposal with fundamental rights and make it court-proof, the experts recommend: “It should be noted that when the CSA proposal would address the above observations and would require detection orders to also be specific with regards to the group of individuals to be monitored, the detection of known material could be considered specific enough so as not to violate the prohibition of general monitoring obligations (for internet access services and hosting services) and would comply with communications secrecy (for interpersonal communication). Technically, it could be feasible to program detection technologies for known material to monitor only the exchanges of a particular type of group, thereby, preventing overly wide detection orders in terms of affected users. Such groups could for instance be members of a forum or chat group (where previously CSAM was exchanged).”
After the presentation of the study, critical questions on the proposal were voiced by Members of almost all political groups, including Sven Simon (EPP), Paul Tang and Birgit Sippel (S&D), Moritz Körner (Renew), Patrick Breyer (Greens/EFA) and Swedish members Alice Kuhnke (Greens/EFA) and Charlie Weimers (ECR). The Commission representative was harshly criticised for admittedly not even having read the study.
Pirate Party MEP Patrick Breyer, shadow rapporteur (negotiator) for his group in the Civil Liberties Committee (LIBE) and long-time opponent of mass scanning of private communications, comments:
“The EU Parliament’s Scientific Service now confirms in crystal clear words what I and numerous human rights activists, law enforcement officials, legal experts, abuse victims and child protection organisations have been warning about for a long time: the proposed general, indiscriminate scanning of our private conversations and photos destroys the digital privacy of correspondence and violates our fundamental rights. A flood of mostly false suspicious activity reports would make effective investigations more difficult, criminalise children en masse and fail to bring the abusers and producers of such material to justice. According to this expert opinion, searching private communications for potential child sexual exploitation material, known or unknown, is legally feasible only if the search provisions are targeted and limited to persons presumably involved in such criminal activity.
I think negotiators understand that if we give in to the impulse, with the best of intentions, to do everything possible, but fail to respect the legal limits imposed by fundamental rights, the detection provisions will be struck down by the Court of Justice altogether. We would then be left with nothing and achieve nothing to better protect children and victims. This disaster must be avoided at all costs. No one is helping children with a regulation that will inevitably fail before the European Court of Justice.
What we really need instead of untargeted chat control and identification obligations for age verification is obliging law enforcement agencies to have known exploitation material removed from the internet, as well as Europe-wide standards for effective prevention measures, victim support and counselling, and for effective criminal investigations.”
Update: The draft study has since been published.