
Chat control: Leaked Commission paper reveals EU mass surveillance plans


A newly revealed Opinion of a European Commission review board on their own colleagues’ upcoming proposal for a ‘Legislation to effectively tackle child sexual abuse’ raises serious concerns about the draft law. Leaked by the French media outlet Contexte and dated 15 February 2022, the Opinion confirms the fears that EDRi and 39 other civil society groups recently raised about the proposal, which could destroy the integrity of private online communications across the EU and set a dangerous precedent for the world.

“Reservations”. “Significant shortcomings”. “Efficiency and proportionality […] not sufficiently demonstrated.” “Options […] are not presented in a sufficiently open, complete and balanced manner.”

It might sound like we are talking about an inquiry into a dodgy business deal or some sort of murky political scandal. But in fact, what the above sentences refer to is a newly-revealed Opinion of a European Commission review board about their own colleagues’ upcoming proposal for a ‘Legislation to effectively tackle child sexual abuse’. The proposal is currently scheduled to be published on 27 April 2022, although further delays to May are likely. The proposal focuses on curbing the online spread of child sexual abuse material.

MEP and civil rights activist Patrick Breyer (Pirate Party) comments:

“The Regulatory Scrutiny Committee exposes the abysses of chat control, namely the fact that the blanket mass surveillance of intimate communications and images violates our fundamental rights according to the European Court of Justice. The fact that the project was finally given the green light can only be explained by massive pressure from the very top. Only a public outcry against chat control can stop Ursula von der Leyen now!”

In meetings, the staff of Commissioner for Home Affairs Ylva Johansson, who leads the file, reassured EDRi that the new law would not contain requirements for generalised scanning, and further that it would not touch encryption. But the findings of the ‘Regulatory Scrutiny Board’ (RSB), which conducted the internal review, tell a very different story:

“The report [on the Legislation to effectively tackle child sexual abuse] is not sufficiently clear on how the options that include the detection of new child sexual abuse material or grooming would respect the [EU] prohibition of general monitoring obligations.”

“In view of the assertion […] about the limitations of available technologies that exist for the use in encrypted communications […] the report should be clearer about the practical feasibility of the policy options and provide reassurance about the effective application.”

“The report should clarify how the options that include an obligation to detect new child sexual abuse material or grooming would respect privacy requirements, in particular the prohibition of general monitoring obligations.”

It follows that the current draft of the legislation, prepared by Commissioner Johansson and her team in DG HOME, contains rules that would force online communications service providers to conduct generalised monitoring of people’s private communications – even those that are encrypted. Furthermore, the Opinion notes that general monitoring is unlawful under EU law, meaning that if the proposal goes forward, the resulting law could be struck down by the Court of Justice.

Moreover, the Opinion indicates that the draft law would also require this generalised monitoring to cover not just material that has been assessed by authorities and confirmed to be unlawful, but also searches for “unknown” images as well as so-called evidence of “grooming”, using notoriously unreliable AI-based tools. We’ve all seen pictures automatically flagged on social media because an AI tool wrongly thought they contained nudity, and we have all suffered the frustration of an important email automatically landing in our spam folder.

These consequences are bad enough – but now imagine that the consequence is not just a lost email, but a report to the police accusing you of disseminating illegal child sexual abuse material or grooming a child. The inevitable result of such technologies would be unthinkable for those who are wrongly accused.

Website on the chat control plans