Digital Services Act: Civil Liberties Committee pushes for digital privacy and free speech
Today, the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs (LIBE) adopted its Opinion on the EU’s upcoming Digital Services Act.[1] It is the first Committee to adopt a position on the proposed legislation. Rapporteur Patrick Breyer (Pirate Party) explains:
“We have come up with clear and sometimes ground-breaking answers to the monopolisation of the Internet by a few large corporations whose business model is based on spying on and manipulating users for advertising revenues, and which results in censoring and de-platforming users at will. We want the Digital Services Act to be a turning point for the protection of our fundamental rights in the digital era!”
Key proposals put forward by the Committee are:
Right to privacy
- Anonymous use of digital services: The Digital Services Act would provide for anonymous use of and payment for internet services wherever reasonably possible. This would prevent data scandals, identity theft, stalking and other forms of misuse of personal data, as seen most recently in the Facebook data breach that affected 500 million people. Government access to records of online activities would be permitted only for tackling serious crime and security threats. Indiscriminate data retention requirements would be banned.
- Contextual instead of surveillance-driven advertising: Behavioural and personalised targeting for non-commercial and political advertising would be phased out and replaced by contextual advertising, to protect users and safeguard the existence of traditional media. The same would apply to targeting people based on sensitive data and to targeting minors. Behavioural and personalised targeting for commercial advertising would only be possible where users have freely opted in, without exposure to “dark patterns” or the risk of being excluded from services, and without being fatigued by consent banners if they have already made a clear choice in their browser/device settings.
- Right to encryption: Public authorities would not be allowed to restrict end-to-end encryption, as it is essential for online security.
Right to freedom of expression and information, media freedom
- No internet blocking: Internet access providers would no longer be obliged to block access to content. Illegal content would be deleted where it is hosted.
- Rein in error-prone upload filters: Online platforms would only exceptionally have the right to use error-prone algorithms for ex-ante control in order to temporarily block manifestly illegal and context-insensitive content, subject to human review of every automated decision. Algorithms cannot reliably identify illegal content and currently routinely result in the suppression of legal content, including media content.
- Independent judiciary to decide: To protect freedom of expression and media freedom, the decision on the legality of content would be in the hands of the independent judiciary and not administrative authorities taking political orders, or corporations. Platforms would be allowed to interfere with the free exchange of lawful information only where it is incompatible with the declared purpose of their service.
- No foreign removal orders: Content published legally in one country would not have to be deleted because it violates the laws of another EU country. This protects against “illiberal democracies” like Hungary and Poland taking down content published elsewhere. The effect of cross-border orders would be limited to the territory of the issuing Member State.
- No mandatory account suspension: Providers would not be forced to suspend users for posting allegedly illegal content, as such an obligation would circumvent the sanctions laid down in law and the requirement of a court decision.
- Better making of soft law: The Committee wants to boost civil society participation, transparency and accountability when so-called “soft law” (codes of conduct) is developed.
Tackling illegal and problematic content
- User control over content recommendations: Users would have to opt in to a personalisation of the information presented to them (e.g. in the timeline) and would also be able to switch off the platform algorithms for proposing content altogether. This will curb the profit-driven spread of problematic content such as misinformation, conspiracy theories or hate speech.
- Unsafe products: A special regime would apply to traders unlawfully promoting or offering products or services in the Union.
- Protecting victims’ rights: Complaints procedures would also be available to notifiers, such as victims of crime, whose notification of allegedly illegal content has not been acted upon. Victims would also be able to apply for interlocutory injunctions to have illegal content removed swiftly.
Background:
All compromises proposed by rapporteur Patrick Breyer (Pirate Party) were approved, most of them in a “block vote” with 42:19:1 votes.[2]
In a separate vote, adding interoperability requirements for online platforms to the Digital Services Act in order to allow for cross-platform interaction failed to secure a majority of Committee members, with the EPP, S&D and Renew political groups voting against.
After the General Data Protection Regulation, the Digital Services Act (DSA) is considered to be the next major project for regulating digitalisation at EU level: The Act is to replace the e-Commerce Directive, which has been in place since 2000, and thus establish fundamental new rules for digital platforms. After three resolutions of the European Parliament, the EU Commission presented its legislative proposal in December 2020.
Read on:
Breyer’s website on the Digital Services Act negotiations: https://www.patrick-breyer.de/en/posts/dsa/
[1] Full text of LIBE Opinion adopted today: https://www.patrick-breyer.de/wp-content/uploads/2021/07/LIBE-DSA-Opinion-Final-clean-revised.docx