Statement by the Child Protection Association (Der Kinderschutzbund Bundesverband e.V.) on the Public Hearing of the Digital Affairs Committee on “Chat Control”
Summary of statement found here: https://kinderschutzbund.de/wp-content/uploads/2023/02/Statement-for-the-public-hearing-on-chat-control-on-March-1-2023_DKSB.pdf
The proposal and its implications
The EU initiative sends a clear signal to all EU member states to take stronger action against sexualised violence against children, which we strongly welcome. To implement this important goal, the proposed regulation contains necessary and correct measures, but it goes too far at crucial points.
- Indiscriminate scanning of private communications in messenger services or e-mails, without any concrete suspicion, is neither proportionate nor effective.
- An environment in which freedom of expression and confidential communication are taken for granted is an essential pillar of democracy and participation.
- Indiscriminate scanning also criminalises more children and young people, for example when consensual exchanges of self-generated images between young people are automatically flagged.
- Data protection and child protection must not be played off against each other; children’s rights require both: the right to physical integrity and the right to protected communication.
- Only if children can trust that they are not being constantly monitored can they develop the trust in guardians, teachers and friends that lets them seek help from trusted people when they need it.
- The proposal could lead to millions of lawful message exchanges being wrongly flagged to the authorities.
- Chat control would create a surveillance structure that could be misused for other purposes.
- The proposal also threatens, among other things, professions that are bound to professional secrecy.
- Technology that makes it possible to censor content before it is even sent or uploaded endangers politically active people, journalists and members of LGBTIQ+ communities in (partly) authoritarian countries. This affects children just as much, especially the most vulnerable among them.
- It is not justifiable to use algorithms to abolish the confidentiality of private communication guaranteed by the Basic Law.
Sensible measures for children’s rights online and against cyber grooming:
- Effective age verification, in conformity with fundamental rights (without compulsory identification, without collecting biometric data, without interfering with encrypted communication), working in both directions: hiding content from younger users and preventing older users from accessing services used by children.
- Security requirements and mandatory risk assessments for providers (hosting and social media platform providers)
- High-quality, sensitive moderation of chats
- Pattern analysis to detect groomers in order to block and/or report them
- Easily accessible reporting procedures for children and young people who need help
- Server-side scanning of public platforms: mandatory scanning of material on platforms’ servers and at file hosters (searching for known material via hashes and for new material with AI support); see the first sketch after this list.
- Establish a central authority which, like the US National Center for Missing and Exploited Children (NCMEC), collects data, develops strategies, supports new technical procedures, monitors companies and assists with risk assessments. It must be independent (especially of Europol) and work closely with child protection organisations.
- Invest more in research. Facts, data and figures are needed to put the broad discussion on a reliable basis.
- Strengthen investigative capacities
- Abuse material offered in closed groups, including on the darknet, could best be discovered by enabling investigators to “patrol” online more often; the legal basis for this already exists (e.g. offering artificially generated material as an “entry ticket”).
- Adequate funding for institutions that actively work for the protection of children
- Technical support through quick-freeze procedures (preserving specific connection data on concrete suspicion) and governmental online reporting centres.
- Hold providers accountable (track down/report/delete material, implement transparent protection concepts)
- Prevention and education (e.g. promotion of media literacy)
- Extend the existing exemption once again so that file hosters can scan their servers and report to NCMEC (permitted by a temporary derogation from the ePrivacy Directive). In this way, the BKA, Germany’s Federal Criminal Police Office, receives data from NCMEC for further investigations.
- Pattern analysis for communication platforms (e.g. incident-based investigation when an account contacts several accounts that have previously reported abuse); see the second sketch after this list.
- Make platform operators more accountable: monitor interaction possibilities on services heavily frequented by children, watch for behaviour that points to adult users, and switch a service to a child-friendly mode as soon as the provider’s own systems assume that the user is a child.
- In addition to the imprint, general terms and conditions and privacy policy, offer an easily accessible plain-language explanation of a website’s purpose and background for children, together with pointers to advice and help.
- For providers of discounted “family accounts”, parents and their children would have to make a binding age statement (not changeable afterwards and possibly readable by apps for age-verification purposes).
- Focus on prevention, see also the EU Commission’s BIK (Better Internet for Kids) initiative.
- Deletion of illegal material instead of network blocking
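
To make the hash-based part of the server-side scanning bullet above concrete, here is a minimal first sketch. It assumes a hypothetical list KNOWN_HASHES distributed by a clearing house; deployed systems use perceptual hashes such as PhotoDNA, which also match re-encoded copies, whereas the exact SHA-256 comparison shown here only finds bit-identical files.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hashes of known abuse material, e.g. supplied
# to providers by a clearing house such as NCMEC. Real deployments
# use perceptual hashes (e.g. PhotoDNA) that survive re-encoding;
# SHA-256, used here for illustration, only matches exact copies.
KNOWN_HASHES: set[str] = set()

def file_sha256(path: Path) -> str:
    """Hash a file in 1 MiB chunks so large uploads fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_upload(path: Path) -> bool:
    """Return True if an uploaded file matches known material."""
    return file_sha256(path) in KNOWN_HASHES

if __name__ == "__main__":
    import sys
    for name in sys.argv[1:]:
        verdict = "match: report and delete" if scan_upload(Path(name)) else "no match"
        print(f"{name}: {verdict}")
```

Detecting new, previously unknown material is the much harder machine-learning problem to which the false-positive concerns in the Criticism section below apply.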
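
The incident-based pattern analysis mentioned above could, under purely hypothetical assumptions about the available data, look like this second sketch: an account is escalated to human review once it has contacted several distinct accounts that previously filed abuse reports. The threshold and the data structures are illustrative, not taken from the statement.

```python
from collections import defaultdict

# Hypothetical data: accounts that filed abuse reports, and a log
# of (sender, recipient) contact events on the platform.
REPORTING_ACCOUNTS = {"child_a", "child_b", "child_c"}
CONTACT_LOG = [
    ("stranger_1", "child_a"),
    ("stranger_1", "child_b"),
    ("stranger_1", "child_c"),
    ("friend_2", "child_a"),
]

# Illustrative threshold: escalate only when several independent
# reporting accounts were contacted, to keep false positives low.
REVIEW_THRESHOLD = 3

def flag_for_review(contacts, reporters, threshold):
    """Return senders that contacted >= threshold distinct reporters."""
    reached = defaultdict(set)
    for sender, recipient in contacts:
        if recipient in reporters:
            reached[sender].add(recipient)
    return {s for s, hits in reached.items() if len(hits) >= threshold}

print(flag_for_review(CONTACT_LOG, REPORTING_ACCOUNTS, REVIEW_THRESHOLD))
# -> {'stranger_1'}: a candidate for incident-based human review,
#    not an automatic report to the authorities.
```

Unlike indiscriminate chat scanning, such an analysis only looks at contact metadata after concrete abuse reports, which is why the statement counts it among the proportionate measures.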
Criticism:
- Rejection of the so-called “detection order” described as “chat control”: at the end of an official legal procedure, it would allow the communication of all customers of a provider to be scanned for weeks or months.
- If service providers do not comply with the requirements, the rights of their customers are restricted; applied to the Money Laundering Act, this would mean that the accounts of all customers of a negligent bank are monitored.
- The focus on a technical solution is too one-sided and remains blind to a problem that affects society as a whole.
- Trusting in technology that has the potential for mass surveillance is naïve and ignores the fundamental rights of all people.
- It is a mistake to trust the high hit rates that manufacturers claim for their automated systems.
- The regulation focuses exclusively on dissemination on the internet, not on the actual production of depictions of sexualised violence against children.
- Even for combating dissemination, the measures are inadequate.
- The enormous number of false reports hinders the investigation of perpetrators; see the illustrative calculation after this list.
- Depictions of sexualised violence produced by organised groups are hardly disseminated through the channels covered by this regulation.
- There is a fundamental question of how an AI is supposed to distinguish innocuous from sexualised communication, and with which data sets it would be trained.
- Technology cannot be a substitute but only a support for investigations.
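
The point about false reports can be made concrete with simple base-rate arithmetic. All numbers in the following sketch are assumptions chosen for illustration, not figures from the statement:

```python
# Illustrative base-rate calculation (every number is an assumption):
# even a very accurate classifier produces an enormous absolute
# number of false reports at messenger scale.
messages_per_day = 5_000_000_000  # assumed scanning volume
prevalence = 1e-6                 # assumed share of illegal content
sensitivity = 0.9                 # assumed 90% of illegal content found
false_positive_rate = 0.001       # assumed 99.9% specificity

illegal = messages_per_day * prevalence
legal = messages_per_day - illegal

true_hits = illegal * sensitivity           # ~4,500 per day
false_alarms = legal * false_positive_rate  # ~5,000,000 per day

precision = true_hits / (true_hits + false_alarms)
print(f"True hits per day:    {true_hits:,.0f}")
print(f"False alarms per day: {false_alarms:,.0f}")
print(f"Share of flags that are real: {precision:.2%}")  # ~0.09%
```

Under these assumptions, fewer than one flag in a thousand would point to real material, and each of the roughly five million daily false alarms would still have to be reviewed, which is the mechanism behind the warning that false reports hinder investigations.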