German data protection authorities push for definitive end to chat control

German data protection authorities warn against indiscriminate mass surveillance in the EU and demand the safeguarding of end-to-end encryption for messengers.

Close-up of a smartphone screen showing the icons of various social media apps.

(Image: Primakov / Shutterstock.com)


In the run-up to the fourth trilogue negotiation round between the EU Parliament, Council, and Commission, scheduled for May 11, the Conference of Independent Data Protection Authorities of the German Federal and State Governments (Datenschutzkonferenz, DSK) is sharpening its tone. In a newly published resolution, the authorities appeal to the EU bodies and, in particular, to the German federal government to abandon the plans for chat control once and for all.

In the view of the data protection authorities, the initiative, officially intended to combat the sexual abuse of children, conceals an infrastructure for indiscriminate mass surveillance that would threaten the core of European fundamental rights.

The DSK's criticism targets above all the disproportionality of the measures, which place millions of citizens under general suspicion. The goal of child protection itself is not in question. According to the supervisory authorities, however, private communication via messengers must enjoy the same protection as the classic privacy of correspondence. Intervention should occur only where there is a concrete reason, initiated by the monitored person themselves.

The planned regulation, by contrast, provides for far-reaching detection orders. These could oblige platform operators to scan private messages across the board. Particularly alarming, according to the DSK, is the attempt to undermine end-to-end encryption. This could be achieved, for example, through client-side scanning (CSS), in which content is checked on the end device before it is encrypted.

The debate is gaining momentum due to a new legal development: since the beginning of April, there has been no legal basis in the EU for the “voluntary,” indiscriminate scanning of private communication. A transitional regulation that allowed providers to proactively scan messengers and emails for depictions of sexualized violence against children (CSAM) has expired.

The EU Commission and security authorities have since lamented a protection gap. Civil rights advocates, however, see a historic opportunity: for them, the end of the interim regulation is a chance to put child protection on a legally sound footing without mass surveillance.

Google, Meta, Microsoft, and Snap have nevertheless announced that they intend to continue taking voluntary measures to identify such material on their platforms. At the same time, they urged EU institutions to urgently conclude negotiations on a permanent regulatory framework.


The DSK doubts the effectiveness of the intended surveillance tools. Technical analyses show that detection methods can be circumvented by simple image manipulations. At the same time, harmless files can be deliberately altered in such a way that they are falsely reported as illegal material. This would bring innocent citizens into the crosshairs of investigators. These weaknesses lead to numerous false reports, which in turn tie up valuable investigative resources.
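Both failure modes the DSK describes stem from the coarseness of perceptual fingerprints. The sketch below is a toy illustration, not the algorithm of any real scanning system: a simple “average hash” over an assumed 8×8 grayscale grid, where a minor manipulation of an image already flips a large share of the fingerprint bits and can push a match past a typical distance threshold.

```python
# Illustrative toy, not any real scanner: an "average hash" perceptual
# fingerprint over an 8x8 grayscale grid.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255) -> 64-bit fingerprint."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # One bit per pixel: is it brighter than the image average?
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# A synthetic "original" image and a lightly manipulated copy:
# inverting a single row of pixels already flips many of the 64 bits.
original = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
tampered = [row[:] for row in original]
for c in range(8):
    tampered[0][c] = 255 - tampered[0][c]

distance = hamming(average_hash(original), average_hash(tampered))
print(distance)  # well above a typical match threshold of ~10 bits
```

The reverse direction works the same way: because the fingerprint discards almost all image detail, a harmless file can be deliberately nudged until it collides with a flagged one, producing exactly the false reports the DSK warns about.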

The supervisory authorities back this up with figures from 2023 that illustrate the imbalance: Microsoft, for example, scanned over 11.7 billion pieces of content worldwide, yet only 0.00007 percent of the communications reviewed in Europe led to a concrete suspicion. The error rates of the systems used were in some cases as high as 20 percent.

The EU data protection authorities have likewise already stated that such instruments do not satisfy the principles of necessity and proportionality. The burden of proof for the appropriateness of such a severe intervention lies with the legislators, and according to the DSK, that proof has not been provided in all these years. More effective, it argues, would be the consistent enforcement of existing laws such as the Digital Services Act (DSA), combined with an obligation for platforms to design their services to be child-safe from the ground up, following the “Safety by Design” principle. This must be accompanied by adequate funding for prevention, media literacy, and victim support.

(vbr)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.