Chat control: Tech giants want to continue scanning despite expired EU rules

Legal basis for indiscriminate scanning of child abuse material expired. Google, Meta, Microsoft & Co. continue controversial practice.


The political tug-of-war in Brussels over voluntary chat control has reached a preliminary end with far-reaching consequences: as of this weekend, the EU lacks a legal basis for the indiscriminate scanning of private communication. The transitional regulation that allowed providers such as Meta, Google, and Microsoft to proactively scan messenger services and emails for depictions of child sexual abuse (CSAM) has expired. The EU Commission and security authorities are now lamenting a protection gap. Civil rights activists, on the other hand, see an opportunity to place child protection on a legally sound foundation.

The reactions to the expiry of the regulation are vehement. EU Home Affairs Commissioner Magnus Brunner called the extension that failed in parliament “difficult to comprehend.” Child protection organizations such as the Internet Watch Foundation spoke of an “egregious political failure.” According to experts, without the exception regulation, automated scans of private messages violate the current ePrivacy Directive. A Commission spokesperson stated: “Without a legal basis, companies are no longer permitted to proactively detect child sexual abuse in private communication.”

However, the major tech corporations are not considering an immediate halt to their surveillance measures. Google, Meta, Microsoft, and Snapchat's parent company, Snap, announced on Saturday in a joint statement that they will “continue to take voluntary measures” to identify such material on their platforms. They warn that the change in law poses “the risk that children worldwide will be less protected from the most heinous harms.”

The quartet is simultaneously calling on the EU institutions to “urgently conclude negotiations on a regulatory framework.” An industry insider told Politico that while the legal situation has become “cloudy,” this does not necessarily make scanning illegal. That assessment directly contradicts the view of the Commission, among others. Before the exception provision came into force, Meta subsidiary Facebook had suspended the scanning of communications at the end of 2020.

Resistance to continuing the fiercely debated practice comes from the Pirate Party and its former MEP Patrick Breyer, among others. They argue that the existing system has primarily produced a “false sense of security.” According to figures from the Federal Criminal Police Office (BKA), almost half of the suspicious-activity reports triggered by US corporations were irrelevant to criminal prosecution. Furthermore, 99 percent of Meta's reports concern already known material, the detection of which does not stop ongoing abuse. The authorities are thus merely overloaded with duplicates.

“The end of indiscriminate chat control is not a setback, but an opportunity for real child protection,” Breyer emphasized. He compares mass surveillance to trying to mop the floor while the faucet is still running. Instead, the focus must be on the source of the evil.

Together with Pirate Party chairwoman Lilia Kayra Kuyumcu, Breyer has presented an action plan that aims to shift the focus towards targeted action. One point is the principle of “delete instead of look away”: the capacities freed up at the BKA, previously tied up by the flood of irrelevant reports, are to be used to actively track down child abuse material in darknet forums and have it removed. So far, law enforcement agencies have often left such material online.

Further key points: apps should be preconfigured so that contact from strangers and the sharing of personal data are made more difficult by default (safety by design). For investigators, targeted surveillance measures ordered by a judge against specific suspects are considered the best instrument. A “classroom set for digital self-defense” is also intended to help students recognize early attempts by potential perpetrators to approach them online (grooming).

Those who have themselves been subjected to sexualized violence also welcome the end of indiscriminate scans. IT expert Alexander Hanff emphasizes how important encrypted and private communication is for victims: “We survivors need privacy, because without it we lose our voice.” He accuses the Commission of having invested millions in algorithms that do not effectively protect children. Direct help for victims, on the other hand, is lacking.

In parallel, discussions about a permanent “Chat Control 2.0” are ongoing in Brussels and could last for months. Until then, the legal situation for tech giants remains precarious. Should they continue their scans without explicit permission, they face not only fines for data protection violations but also new landmark rulings. As early as 2021, former ECJ judge Ninon Colneric stated in an opinion that such interventions disproportionately violate the fundamental rights to privacy and freedom of expression. The debate shows that the fight against child abuse is undisputed; Europe, however, remains divided on how to wage it.

(nie)

This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.