Data protection amendment: Police & activists support banning facial recognition

There was a surprising consensus at a Bundestag hearing: biometric surveillance in public spaces should be banned. Scoring rules are controversial.


At a hearing in the Bundestag on the federal government's draft bill for a first amendment to the Federal Data Protection Act (BDSG), much attention went to a point that is not yet included in the bill: a ban on biometric surveillance, for example through automated facial recognition in public spaces. Matthias Marx from the Chaos Computer Club (CCC) and Simone Ruf from the Gesellschaft fĂĽr Freiheitsrechte (GFF) called for such a requirement to be added. Surprisingly, Eike Richter, constitutional law expert at the Hamburg Police Academy, also made it clear that he had no constitutional objections to such a ban. The legislator could even be obliged under constitutional law to include such a clause.

During the negotiations on the EU's new AI regulation, live facial recognition was considered a particularly hot topic. The EU Parliament initially called for a general ban on biometric mass surveillance, while the member states were against it. The final version stipulates that real-time identification should be possible "for a limited time and place", in particular for the targeted search for victims of kidnapping, human trafficking and sexual exploitation or to prevent "a specific and present terrorist threat".

The traffic light coalition agrees that it does not want to exploit the remaining loopholes in the AI Act for biometric surveillance, but rather to restrict instruments such as automated facial recognition. In Germany, such forms of remote identification are, for the police, in fact "already prohibited as long as it is not permitted", Richter explained. At present, however, this is apparently an "ineffective reservation". If that reservation does not hold, an explicit ban would be preferable, given the interference with the right to informational self-determination that the instrument entails. The police law expert cited the ban on torture as a parallel. The AI Regulation primarily addresses the regulation of technical systems; a statutory prohibition, however, could "remain broader" and possibly cover the private sector as well.

Marx from the CCC pointed to the covert surveillance technology PerIS, which was developed on behalf of a police department in Saxony and is now also being used by law enforcement agencies in other federal states. In doing so, the police have evaded democratic control; in Saxony, for example, PerIS only became more widely known following a parliamentary inquiry. "There is a lack of coercive means vis-Ă -vis the authorities," said Marx, who also backed the call by the Conference of Federal and State Data Protection Supervisory Authorities (DSK) for the power to impose fines on public administration.

Biometric surveillance systems lead to ubiquitous recording in public spaces, the hacker pointed out: every step is captured and, thanks to unique physical identification features, can be analyzed in detail. Anyone who feels monitored might then decide, for example, not to take part in a demonstration. This development must be stopped, he argued.

The BDSG is particularly well suited to anchoring a ban on biometric remote identification systems, added Ruf from the GFF. It already contains rules on the processing of biometric data, and it applies to private companies, to public bodies of the federal government and, on a subsidiary basis, to those of the federal states. Moreover, automated facial recognition is by no means free of discrimination and "often identifies non-white people incorrectly", which also makes police work more difficult. And if collected biometric data is accessed without authorization, the affected characteristic can no longer be changed.

Another point of contention: the government also wants to stipulate that data such as home address, name or details from social networks may no longer be used to assess consumers' ability to pay by means of scoring. Probability values may only be created or used if the personal data involved is "not processed for any other purposes". Johannes MĂĽller from the German Federation of Consumer Organizations (vzbv) called it "important that certain categories are not used". Otherwise, users might come to see their behavior on social media or even their address itself as levers for raising their score.

The incoming Federal Data Protection Commissioner Louisa Specht-Riemenschneider praised the planned restrictions on automated decisions based on scoring by Schufa & Co. as a "balanced proposal". However, this alone will not be enough to get unbalanced scoring practices under control. "The elephant in the room" are payment service providers such as PayPal or Klarna, to whom the rules do not apply. The Bonn law professor therefore advised also implementing Article 18 of the Consumer Credit Directive so as to cover these players. Like the current Federal Data Protection Commissioner Ulrich Kelber and the Hessian Data Protection Commissioner Alexander RoĂźnagel, Specht-Riemenschneider also spoke out in favor of deleting the planned restrictions on the right to information that are meant to protect trade and business secrets.

In contrast, Passau-based constitutional law expert Meinhard Schröder considered the scoring bans too broad. A "total ban on special categories" of personal data such as ethnic origin, biometric characteristics and health data is not covered by the General Data Protection Regulation (GDPR), he argued. The exclusion of social networks also goes too far: a Facebook page, for example, is "more open" than a normal website. Munich law professor Boris Paal likewise warned of "misunderstandings and legal uncertainties" arising from the restrictions on automated decisions, which he sees as necessary in principle but overshooting the mark.

Experts also expressed opposing views on the government's plan to institutionalize the DSK in the BDSG. Schröder saw "no legislative competence of the federal government" here. Gregor Thüsing, director of a law institute at the University of Bonn, on the other hand, encouraged the MPs to continue on the "good interim path" they had taken. So far, however, the DSK reform amounts to "little more than white ointment": without substantial provisions such as the establishment of a permanent office, it remains a placebo. A law, however, is not needed for that. Richter and Paal also argued that the DSK should be able to coordinate data protection issues more coherently and be given a firmer organizational footing.

As DSK chair, RoĂźnagel championed the body's demand that support from a permanent office also be enshrined in law. Only in this way could the committee become independent of a federal state "that may not want to participate". Ultimately, the aim is to further strengthen the existing voluntary cooperation and to carry over the knowledge of the annually changing chairs. Specht-Riemenschneider asked the parliamentarians to "never again complain about an inconsistent interpretation of data protection law" if they do not strengthen the DSK.

(mki)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.