Automated facial recognition: data protectionists call for stricter regulation

While calls for AI-supported facial recognition to prevent crime are becoming ever louder, data protection experts are once again warning of the dangers.

Is mass surveillance looming? (Image: heise online / mack)


While calls for AI-supported facial recognition to prevent crime are becoming ever louder, data protection experts are once again warning against its widespread use. If the powers of law enforcement authorities are indeed to be extended so that they can search the internet for images and compare them with other images or video recordings, a strict legal framework is needed. "The use of facial recognition systems can constitute a very serious encroachment on the fundamental rights of the persons concerned," concludes the Conference of Independent Data Protection Supervisory Authorities of the Federal and State Governments (DSK).

In the DSK's view, the current regulations do not provide a sufficient legal basis for the use of such systems. How severe the interference is depends on the type of data evaluated, the technology used and the degree of automation. The "scope of the measure" is particularly relevant here, as is the question of how many people are monitored or recorded without cause. Covert use of such monitoring and false suspicions resulting from errors are further problems.

The EU's AI Regulation bans certain applications outright, such as the indiscriminate analysis of faces. "Insofar as the AI Regulation and constitutional law leave the national legislator room for maneuver and it considers such use to be absolutely necessary, it must create specific, proportionate legal bases for the use of facial recognition systems," the resolution states (PDF).

The DSK also refers to guidelines from the European Data Protection Board (EDPB). Facial recognition technology may only be used in "strict compliance with the relevant legal framework and only in cases where the requirements of necessity and proportionality are demonstrably met".

The new Federal Data Protection Commissioner, Prof. Louisa Specht-Riemenschneider, has also expressed concerns about the planned expansion of police powers, as reported by Handelsblatt. The controversial US provider Clearview, for example, which was recently fined by the Dutch data protection authority, should not be used in Germany, she said.

Comparable alternatives, which the police officer and CSU member of the Bundestag Alfred Grob has dubbed "Bundes-Vera" in reference to Palantir's "cross-procedural research and analysis platform" Vera, are not yet available. Specht-Riemenschneider also pointed to serious infringements of fundamental rights if large numbers of innocent people were to be included in such data comparisons.

Bettina Gayk, the State Commissioner for Data Protection in North Rhine-Westphalia, likewise warned that the privacy of innocent people could be severely compromised if facial recognition is used extensively. Gayk emphasized that the balance between security and freedom must be maintained in order to avoid a surveillance state.

Bavaria, meanwhile, is planning to introduce real-time facial recognition for its police force, using existing cameras in public places. Critics warn that such measures jeopardize the presumption of innocence and ignore the systems' susceptibility to error.

(mack)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.