Data analysis and criminal prosecution: Fairer procedures or risky temptation?

At the Federal Data Protection Commissioner's police symposium, participants discussed their wishes for, and the limits of, automated data analysis and artificial intelligence.

Artificial intelligence: algorithms are already making decisions all over Europe

(Image: whiteMocca/Shutterstock.com)


It is "not helpful", says Louisa Specht-Riemenschneider, if AI laws are rushed through parliament that have to be overturned by the Federal Constitutional Court in two years' time. The new Federal Data Protection Commissioner invited people to discuss the role of AI and data analysis in police work. There should be no doubt that legal requirements are being adhered to. There is a need for planned, legislative action by a level-headed and thorough legislator – a somewhat subtle hint from the independent commissioner to those responsible in the Bundestag and the federal government.

Christoph Krehl, a former judge at the Federal Court of Justice, described the problem from a judicial perspective. Possible applications of AI and algorithmic systems are rightly subject to severe restrictions in Germany under constitutional and European law: judges must be able to reach their own decisions and to understand and explain how the systems work. According to Krehl, the use of AI systems for assistance tasks is permissible under certain conditions, provided the systems can be explained.

Digitalization phenomena cannot be managed with more resources alone, such as additional police officers or public prosecutors, explained Markus Hartmann, Senior Public Prosecutor at the North Rhine-Westphalia Cybercrime Central and Contact Point (ZAC NRW), which is also responsible for data-intensive proceedings in cases of online abuse. The real adversary of the public prosecutor's office is not the complexity of the individual case, said Hartmann, but the sheer volume of cases and the connections between them. Better analysis would generate new leads for further cases: "We are choking on our own success", said Hartmann. Automation and AI are needed above all in the analysis of evidence, but AI could also provide at least some support in highly standardized proceedings such as shoplifting reports. Hartmann said he is in favor of making such developments available as open-source models to ensure verifiability.

For the well-known lawyer GĂĽl Pinar, there is also a lot to be said for the use of automated methods. Proceedings take an extremely long time, and criminal investigation departments often deliver evidence while trials are already underway, which frequently makes pre-trial detention unnecessarily long. At the same time, it must be clear that strict rules apply: "It must be possible for every accused person to understand what has actually been done." She suggested, for example, a right to inspect the software used and its search parameters as part of access to the case files.

Alexander Poitz, Deputy Federal Chairman of the German Police Union (GdP), explained just how wide the gap is between aspiration and reality in the police authorities. The lack of digital capabilities amounts to an indirect protection of offenders, and it starts with a shortage of cell phones. The police, he said, should be able to monitor messengers and carry out online searches. These days the adversary often acts merely as a service provider in the shadowy world of the underground economy, without direct relationships between offenders. Nor is he worried if legal action is taken against a new version of data retention: with IPv6, for example, so many addresses would be available that a system of comparable official identifiers would have to be considered.

Tobias Wiemann, head of the sub-division for legal and policy matters in the Public Security Department at the Federal Ministry of the Interior and responsible for the BKA Act, explained why more powers are planned. There is enormous time pressure when it comes to averting danger, for example in cases of suspected terrorism. When a suspect is arrested, processing seized smartphones alone can take hours: "To this day, the data is not automatically compared, but transferred manually and queried individually." That often means several hundred contacts, and relevant data must be quickly separated from irrelevant data on the basis of police experience. The Constitutional Court has expressly deemed automated analyses to be legitimate. The planned changes to Section 16a of the BKA Act would address three areas: international terrorism, the BKA's security group, and the central office function, in which the BKA would act as a service provider for the other police authorities – within narrow limits.
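To make the kind of automation Wiemann alludes to more concrete, here is a minimal, purely hypothetical Python sketch: assuming contacts have already been extracted from a seized device as name/number pairs, it normalizes the phone numbers and matches them against a watchlist in a single pass, instead of transferring and querying each entry by hand. Every name, data structure, and number in it is an illustrative assumption; it does not describe any actual BKA system.

```python
# Hypothetical sketch: bulk-matching contacts from a seized device against a
# watchlist, instead of querying each entry individually by hand.
import re

def normalize(number: str, default_country: str = "+49") -> str:
    """Reduce a phone number to a canonical form (digits with country prefix)."""
    digits = re.sub(r"[^\d+]", "", number)   # drop spaces, dashes, parentheses
    if digits.startswith("00"):              # international prefix 00 -> +
        return "+" + digits[2:]
    if digits.startswith("0"):               # national number -> assumed country code
        return default_country + digits[1:]
    return digits

def match_contacts(device_contacts: list[dict], watchlist: set[str]) -> list[dict]:
    """Return the contacts whose normalized number appears on the watchlist."""
    return [c for c in device_contacts if normalize(c["number"]) in watchlist]

if __name__ == "__main__":
    # Illustrative data only.
    watchlist = {normalize(n) for n in ["+49 170 1234567", "0033 6 99 88 77 66"]}
    contacts = [
        {"name": "A", "number": "0170 1234567"},
        {"name": "B", "number": "+49 30 5550000"},
    ]
    for hit in match_contacts(contacts, watchlist):
        print("possible match:", hit["name"])
```

The point of the sketch is simply that a bulk comparison replaces hundreds of manual look-ups with one automated pass over the extracted data.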


The planned retrospective biometric comparison with publicly accessible data from the internet involves two types of data: voice and image data. But what counts as publicly accessible? "Everything on the internet, including the darknet, without you logging into closed user groups", explained Wiemann. The impetus for the new regulation is the case of a long-hidden RAF terrorist: journalists used image search engines on the internet to track her down before prosecutors could. Data protection authorities consider these search engines, which processed the image data without consent, to be based on illegally collected data.

Federal Data Protection Commissioner Louisa Specht-Riemenschneider called on all parties involved to define at an early stage which guard rails would be absolutely necessary for automated processes in police work. She also warned: "The digital public space is not only the space in which we ourselves make data accessible, but also the space that others make accessible." Differentiating between the two, she said, is not possible.

(mma)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.