Police congress: AI as last chance to enforce the monopoly on the use of force

If users reported all online crime, the police and public prosecutor's office would be hopelessly overwhelmed. According to one expert, only AI can help.

Security experts discuss at the European Police Congress.

(Image: Stefan Krempl/heise online)


Thomas-Gabriel Rüdiger, cybercriminologist at the Brandenburg Police University, sees artificial intelligence (AI) as the “last chance” to enforce the state's monopoly on the use of force. Digital crime on the internet is “absolutely normal”, explained the institute director at the European Police Congress in Berlin on Tuesday. One in three people in Germany receive criminal spam and phishing emails every day. In total, there are three times more cases per day in this area alone than the total annual crime statistics.

Even minors repeatedly encounter insults, conspiracy theories, hate messages, unwanted pornography and even sexual abuse, Rüdiger explained. If those affected reported all of this, which hardly ever happens at present, the security authorities would be unable to cope. A large proportion of online crime is also committed from abroad, and only 2 percent of such offenses are currently solved. Without AI, it is therefore virtually impossible for the police to do anything about it.

However, the professor went on to point out that Germany currently lacks the necessary options for reporting crimes and interacting with law enforcement officers. Although there are a total of 16 internet police stations, they are little more than letterboxes: in contrast to a local police station, the facts of a case can hardly be clarified interactively online. Moreover, there is no single point of contact. As a result, one of the police's original tasks is increasingly being outsourced to platform operators, online complaints offices and civil society.

For Rüdiger, however, one thing is certain: it “must be very easy to file an accurate report online”. If this were the case, the reporting rates would increase massively, he believes. However, the law enforcement authorities would not have the necessary personnel to deal with them. In the future, however, it will be possible to use AI to record the corresponding volumes of reports if this is legally secured. ChatGPT, for example, could already operate in voice mode in this sense. In general, AI can automate 40 to 50 percent of police work. For example, it could write investigation files and interrogation protocols.


The downstream problem would then fall to the public prosecutor's office, which would have to process the mass of submissions, the expert noted. Prosecutors would likewise have to rely on AI, and the courts would be next in line. There would be no alternative to using the technology "for digital mass crime". At the same time, the professor warned: "We need to invest massively in digital education." Otherwise, there would be "social distortions".

"We want more support from AI," said Stephan Weh, district chairman of the Berlin Police Union (GdP), picking up the thread. AI could be used to analyze large volumes of data, for biometric facial recognition, for surveillance, or for predicting crime patterns. Many colleagues are currently tied up with guarding property alone, a task in which video cameras could play a greater role. The Senate administration is currently drawing up a framework agreement with employee representatives on the use of AI by investigators in the capital.

The Berlin police currently receive 900,000 reports per year, Weh reported, 200,000 of them via the Internetwache, the force's online police station. Many of these reports are grouped together as "mass crime", yet officers may still have to carry out extensive searches and analyze evidence. Here, AI could take over parts of the work so that human staff can be deployed where there are already concrete leads and evidence. Patrick Pongratz from database specialist Couchbase cited the example of the Federal Criminal Police Office's Insitu app, which allows traces to be recorded digitally at the crime scene.

"Cybercrime is currently driving us," admitted Weh. AI, however, is only as good as the person "who feeds it with data". Since this data is often highly sensitive for the police, data protection must be built in from the start. In the end, the decision rests with humans; AI can only support them.

"AI could commit crimes on its own," warned Rüdiger, pointing to the technology's dangers. The story of a Microsoft Twitter bot that denied the Holocaust on its very first day is already legendary. The principle of legality, which obliges police to prosecute, does not apply here, however, because an AI has no legal personality and no will of its own. The police therefore would not have to, and could not, prosecute such acts directly, the expert criticized. If humanoid robots were equipped with AI, they could also commit physical offenses such as bodily harm. So far, the law probably does not provide the right answers.

"Criminal law is still anthropocentric," confirmed Oldenburg legal scholar Alexis von Kruedener. Current law therefore attempts to assign responsibility to human actors such as manufacturers, programmers, operators, supervisors or "keepers". The AI Act likewise provides for a fundamental rights impact assessment, especially for high-risk systems, and prohibits certain applications such as real-time remote biometric identification. "There is no way around AI in police work either," emphasized Stephan Hempel from Schwarz Digits Defense.

The big question is how the triad of data protection, information security and digital sovereignty can be ensured. Florian Domin from Secunet courted the law enforcement officers present with possibilities such as image manipulation, realistic age progression of wanted persons in cold cases, or building cover identities ("legends") from synthetic data for undercover investigators. If the other side uses deepfakes, the technology also helps to identify them more reliably, he said.

(olb)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.