Police data analysis: "The more invasive the technology, the higher the hurdles"
Excessive police data analysis jeopardizes fundamental rights. Why the GFF is taking legal action against the use of Palantir's analysis software in Bavaria.
In the dispute over the use of data analysis software by the police, courts and civil society have repeatedly raised constitutional concerns. We spoke with Franziska Görlitz from the Society for Civil Rights (GFF), which recently filed a constitutional complaint against the use of Palantir's cross-procedural search and analysis platform in Bavaria. The GFF had already lodged a successful complaint against Hessendata.
The topic of police data analysis has been discussed for years. Meanwhile, manufacturers of European alternatives are waiting in the wings, claiming that their tools can be used in a constitutionally compliant manner. But is that even possible?
Theoretically yes, but not in the way it is currently being handled in practice. We are not taking legal action against the manufacturers themselves, but against the respective laws that enable the police to carry out automated data analysis. In Hesse, North Rhine-Westphalia and Bavaria, the legal basis does not meet the constitutional requirements. Legislators should set clear limits: for example, how much data may be fed into such systems, how errors are to be avoided, how discrimination is to be ruled out, and how effective oversight is to be ensured. All of this is currently lacking.
In 2023, the Federal Constitutional Court ruled that data analysis can, in principle, be constitutional.
That's right. The court affirmed in principle that such analysis is possible. But at the same time, it emphasized that the more comprehensive and complex an analysis tool is, the stricter the protective mechanisms must be. The more invasive the technology, the higher the hurdles. Especially when it comes to artificial intelligence, there is no clear case law yet, so much remains open.
Who is particularly likely to end up in such analyses?
What is particularly dangerous is that even trivial incidents can land you in these systems. If you witness a traffic accident or report a stolen bicycle, your data ends up in the police case management system. This data is then regularly fed into the analyses without any major hurdles. As a result, people who have never given cause for police action themselves can be recorded.
People who have a lot of contact with others who frequently appear in police files are particularly affected. Think of social workers who support football fans, for example, or educators in facilities for troubled teenagers. Due to their profession, they are constantly in contact with people who have had multiple run-ins with the police. Although these people are not doing anything wrong, they are highly likely to end up in data analyses because of their work. The same applies to journalists and criminal defense attorneys, who regularly speak to people who appear in investigations. It is precisely these groups, which fulfill an important social function, that are at risk of suddenly finding themselves caught in the net of analysis software.
Are your concerns also being heard in politics?
Yes, we are actively involved in the legislative process. Most recently in Baden-WĂĽrttemberg and Saxony-Anhalt, where we submitted statements. It is important to us not only to complain after the fact, but to point out problems in advance and demand possible corrections.
Is there now any legally sound basis for the use of such tools anywhere?
In our opinion, no. Current laws allow too much data to be included in the analysis under lax conditions, and the tools used are too powerful. At the same time, there is a lack of effective safeguards against errors and discrimination.
Are these weaknesses due to ignorance or rather to indifference?
One cannot speak of ignorance. Since the first ruling at the latest, the basic standards have been known to every legislator, even if the details have not yet been clarified. The explanatory memoranda to the laws also refer to this explicitly. But instead of acting cautiously and in accordance with fundamental rights, legislators regularly attempt to push as close as possible to the outer limits, which leads to those limits being exceeded.
So why are purchases still made?
One example is Baden-Württemberg, where it was argued that the purchase had to be made at very short notice "because of a price fixing period". This clearly shows the danger of presenting a provider as having no alternative. Although competitors exist, the assumption is that only this one provider can deliver. This creates a dependency that ultimately deprives the state of its negotiating power. And the longer you stick with such a specific software solution, the more difficult it becomes to break away.
Does it help that journalists sometimes have the systems demonstrated to them?
No. For one thing, you can never be sure whether the latest version is being shown or a slimmed-down one. For another, the data protection supervisory authorities have not yet tested the software in real operation in any of the federal states. The results of the only known review, in Bavaria, are secret, and it was a one-off, even though the software receives regular updates. As such systems are constantly being changed and updated, this test cannot rule out the risk of data leaks, errors or manipulation.
What is your conclusion?
The Federal Constitutional Court has affirmed that data analysis by the police can be legally permissible, but only under very strict conditions. At present, these are not being met anywhere. It is particularly problematic that people who work professionally in protection, education, or social work can get caught up in the maelstrom due to their proximity to people who have attracted police attention.
In reality, however, every person runs the risk of being unlawfully swept up in these data analyses. Politicians massively underestimate this. Mass data analysis, especially by artificial intelligence, is prone to errors and leads to discriminatory results; it therefore poses a great danger to fundamental rights.
(mack)