Olympics: Paris police prefecture approves AI-supported video surveillance
Algorithm-driven video surveillance is now officially permitted on an experimental basis in the Paris metro for the Summer Games. Civil rights activists are protesting.
When the official starting signal for the Summer Olympics sounds on Friday, law enforcement officers will also be watching sports fans, travelers, locals and tourists on public transport through specially equipped electronic eyes. In a decree issued on July 19, the Paris police prefecture approved the experimental use of algorithm-controlled video surveillance in 46 Paris metro stations. The decree follows the Olympic Games law of May 2023, which has already passed constitutional review and sets out the legal framework for the pilot project, initially approved until the end of March 2025. It permits the widespread use of cameras for "intelligent" real-time video surveillance.
With this approach, the French government wants to enable the security authorities to detect suspicious behavior, unattended luggage, people who have fallen, fires, weapons and threatening crowds. The Ministry of the Interior emphasizes that the tools do not perform biometric facial recognition; the algorithms and the associated artificial intelligence (AI) functions are trained to recognize only eight high-risk situations. In recent months, the French security authorities have tested the technology with companies such as Videtics, Orange Business, ChapsVision and Wintics, for example at concerts by Depeche Mode and Taylor Swift and at the Cannes Film Festival. They consider themselves ready for the large-scale use of algorithmic video surveillance, abbreviated VSA in French, during the Olympics.
"We are transforming cameras into a powerful surveillance tool," Matthias Houllier, co-founder of Wintics, told Wired magazine. With thousands of electronic eyes, it is impossible for police officers to react directly to each of them. The system therefore analyzes "anonymous forms in public spaces". The algorithms could, for example, count the number of people in a crowd and people falling to the ground and alert the operators as soon as a certain threshold is exceeded. There is no automatic decision. For Houllier, the approach is a data protection-friendly alternative to controversial facial recognition systems, such as those used at the 2022 FIFA World Cup in Qatar.
Civil rights activists are up in arms against the VSA
According to French Interior Minister Gérald Darmanin, vigilance is required: France faces the "biggest security challenge any country has ever had to face in peacetime". Civil society organizations such as European Digital Rights (EDRi), La Quadrature du Net (LQDN), Amnesty International, AlgorithmWatch and Privacy International have long been mobilizing against the initiative. They fear that the surveillance measures violate international human rights standards such as the EU Charter of Fundamental Rights, putting privacy, freedom of assembly and the principle of non-discrimination at risk. The systems would also inevitably have to record and evaluate people's biometric characteristics and behavior, such as posture, gait, movements or appearance.
Together with other civil rights associations, LQDN has called for protests against the current measures as part of an anti-VSA campaign. The use of AI surveillance in metro stations far from the sports venues in particular "raises questions about the purpose", the organization's legal expert Noémie Levain told Euractiv. She asks: "Are the cameras installed in the République metro station, for example, being used to monitor people going to demonstrations?" Law professor Anne Toomey McKenna likewise warns in Telepolis of opaque, legalized mass surveillance with a high potential for further data analysis and encroachments on fundamental rights. Critics do not believe that the experiment will actually end next spring.
(fds)