NRW Data Protection Officer warns against data usage at any cost
NRW reports record data protection complaints. Laws are passed hastily, and the protection of fundamental rights often falls by the wayside, the report says.
(Image: PopTika / Shutterstock.com)
The State Commissioner for Data Protection and Freedom of Information of North Rhine-Westphalia, Bettina Gayk, has presented her 31st activity report, covering the year 2025 – and strikes a clear tone. In Germany's most populous state, complaints reached a historic high, while Gayk warns of a gradual erosion of fundamental rights driven by AI euphoria and hasty legislation. “Data usage is on everyone's lips and the new synonym for progress. However, I want to warn against ignoring the dangers of unbridled data usage,” says Gayk.
Complaints explode – fines close to half a million euros
According to the activity report, around 18,060 submissions reached the authority in 2025 – an increase of roughly 45 percent over the previous record of 12,490 submissions set a year earlier. The rise in individual data protection complaints is particularly striking: they grew from 7,539 to 12,592 cases, an increase of more than 67 percent.
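For context, the growth rates cited in the report can be checked with a quick calculation (the helper function below is illustrative and not part of the report itself):

```python
def pct_increase(old: int, new: int) -> float:
    """Percentage growth from an old value to a new one."""
    return (new - old) / old * 100

# Figures from the LDI NRW activity report, 2024 -> 2025
submissions = pct_increase(12_490, 18_060)  # total submissions
complaints = pct_increase(7_539, 12_592)    # individual complaints

print(f"Submissions: +{submissions:.1f} %")  # about +45 %
print(f"Complaints:  +{complaints:.1f} %")   # about +67 %
```

Both results match the report's rounded figures of "around 45 percent" and "more than 67 percent".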
For Gayk, the figures also prove that data protection has reached the public: citizens want their rights respected, and they value the authority's work and mandate.
AI as a double-edged sword – criticism of NRW security laws
A central theme of the report is the use of artificial intelligence by security authorities. Gayk sharply criticizes the revised police law and the new North Rhine-Westphalia constitutional protection law. In her view, both laws regulate the use and training of AI insufficiently: they account neither for the differing impacts of the various AI applications nor for the problems arising from the underlying data.
The State Commissioner draws a clear distinction: using AI to formulate linguistically clear letters is comparatively uncritical. However, if AI is used to estimate the probability of potential criminal offenses or to flag indications of unconstitutional tendencies, this can have a considerable impact on the privacy of all citizens. Particularly problematic, she argues, is that both laws permit official data sets, including personal data, to be used for AI training if anonymization would likely involve high effort. Gayk warns that large data sets are almost never up to date, and that errors in the source data could, in the worst case, lead to the prosecution of innocent people.
NRW expands police powers despite Constitutional Court ruling
Gayk also refers to Palantir's data analysis software – known in NRW as DAR (Data Analysis and Research). The LDI, the state's data protection authority, fears that US authorities could gain access to police data via the Cloud Act and the Foreign Intelligence Surveillance Act.
Following a clear ruling by the Federal Constitutional Court, North Rhine-Westphalia should have revised its police law and brought it into line with fundamental rights. Instead, the state government seized the opportunity to significantly expand the police's powers for data analysis and the use of artificial intelligence – largely ignoring central points of criticism. Gayk warns that some provisions “do not sufficiently take into account the principle of proportionality.”
Highly intrusive searches of police data
The LDI NRW sees this as a massive intrusion: “highly intrusive searches of police data” would become possible, while the legal hurdles remain insufficient. Particularly problematic is that this effectively undermines the principle of purpose limitation: citizens would lose control over how their data, once collected, is used in the future.
Although the law provides for anonymization, it can be omitted if it is “likely to involve high effort.” Yet AI systems are designed precisely to recognize hidden connections in data – even when obvious identifiers have been removed. “Anonymization is therefore generally difficult,” the assessment states. The risk that individuals can be indirectly re-identified remains high – including for witnesses or victims whose data is stored in police systems.
On data storage, too, the law falls short of constitutional requirements, which call for differentiation by storage purpose. The new NRW regulation, however, does not sufficiently distinguish between purposes: data for threat prevention, documentation, and later use are regulated “in one go” – although they interfere with fundamental rights to different degrees. The consequence: excessive storage periods, a lack of usage restrictions, and an overall unclear legal situation.
Next lawsuit likely
The data protection officer is particularly clear on one point: her objections were completely ignored during the legislative process. “Unfortunately, the LDI NRW could not ascertain any engagement with its concerns,” the conclusion states.
Constitutional Protection: Web Crawling and Camera Access
Regarding the new constitutional protection law, Gayk criticizes the essentially unlimited web crawling, the vaguely defined use of AI-based data analysis tools, the use of constitutional protection data for AI training, and the newly created options for accessing private video surveillance systems. The LDI NRW's statement is available as parliamentary document 18/2863. The camera access in particular caused a stir: expert Prof. Mark A. Zöller called the regulation “completely unacceptable in a state governed by the rule of law.”
300,000 Euro fine against 1N Telecom
The report also contains drastic individual cases, such as that of 1N Telecom, a telecommunications company from NRW that was fined 300,000 euros at the end of 2025. For almost two years, the company had sent personalized advertising letters that appeared to come from a well-known large telecommunications provider. Despite hundreds of complaints and clear data protection violations, the company ignored the LDI's measures and did not change its business practices.
Social Media
Other cases concern the healthcare sector. Nurses have repeatedly been caught exposing care-dependent individuals online in Reels or livestreams – sometimes via Snapchat, sometimes during night shifts. In none of the cases examined was this legally permissible. A doctor's practice published images of a patient created for a breast augmentation simulation without consent – including, by mistake, the patient's clearly legible full name. The image was accessible for ten hours via the practice's Instagram account, which has several thousand followers; the LDI initiated fine proceedings.
A proceeding concerning the online sale of medicines also involved health data. As early as 2019, the LDI NRW had inspected more than ten pharmacies in NRW and found that none of them obtained purchasers' consent for the use of their data when selling prescription-only medicines online. The pharmacies argued that, for non-prescription drugs, no conclusions could be drawn about a customer's state of health. The LDI and the courts see it differently: even order data for non-prescription medicines, combined with name and delivery address, qualify as health data within the meaning of the GDPR.
An insurance “data cartel” also caused a stir at the LDI: almost 40 insurers from Germany and Liechtenstein exchanged customer data for fraud detection via a shared email distribution list, including health data and data of minors. In addition, an online service provider had unlawfully passed on users' location data to third parties for years; the data eventually ended up with a US data broker.
Data breaches: cyberattacks dominate
Reports of data breaches also reached a new record, with 2,844 cases (up from 2,170 in 2024). Cyberattacks were the most common cause at 34 percent, followed by misdelivery (24 percent) and unauthorized disclosure (20 percent). A special audit of 33 university hospitals and clinics revealed that twelve clinics reported not being aware of a single data breach in 2023 and 2024 – which the LDI considers implausible and attributes to flawed internal reporting processes.
Apple, Schools, and Digital Sovereignty
In schools, iPad usage remains a constant topic; concerns regarding iCloud have not yet been fully resolved. For Gayk, end-to-end encryption of data stored in iCloud, with keys not held by Apple, is the most sustainable data protection solution. Until the matter is fully clarified, the LDI advises against using iCloud. The authority also welcomes the cross-state project “telli,” which is intended to let schools use AI language models in a data protection-compliant way via pseudonymized accounts.
Criticism of centralization debate
A recurring accusation from the business community – that data protection supervision is inconsistent – has triggered a debate about centralizing it. Gayk considers the idea misguided: anyone who wants fair, fundamental rights-compliant data processing should not tamper with the federal structure of data protection supervision, but should preserve local access to the review of data processing.
Part of nationwide development
The NRW figures fit seamlessly into a nationwide picture. In Hesse, data protection officer Alexander Roßnagel reported a 58 percent increase in complaints, to 6,070 cases, while in Baden-Württemberg Prof. Tobias Keber recorded a rise of over 90 percent, to 7,673 complaints – a trend that both states' activity reports document in detail. Several state authorities consistently name AI as the central driver: complaints can now be drafted with AI support, and the use of AI systems in administration, policing, and business creates new data protection risks. The reports also share concerns about growing dependence on companies from the USA and China, calls for greater digital sovereignty, and warnings against hasty legislative procedures that inadequately consider fundamental rights.
(mack)