IT security expert from KIT: "I'm significantly more nervous about health data"
Health data should be made broadly usable. But experts warn: even military organizations or intelligence agencies regularly lose data.
Moderator Alexa von Busse, Dr. Christoph Kollwitz, Prof. Thorsten Strufe from KIT, and lawyer Prof. Hans Hermann Dirksen discuss the protection of health data at the Anosidat conference.
(Image: Marie-Claire Koch / heise medien)
The use of health data is one of the central future topics for European governments. In the course of establishing common European data spaces – for industry, mobility, finance, and agriculture, for example – the European Health Data Space, which came into force in March, is intended to mark the beginning.
At the Anosidat specialist event, experts from IT, industry, and law discussed the opportunities and risks of large-scale, secure data use – especially in healthcare. A widespread narrative was also questioned, namely that data protection hinders or even endangers medical progress. Prof. Thorsten Strufe, an IT security expert from the Karlsruhe Institute of Technology (KIT), criticized this: "What sometimes scares me a little in this context is hearing statements like: 'Data protection kills people.'"
Experts urge caution with health data
"I asked myself why we are starting precisely with health data, where the sensitivity of the data is greatest. We could first gain experience in other domains – in mobility, weather data, energy consumption – to understand whether and how anonymization works at all," said Strufe, who is also responsible for the EU-funded project SynthiClick (Anonymization and Synthesis of Click Paths and Behavior on the Web). The combination of biometric or long-term collected vital and movement data could enable unambiguous identification of individuals, even if it is formally anonymized. "I am significantly more nervous about health data than about other datasets," said Strufe. "We cannot assume that biometric time series such as heart rate patterns or movement patterns will remain permanently anonymous."
According to Strufe, there needs to be an honest engagement with the technical reality: "We all claim that we anonymize perfectly and that all data is GDPR-compliant; [that] is not really true in 98 percent of cases." He finds it naive to assume that the data is protected securely enough: "Even military organizations or intelligence agencies lose data every year – and we believe that won't happen to us in healthcare?" said Strufe. He does not assume that the Federal Institute for Drugs and Medical Devices is better protected. It is only a matter of time before something happens. Therefore, he demands honesty and transparency from those responsible.
According to Strufe, fully effective anonymization of medical data is currently not possible. There are nevertheless mathematical approaches such as differential privacy, which protect personal information by adding noise to data and thus prevent inferences about individuals. He also warned against blind trust in technologies such as homomorphic encryption or the use of synthetic datasets. The key, he said, is to form aggregates and protect them statistically rather than analyzing individual people.
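The idea behind differential privacy can be illustrated with a small sketch: instead of releasing an exact statistic, a data holder adds calibrated random noise so that no single person's presence in the dataset can be inferred. The example below is illustrative only (the function names and the heart-rate scenario are assumptions, not part of any system discussed at the conference); it shows the standard Laplace mechanism for a counting query.

```python
import numpy as np

def dp_count(values, predicate, epsilon, rng):
    """Differentially private count via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one
    person changes the count by at most 1), so noise drawn from
    Laplace(0, 1/epsilon) suffices for epsilon-differential privacy.
    """
    true_count = sum(1 for v in values if predicate(v))
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical aggregate query: how many patients in a cohort
# have a resting heart rate above 100 bpm?
rng = np.random.default_rng(42)
heart_rates = [72, 88, 104, 95, 110, 67, 101, 79]
noisy = dp_count(heart_rates, lambda hr: hr > 100, epsilon=1.0, rng=rng)
print(round(noisy, 2))  # close to the true count of 3, but noisy
```

Smaller values of epsilon mean more noise and stronger privacy; the point, in line with Strufe's argument, is that only the aggregate is released, never records about individual people.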
Data potential and practical hurdles
For Dr. Christoph Kollwitz, Chief Product Officer at the Leipzig-based company Docyet, the potential of health data is not yet being fully exploited. "The data is usually available, but it is not accessible, not reachable, and not with the right person in the right place to provide decision support," he explained. Lack of structure, incompatible formats, and isolated systems prevent existing information from being used to improve care.
Kollwitz explained how digital applications can help to capture patient needs in a structured way, optimize care pathways, and use health data securely. The goal is individual, data-driven patient care while simultaneously protecting sensitive information.
Legal perspective
Prof. Hans Hermann Dirksen, a lawyer and attorney at Liebenstein Law, sees no fundamental conflict between data protection and data use: "The large-scale use of health data is not a security risk per se, but a design task."
He emphasized that transparency, clear responsibilities, and trust are the prerequisites for socially accepted data use: "The population is generally willing to donate and provide data." Certain preconditions must be met to build trust, exclude misuse, and ensure transparency and traceability. "If you look at the broad consent of the Medical Informatics Initiative – I say: it is completely illegal. But we are not afraid because we think it's good. And something is actually coming about – that's the point," said Dirksen.
For legal clarification, Dirksen referred to the European Health Data Space (EHDS) and the Health Data Use Act (GDNG). These are intended to enable secure research in so-called protected data spaces in the future. Researchers will have access to pseudonymized data without it leaving the data space. The idea of data donation is correct, but the legal basis for it is insufficient. Legally compliant processing can only be carried out in the future via data trustees, as provided for by the Data Governance Act. These are intended to manage datasets, grant controlled access, and ensure the traceability of research.
Waiting for European guidelines on anonymization
Regarding anonymization itself, a uniform legal standard is still lacking. Dirksen pointed out that research institutions and companies have long been waiting for binding guidelines from the European Data Protection Board: "It would be wonderful if the European Data Protection Authority would finally say how it legally defines anonymization. We have been waiting for that for many months."
He wishes for more clarity on when anonymization is legally tenable and what criteria must be applied to conduct research legally. "In terms of effort, it must always be more expensive to decrypt the data than it is ultimately worth," said Dirksen, outlining his approach. He sees this as a realistic benchmark for reconciling data protection and data use – sufficiently secure, but not innovation-hostile.
Trust as the key
Kollwitz called for more interdisciplinary collaboration in the development of data-driven health solutions: "Data protection is often discussed purely from a legal perspective. But we also need developers, doctors, and process experts at the table."
Strufe agreed. Research, technology, and law must share responsibility and deal openly with uncertainties instead of promising perfect security. Only then can the necessary trust be built, which motivates citizens to share data. Data protection is not a research impediment, but the prerequisite for research to remain socially acceptable.
High-Tech Agenda Germany
The use of health data will also be driven forward in the future within the framework of the Hightech Agenda Deutschland (HTAD), which was presented by the federal government in mid-2025. It is intended to secure Germany's technological competitiveness in six key future fields – including biotechnology and artificial intelligence, which are directly linked to health research.
The planned launch event, under the leadership of Federal Research Minister Dorothee Bär, will also address health topics. Priorities include real-world laboratories, the "adoption of the Research Data Act," and the "further development of the Health Data Use Act and its connection to European data spaces to better exploit the innovation potential of data for research, society, and the state," as outlined in the High-Tech Agenda.
(mack)