Sharp reckoning with the ePA: Kelber criticizes security and advertising campaign
Prof. Ulrich Kelber warns of Germany's decline in data protection. He sharply criticizes the security and implementation of the electronic patient record.
Former Federal Data Protection Commissioner Prof. Ulrich Kelber describes the digitalization of the German healthcare system, and the electronic patient record (ePA) in particular, as insecure and opaque. In a lecture at an event organized by the Free Physicians' Association, he accused politicians of squandering trust through superficial advertising and of ignoring fundamental security standards.
Advertising instead of information, and high opt-out rates
Kelber sharply criticized the Federal Ministry of Health's communication strategy for the ePA. Instead of comprehensively informing citizens about risks and the considerations they need to weigh, the ministry is relying on a "pure advertising campaign" that merely promotes the project as "super" and "great." Anyone who only tries "to persuade instead of convince," Kelber said, will not get far in the long run.
This approach leads to a loss of trust, which is already evident in the numbers: at five to ten percent, opt-out rates against the ePA are unusually high for an opt-out system. Moreover, the group of convinced opponents who actively object is almost as large as, or even larger than, the group of convinced proponents who actively use the record. Official usage figures fluctuate significantly, between three and twelve percent. According to Kelber, this also undermines the representativeness of the data, which is supposed to be so important for research. On top of that, the quality of the data, such as billing data, is often insufficient.
Technical deficiencies and unstable operation
Technically, the system is also far from mature. Kelber calculated that the officially stated operational stability of 96 percent amounts to "one hour of downtime per day." These outages most likely occur not at night but under load during practice hours. He criticized Gematik's stance of expressing dissatisfaction while deferring to the responsibility of private service providers. This, he said, is "at least a gap in the system." For a state-controlled project, there must be effective sanctions against unreliable providers.
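Kelber's figure is easy to verify with a back-of-the-envelope calculation, assuming (which the lecture does not state explicitly) that the 96 percent refers to availability averaged over a full 24-hour day:

```python
# Rough check of the downtime estimate.
# Assumption (not from the lecture): the 96 percent figure is availability
# averaged over a full 24-hour day.
availability = 0.96
hours_per_day = 24

downtime_per_day = (1 - availability) * hours_per_day
print(f"Expected downtime: {downtime_per_day:.2f} hours per day")
# Output: Expected downtime: 0.96 hours per day, i.e. roughly one hour
```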
In addition, secure access for insured individuals is systematically made more difficult, and not only by the complicated initial setup. Politicians have contributed to this, among other things by allowing health insurance companies not to send out the PIN for the electronic health card (eGK). "In reality," according to Kelber, they "also let this slide because they don't really want it anymore in the long run." At the same time, the free PIN reset letter for the electronic identity card was abolished for cost reasons, which makes this secure alternative unattractive as well. Instead, users are being pushed towards less secure methods such as biometric login.
Hasty rollout without real testing phases
Kelber also criticized the fundamental development process behind digitalization projects. He demanded that IT systems in healthcare finally be developed to professional standards: "Pilot, evaluate, possibly go back, and only when the evaluation process is complete, then scale, i.e., roll out." Instead, there are hasty rollouts that are declared tests but are not. He cited the ePA test phase in model regions as an example: immediately afterward, everyone was supposed to use the record. Such an approach leaves no time "to check the results" or to correct errors. Given 25 years of neglect, Kelber said, another half year for a high-quality introduction would hardly matter.
Security vulnerabilities and "patchwork" mentality
The core of his criticism concerned massive security issues. The ePA is not secure in its current state. Instead of being closed fundamentally, security gaps are often only "patched" superficially. Kelber complained that the security architecture is not made transparent and that attacks carried out with the "resources of a state" are excluded from testing from the outset. This is unacceptable for a database containing the health data of around 70 million people.
It is particularly alarming that the keys for encrypting health data are held by the operators. Kelber explicitly named the providers here: IBM, which is subject to US law and must hand over data to US security authorities, and the Austrian company RISE, "which no longer receives contracts in Austria from the public sector due to its relationship with Wirecard and the Russian secret services." It is still unclear "whether this has really been completely stopped." That such providers have access to the keys is "not state of the art at all."
Kelber is also critical of the planned implementation of the European Health Data Space (EHDS) in Germany. In particular, he considers the decision to make the Federal Institute for Drugs and Medical Devices (BfArM) the central "Health Data Access Body" to be "unfortunate." The problem is a massive conflict of interest: since the BfArM pursues research interests of its own, it would simultaneously decide on its own research applications. Kelber warned against "someone approving their research applications within one institution." A clear separation of supervision and research, he said, is not unnecessary bureaucracy but essential for building trust.
As further evidence of the erosion of data protection, Kelber cited the planned EU Omnibus Law. This law is "even more dangerous" and an example of hasty legislation. Kelber sharply criticized the legislative process. To his knowledge, the data protection part was written "within five days" – without an impact assessment, without an evidence review, and without any debate or stakeholder participation. The result is a law that dilutes fundamental definitions: for example, data that has been anonymized should no longer fall under the General Data Protection Regulation, even if it could be re-identified by third parties later.
Furthermore, the definition of health data is being diluted. While the treatment data of an oncology patient remains protected, the mere record of her presence at the oncology center, which arises, for example, when booking appointments, would no longer count as health data. Kelber found it particularly critical that AI training is to become a general legal basis for data processing, without further consideration, even for highly sensitive data.
Germany's descent into a "dark force"
While the planned European Health Data Space (EHDS) obliges EU member states to meet a common minimum level of data security, Germany currently falls below it. Kelber concluded his lecture with a grim allusion to J.R.R. Tolkien's "The Lord of the Rings": instead of being the pioneer in data protection it once was, Germany has become "one of the dark forces in Middle-earth" through such projects, actively trying to lower European standards. The journey into the digitalization of healthcare, Kelber said, echoing Bilbo Baggins, is "a dangerous business" in which one must pay close attention to where one's feet are taking one.
(mack)