Researcher: Bodily integrity should also apply to the digital space
The healthcare data market is set to grow to over 500 billion dollars by 2030. A paper shows why data protection awareness is so important.
(Image: Andrey Suslov/Shutterstock.com, editing: heise online)
The digital collection and use of body data has developed into a booming market that is expected to grow to over 500 billion dollars by 2030. This is according to a paper by Mozilla researcher Júlia Keserű. At the same time, the associated risks have increased dramatically; according to Keserű, health-related cyberattacks in the USA alone have risen by almost 4,000 percent since 2010, and health data on the darknet has now exceeded the value of credit card data. This development is the focus of the current study "Skin to Screen: Bodily Integrity in the Digital Age" by Keserű.
Keserű's research shows how the mass collection of "body-related data" – from fingerprints and fitness trackers to digital health data – harbors considerable risks: data leaks, surveillance, discrimination, and exploitation by AI systems. Particularly problematic is the role of data brokers who trade in sensitive health and biometric data without users' consent.
As a solution, she proposes the "Databody Integrity Framework" – a holistic approach that aims to protect digital information about the body and psyche with the same human rights standards as physical existence. After all, people have an interest in the protection of their most sensitive data. The framework contains concrete recommendations for action at various levels:
- Redefining sensitive data in data protection laws
- Extension of health data protection laws to all health-related information
- User-friendly consent mechanisms
- Preference for platforms with strong privacy protections
(Image: Keserű)
In this interview, we talk to Júlia Keserű about the details of her research and the urgency of better protecting the "digital body", as well as the goals of her framework.
heise online: What was the motivation for your research?
Júlia Keserű: I've been working at the intersection of emerging technologies, human rights, and social justice for the past 15 years. One challenge I've noticed is how difficult it is to develop narratives around privacy that resonate with everyday people who may not be engaged with these issues. Privacy remains abstract for most; many don’t understand its relevance to them or their loved ones.
The goal of my project was to create compelling, data-driven narratives around why privacy matters. I began writing essays on what I termed "the unwanted touch of the digital age," discussing how intrusive technologies are becoming mainstream and integrated into our lives without us fully questioning their long-term implications. I've also observed the rise of AI and other emerging technologies since the pandemic, prompting me to research and gather evidence about the rise of what I call body-centric data collection.
How do you think public awareness and understanding of these issues can be improved?
While writing these essays, I noticed that these stories resonate with people. I was trying to convey that the harms associated with body-centric data collection can affect anyone, including your daughter or someone you love. This is no longer an issue limited to vulnerable communities; it impacts everyone. As parents grow increasingly concerned about technology's role in their children’s lives—especially with smartphones becoming everyday accessories—this message has struck a chord with many who might not have been engaged before.
Ten years ago, these concerns seemed like science fiction. However, in recent years, I've observed a shift: the issues we face are no longer hypothetical. We have clear evidence of real harm from body-centric data collection occurring across various areas, prompting more people to reflect on the importance of privacy.
What do you think about the European Health Data Space?
I find it quite alarming and confusing, and we likely won't have a clear understanding of the broader impact for a while. The European Health Data Space (EHDS) seems to contradict the spirit of the data protection regulation by mandating a reckless amount of data collection instead of minimizing the information these systems collect.
The tech industry is aggressively pushing forward, pouring significant resources into innovation, and politicians and policymakers are enthusiastic about the potential of delivering better services for their constituents. However, discussions often overlook the downsides beyond privacy concerns. Users and patients are left questioning why they should trust these systems, who will have access to their health data beyond their treating doctors, and what level of cybersecurity support—both educational and financial—implementing institutions will receive. While the research aspect is commendable, it is very unclear what rigorous standards and transparent consent mechanisms will be implemented to ensure that research is conducted ethically and with fully informed participation.
Innovation is great, but who gets to decide what price we pay? AI software, which is a key part of the plans, is also evolving, but privacy standards are pretty low and people don't care. They say they have nothing to hide. But if your child has any kind of pre-existing condition, that is going to increase their insurance costs, and if that information gets leaked, it will affect their whole life. The data will be there, and it may be easy to re-identify the person behind it.
We should slow down a little and regulate the industry. Otherwise, it will surely cause a lot of harm to our fundamental rights. But people don't understand the possible danger. When you are trying to convince lobbyists and politicians who feel safe in their own world, it's really hard to get the message through. Massive data collection makes the lives of people who are already struggling even worse.
You are proposing a framework. Can you tell us more about it?
The framework draws on existing human rights documents, like the European Charter of Fundamental Rights, the International Covenant on Civil and Political Rights, or the Helsinki Declaration. My goal was to concretely explore how we might build on these conventions to apply the right to bodily integrity within the online space. A key element of this framework is the concept of "data body integrity," which I developed to refer to the inviolability of individuals’ online personas and their right to control the handling of data that reflects their unique physiological and psychological characteristics.
It’s crucial to recognize that the standards governing the treatment of human subjects in experimentation have not yet been adapted to the online environment. While sharing data is vital for scientific advancement, rigorous consent processes are essential for any research project – for instance, when mobile health companies share information about our fitness routines or mental health with research institutions. Our data are no longer merely abstract data points; they are an extension of our physical selves, and any harm to that data reflects harm to us in real life. Through this framework and the concrete recommendations I provided for policymakers, civil society organizations, technology leaders, and individual users, I aimed to illustrate these connections and highlight the importance of treating bodily integrity with greater seriousness in the digital context.
(mack)