Data Protection for Children in the Digital Age: Data Protection Authorities Present Proposals

Data protection authorities are calling for stronger protection of children in the digital space, proposing privacy safeguards and measures against manipulative applications.

(Image: Children on smartphones – TommyStockProject/Shutterstock.com)


On the International Day of Children's Rights, the Data Protection Conference (DSK) is calling for comprehensive improvements in the protection of children on the internet. The goal is to better protect children from the risks of digital data processing, especially in social networks, apps, and online services. The independent supervisory authorities are presenting ten concrete proposals designed to align the General Data Protection Regulation (GDPR) more closely with the digital lives of young people.

Among other things, the Data Protection Conference calls for a general ban on personalized advertising and profiling of minors, privacy-friendly default settings, and a bar on minors consenting to automated decision-making. Under the proposals, risk assessments for data breaches and data protection impact assessments would also have to give specific consideration to children.

Children are particularly vulnerable, online as well as offline, the data protection authorities emphasize. In social networks, digital learning platforms, and apps in particular, minors' data is often processed for commercial purposes without them or their parents understanding the implications. The DSK is therefore pushing for targeted legislative adjustments.

A central concern of the reform proposals: "Advertising based on children's personal or user profiles should – as already in the Digital Services Act – be generally prohibited." With this, the data protection authorities want to put a clear stop to personalized and behavior-based advertising aimed at minors.

Furthermore, the resolution states: "Unlike adults, children should generally not be able to disclose particularly sensitive data such as information about their health, religion, or political opinions." This protection of sensitive information is intended to prevent children from giving ill-considered consent and thereby leaving lasting digital traces that could affect them later in life.


In the healthcare and counseling sector, too, the data protection authorities are calling for adjustments: "Children should be able to use counseling and health services confidentially from a certain age without their parents being automatically informed." This would strengthen young people's right to privacy, especially in sensitive life situations. Pediatricians and professional associations have also pointed out in the past that the long-term storage of sensitive diagnoses – for example, in the electronic patient record – can affect children decades later.

Finally, according to the data protection supervisory authorities, children should "not be subjected to processes in which decisions are made entirely automatically." With this, the DSK clearly positions itself against algorithmic decisions made without human oversight that could affect young people's lives, for example in rating systems, school apps, or online platforms.

"With ten proposals, the DSK aims to specifically strengthen the data protection of young people, for example, through bans on personalized advertising or child-friendly default settings in social networks. These regulations would meaningfully complement the existing protection framework of the General Data Protection Regulation and finally align it systematically with the special needs of children," says Meike Kamp, Berlin Commissioner for Data Protection and Freedom of Information and DSK Chairwoman 2025. The goal is to systematically align the GDPR with the needs and rights of children.

Paulina Jo Pesch, a junior professor at FAU Erlangen‑Nuremberg whose research covers data protection and AI, also warned at a data protection event that Meta uses personal content – including that of minors – to train its language models. Although the company claims not to use children's or adolescents' data, this promise offers little protection in practice: many adolescents create their social media accounts with false ages, so their photos and texts are treated as adult content and may end up in AI training.

Pesch also criticizes Meta for making it complicated and opaque to object to the use of personal data. Particularly vulnerable groups such as children and adolescents lack effective technical protection mechanisms and low-threshold ways to object. The Consumer Advice Centre NRW challenged this in court but was unsuccessful – also because "the court apparently did not know what was actually being trained." The researcher warns of an erosion of legal protection mechanisms under the pressure of the AI hype and calls for stronger enforcement of data protection.

(mack)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.