Youth Protection 2025: Sharp Increase in Reported Abuse Material Online
The FSM hotline's annual statistics show a worrying increase in illegal content and challenges from AI-generated abuse material.
The digital world holds numerous dangers for minors, as confirmed by the current annual report of the Voluntary Self-Regulation for Multimedia Service Providers (FSM). With a total of 28,598 incoming reports, the organization's online complaint office recorded the second-highest number of notifications in 2025 since its founding in 1997. It is not just the sheer volume that stands out, but also the hit rate of the reports: in 74 percent of cases (21,072 reports), a comprehensive case-by-case legal review confirmed actual violations of German youth protection law.
This means the rate of justified complaints rose by six percentage points compared to the previous year (68 percent). The FSM experts attribute this primarily to a massive increase in the area of child sexual abuse material (CSAM).
According to the statistics for 2025, published on Friday, classic problem areas such as pornography, hate crime, and depictions of violence have even declined slightly. In contrast, CSAM has become by far the largest category, accounting for 58 percent of justified cases. The FSM also points to the rise of virtual abuse material, which, with 2,332 registered cases, already makes up around 19 percent of CSAM reports.
Alongside classic manga-style drawings (hentai), this category increasingly includes content generated with artificial intelligence. According to the organization, this development presents entirely new challenges for youth protection, as the lines between reality and fiction blur. The legal assessment in Germany, however, remains clear: even virtual depictions created with AI are impermissible and punishable.
Hosting Locations Complicate Deletion
In parallel with these content shifts, the FSM observes a significant relocation of hosting sites. While in 2024 more than half of the reported illegal content was stored on German servers, this share dropped to only about 23 percent last year. Servers in the USA, Malaysia, and Great Britain now top the list.
This internationalization makes legal enforcement through deletion requests more difficult, as such content is often not prohibited in countries like the USA or Canada. Cooperation with international partners, such as the hotline network INHOPE, remains essential but is reaching its limits.
Domestically, however, the complaint office proves highly effective. Despite a slight increase in the average processing time to two days, 99.68 percent of illegal content on German servers was removed within a week. This rapid remedy is a core pillar of the German youth protection system, which is primarily based on the Interstate Treaty on Youth Media Protection (JMStV). The FSM, with its team of lawyers, acts as a link between concerned users, hosts, and law enforcement agencies.
Decrease in Hate Crime
A positive trend can be observed among FSM member companies, which include giants like Amazon, Microsoft, Google (YouTube), Meta (Facebook/Instagram), and TikTok. The number of justified complaints against platforms that have voluntarily committed to self-regulation dropped sharply, from 4,849 cases in the previous year to only 430. In all of these cases, immediate remedy was achieved.
This indicates that self-commitment and the implementation of technical protective measures are increasingly effective. In other areas, such as hate crime and incitement to hatred, case numbers have also noticeably decreased. Nevertheless, the overall picture remains challenging: the combination of rising reporting numbers for abuse material and technological leaps through AI requires a continuous and sustainable funding approach, according to the FSM, to continue effectively protecting children and adolescents. The parallel complaint office of the eco association recently reported a record number of justified complaints.
(mma)