Child abuse: Internet hotline warns of dangerous dissemination scheme

In 2024, the FSM Complaints Office received a high number of reports about "ICAP sites", which sometimes led to the unintentional dissemination of abusive material.

(Image: Muhrfotografi/Shutterstock.com)

In the current annual statistics of its Internet Complaints Office, the Voluntary Self-Regulation Body for Multimedia Service Providers (FSM) draws attention to a dangerous pyramid scheme for the distribution of child sexual abuse material. According to the report, which is available to heise online and is due to be published on Tuesday, a conspicuously high number of reports about "ICAP sites" were received last year. Those consciously or indirectly involved in this "Invite Child Abuse Pyramid" distributed personalized, outwardly inconspicuous invitation links. Whenever users clicked on them, the site operator received points that unlocked further abuse material, which tended to be even more extreme.

According to the report, such links were also posted under .de domains in the comment sections of social networks "in order to gain as many credits as possible", writes the FSM. Because an embedded video showing clips of the most severe acts of abuse in quick succession starts playing as soon as an ICAP page is accessed, internet users are "often unintentionally confronted with this content". The self-regulatory body does, however, give a partial all-clear: by the end of 2024, many of these services were no longer online.

According to experts, the pyramid scheme poses a threat to children and the general public. The commercial sites advertised through it often contain depictions of penetration, cruelty to animals and sadistic acts. The awarding of points tends to lead to an exponential spread of the harmful content. Because the links are distributed indiscriminately on social media, in chats and even in seemingly harmless apps, there is a high risk that bystanders, including children, will be exposed to the material. For criminals, the scheme is a lucrative business model. ICAP sites are considered difficult to combat, as they can quickly switch from one hosting provider to the next and reappear under different domains.

In 2024, the FSM hotline received a total of 25,536 complaints about illegal online content or content harmful to minors. This represents a decrease compared to the previous year (2023: 30,573). Nevertheless, this was the second-highest number of reports since 1997. In 68% of cases (17,395 submissions), the three-person legal review team found violations of German youth media protection laws, which means that the complaints were well-founded.

General pornography that was freely accessible to children and young people without an age verification system (AVS) was once again the most common type of content reported. With 8,529 cases, such content accounted for 49% of the substantiated complaints (2023: 39%). The second-largest share, 46% (8,077 cases), consisted of depictions of sexual abuse of minors. The number of these complaints declined compared to the previous year: in 2023, 12,918 such cases made up 57% of the substantiated complaints.


The FSM attributes this in particular to the fact that fewer mass reports came in from foreign partner organizations in 2024, for example via the Inhope hotline network. According to the statistics, AI-generated depictions of child sexual abuse accounted for only a fraction of reports: "virtual child pornography" made up just 1.7 percent of all abuse reports. However, users did complain about AI image generators that could be used to create such content. In contrast, the Internet Watch Foundation recently reported a sharp increase in such AI-generated images.

The FSM immediately forwards information on abusive material stored on German servers to the Federal Criminal Police Office and also contacts the hosting provider in a notice-and-takedown procedure. In 2024, an average of 1.5 days (2023: 1.2 days) passed between notification of the complaints office and removal of the content. The removal rate in this area is 100%. In the case of abusive content hosted abroad, the hotline informs the hosting provider and also forwards the report to Inhope. The removal rate four weeks after the initial report was 93% (2023: 87%).

The reviewers emphasized in particular that "reports of videos with animal cruelty for entertainment purposes" had increased. These videos often show the killing of a particular primate species, especially its young, in South-East Asia; they have been classified as "seriously harmful to development". Extremist content on Facebook, X, TikTok & Co. "was often coded", the FSM further notes, hidden behind emojis and other images, for example.

Compared to the previous year, the number of substantiated complaints in the area of hate crime rose from 120 to 222 cases. This corresponds to just over one percent of all substantiated complaints. Once again, the majority of these complaints concerned depictions of symbols of unconstitutional right-wing extremist or Islamist organizations. This also included AI-generated content, mainly in the context of the Gaza war.

(olb)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.