Platform supervision: Over 800 complaints and 4 proceedings for DSA violations
Since mid-2024, the Federal Network Agency has acted as platform regulator for Germany under the DSA. It now provides comprehensive insights into its demanding work.
Many complaints, few proceedings – and no fines yet. This is how the first activity report of the Digital Services Coordinator (DSC) can be summarized. Since the German Digital Services Act (Digitale-Dienste-Gesetz, DDG) came into force in mid-May 2024, this coordination office at the Federal Network Agency has acted as the central platform regulator for Germany. Together with other competent authorities, such as the state media authorities and the Federal Commissioner for Data Protection, it is responsible for ensuring compliance with the EU's Digital Services Act (DSA) in Germany.
In 2024, the coordination body recorded 884 complaints via the DSC complaints portal, according to the report now published. The body, which has had its share of teething problems, has been headed since July by Johannes Heidelberger, a digitalization expert at the Federal Network Agency. In 824 cases, the submissions fell under Article 53 DSA, i.e. they indicated a possible violation of the platform law.
Sixty submissions received via the complaints portal were unrelated to the DSA. The coordinator is not responsible for individual cases such as foul language or the sale of electrical appliances without a CE mark; it is only supposed to intervene if such cases occur frequently and point to a possible "systemic failure".
No administrative offense proceedings yet
The platform supervisory authority forwarded two of the complaints under Article 53 DSA to the state media authorities. In addition, it sent a total of 87 relevant submissions to coordinators in other EU Member States. 83 of these went to Ireland, where most of the major US internet companies have their EU headquarters. The German DSC sent one complaint to the EU Commission by email.
The German coordinator in turn received seven complaints from DSCs in other EU member states during the reporting period. Two each came from Ireland and the Netherlands and one each from Finland, Austria and Slovakia.
By the end of 2024, the DSC had initiated a total of four administrative proceedings against service providers. Three of these concerned possible shortcomings in the notice-and-action mechanism (Article 16 DSA), the statement of reasons for measures taken against users (Article 17 DSA) and the internal complaints management system (Article 20 DSA). The authority closed one of these proceedings last year because the service provider in question quickly rectified the shortcomings; the other two are still ongoing.
The fourth proceeding concerns a service provider outside the EU that failed to appoint a legal representative in the EU (Article 13 DSA). No administrative offense proceedings are pending, which means the coordinator has not yet imposed any fines, on big tech companies or anyone else. Furthermore, according to its own information, the DSC actively participated in EU Commission proceedings against AliExpress, Temu, TikTok and X in 2024.
Dedicated reporting portal for courts and administrative authorities
The regulatory body also received 336 complaints on digital topics, which were answered or forwarded internally. Only some of these were DSA cases; in those, the DSC referred the submitters to the complaints portal. The majority of these complaints concerned breaches of the imprint obligation or of data protection rules, as well as fraudulent websites, business models or service providers. There were also reports of subscription traps, problems with processing or canceling online purchases, misuse of telephone numbers, and disputes with access providers or when changing host providers.
Articles 9 and 10 of the DSA regulate what providers of intermediary services must do if they receive an order from national judicial or administrative authorities. This can be an administrative act or a court order to take action against illegal content. As soon as a provider receives such an order, it must inform the issuing or other designated authority whether and when it has implemented it.
The DSC's portal for transmitting such orders has only been available to judicial and administrative authorities since November. By the end of 2024, 53 orders had been received via it. Most came from the state media authorities and targeted illegal content such as pornography, child sexual abuse material, or hate and incitement as defined by the Interstate Treaty on the Protection of Minors in the Media (JMStV). One order was based on the Bavarian Police Duties Act. The authors of the requests apparently did not always examine the content very closely; the report states: "Some of the information consisted of generalized text modules."
Dispute over trusted flaggers
There are also non-binding deletion and information requests ("referrals"). Last year, the state media authorities sent out 4,225 such notices, which have no legal effect and initially have nothing to do with the DSA. However, if the providers concerned do not comply, the media watchdogs can initiate administrative proceedings and issue a DSA order in that context.
A year ago, the DSC certified User Rights as the first out-of-court dispute resolution body. In October, it recognized the REspect! reporting office at the Baden-Württemberg Youth Foundation as a trusted flagger. In June, i.e. after the reporting period, the coordinator also recognized the Bundesverband Onlinehandel, the organization HateAid and the Bundesverband der Verbraucherzentralen (vzbv) as trusted flaggers.
The role of these flaggers is controversial, as it raises concerns about freedom of expression and state control; occasionally there are accusations of censorship and a lack of independence. The regulatory authority, however, emphasizes that independence is a prerequisite for acting as a trusted flagger. Trusted flaggers are also "always organizations that have special knowledge and expertise in identifying and reporting illegal content". They are meant to improve the efficiency and speed of removing illegal content. Furthermore, they cannot issue orders.
Personnel and material costs
According to the report, the personnel situation is not yet entirely rosy: the draft 2025 budget provides a total of 47.8 posts for the DSC's tasks, but for ten of these the financing of individual personnel costs and pro rata material costs is not included. These posts could therefore not yet be filled, although this is expected to change with the 2026 budget.
The annual material costs of 1.7 million euros reported for the DSC in the DDG draft were provided by the Federal Network Agency via its 2024 budget. The funds were budgeted for the operation and further development of necessary IT procedures, the use of software and licenses, research, training, education, networking and the organization of conferences.
(mma)