Digital Services Act: The Trusted Flagger system still has issues

Under the Digital Services Act, Trusted Flaggers are supposed to help platforms clean up. But as new findings show, there is still considerable room for improvement.


(Image: Cristian Storto/Shutterstock.com)


How should illegal advertising, harmful products, fake shops, and other prohibited content be dealt with? The dispute over handling illegal content is one of the major ongoing debates of the digital age. With the triumph of the platform economy, a few players have taken on a special role: the Temus, Amazons, YouTubes, and TikToks of this world aggregate large parts of the content, services, and product offerings on the web, even though much of it actually comes from third parties. So that they are not held liable for that content, the so-called liability privilege has existed since the 1990s: as long as operators adhere to the applicable rules and, where necessary, check upon notification whether content might be illegal, they are not legally responsible for it. The European Digital Services Act (DSA) gave this system a major legal update in 2024.

The providers buy themselves out of liability with a promise: if someone sees something, we'll take care of it. “Notice and action” is the principle under which anyone can report possible violations. In practice, however, there is still considerable room for improvement: the reporting mechanisms and the providers' responses to them are already the subject of several EU Commission proceedings against the particularly large platforms under the DSA. And an investigation by the Federation of German Consumer Organisations (vzbv) recently showed that the reporting channels for consumers on the largest platforms and marketplaces by no means always comply with the DSA rules.

During the legislative process, some platform operators also pushed for certain minimum standards to apply when content is removed. Providers fear little more than being flooded with unjustified requests. Google's subsidiary YouTube therefore introduced the concept of so-called Trusted Flaggers many years ago: trusted informants who could report rights violations, youth protection violations, and similar issues. The EU legislator took up exactly this idea in the DSA. Since it came into force, however, critics have decried the construct as a kind of opinion police. Trusted Flaggers, who in Germany are certified by the Digital Services Coordinator at the Federal Network Agency, have one thing above all: a privileged reporting channel to the providers. Article 22 of the DSA stipulates that providers must process their reports with priority and without undue delay.


And there are good reasons for this: where expert professionals report, special urgency is often required to prevent greater damage. In other EU states, there are Trusted Flaggers who, for example, detect financial fraud, identify child abuse material, or search for online piracy. In Germany, only four Trusted Flaggers have been appointed so far. While two HateAid activists faced US sanctions, and the first Trusted Flagger, the Baden-Württemberg organisation “REspect!”, was subjected to hate, ridicule, and defamation, the other two organisations are less well known and less controversial in this role. The Federal Association of Online Trade, for example, is certified as an informant for the protection of intellectual property and for reporting fake product reviews and unsafe products. The Federation of German Consumer Organisations operates in a similar field: since June 2025, it has been recognised as a trusted informant for very specific areas of consumer protection.

For consumer advocates, this is a new instrument – until now, their only options for enforcing consumer protection were the arduous legal route or admonitory words. “The function as a Trusted Flagger complements the vzbv's options with another instrument – if it works, it also saves providers some effort,” says Lina Ehrig, head of the Digital and Media team, who also sits on the advisory board of the German supervisory authority at the Federal Network Agency. But it does not work that way everywhere yet.

The consumer advocates have since encountered various problems that they themselves did not expect. One example: a provider claiming to be an animal sanctuary on the verge of bankruptcy offered handmade cow shoes for 25 euros – shoes available on Temu for 3 euros. According to the consumer advocates, Facebook took seven days to react. How many cow shoes the tear-jerking animal welfare fraudsters had sold by then is hard to determine. And well-meaning users are left bearing the damage.

Some providers make it particularly difficult for those reporting. Advertisements for fake offers, in particular, are hard to report effectively, the consumer advocates complain: the ads are created automatically, yet Trusted Flaggers are expected to report them manually and at considerable effort – via an online form limited to ten entries, for example. From a consumer protection perspective, this is outrageous, says Lina Ehrig: “We expect marketplaces to prevent the appearance of essentially identical content if it has already been classified as inadmissible.” Companies should have an interest in ensuring their offerings are not misused, the consumer advocates argue.

But their reports sometimes go unheard: while Instagram, Amazon, Facebook, and eBay generally reacted at least to Trusted Flagger reports, none of the three reports submitted to AliExpress in 2025 led to a deletion – even though one of the reported products, a children's toy, is outright banned from sale in the EU. The consumer advocates consider such behavior an obvious violation of the DSA.

One of the consumer advocates' causes for concern with this licence to report is, of all companies, the very one that originally invented the Trusted Flagger system. “We have filed a complaint with the Digital Services Coordinator about Google because, in our view, this is not DSA-compliant behavior,” says Dennis Romberg, who is responsible for market surveillance in the digital sector. His team has already filed more reports in 2026 than in the whole previous year, but for the first time has also had to forward cases to legal enforcement for further processing.

In other words: despite the new instrument, the old instrument of warning letters will probably still be needed to ensure the removal of illegal content – with the corresponding effort and delays. An inquiry from heise online asking Google how it assesses the functioning of the Trusted Flagger privilege remained unanswered as of Thursday evening. Temu and TikTok likewise had not commented on their experiences with the trusted informants by then.

New guidelines for the Trusted Flagger system are currently being discussed in Brussels. “The EU-level guidelines would be a good place to issue clear requirements for providers on how they must receive and process reports,” says consumer advocate Lina Ehrig. At present, each provider seems to have its own ideas about how Trusted Flaggers should submit their reports. In other legal areas, such as the Product Safety Regulation, providers are required to offer legally defined online interfaces – specifying this more precisely than the DSA's legal text does is likely to matter greatly for practical effectiveness.

(mho)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.