Digital Services Act: Trusted Flagger and freedom of expression
The DSA is currently the subject of heated debate, with accusations that it restricts the diversity of opinion. A look at what is actually true.
The Digital Services Act is currently causing quite a stir, partly due to misunderstandings and misinterpretations of the legal text or the accompanying German legislation. Some of these claims are being spread deliberately in order to brand the law as a censorship measure, although in some cases the opposite is true.
Is freedom of expression being restricted?
Yes, but not through the DSA. Freedom of opinion and freedom of expression are valuable assets that enjoy particular protection under both the German Basic Law and the European Charter of Fundamental Rights, owing to their importance for democratic discourse. They are not unlimited, however: they are restricted, for example, by rules against insults, defamation or threats of violence against others. They can also be limited by other rights, such as copyright, for instance when sharepics are created using other people's material.
Above all, freedom of expression is a fundamental right vis-à-vis the state. However, since most online content does not originate from government agencies, what matters here is the indirect effect of this fundamental right on private actors. The Digital Services Act binds private operators in two ways. First, it does not require pre-moderation of user content; instead, providers are exempted from the content liability that would otherwise apply to them, on the condition that they offer ways for allegedly illegal content to be brought to their attention ("notice-and-action"). The second obligation in the DSA: simply deleting is not permitted. Every moderation decision must also take appropriate account of freedom of expression. This means, for example, that content which would otherwise be punishable but is permitted in exceptional cases, such as satire, must still be allowed. Arbitrary grounds for deletion in the terms and conditions are likewise inadmissible under the DSA (Article 14).
The law follows the principle that with size comes responsibility: platforms with more than 45 million users in the EU are subject to more rules than smaller sites.
What does the DSA require?
Illegal content is also prohibited on hosting services. The DSA requires all hosting services with user-generated content to deal with content that is reported to them as suspected of being illegal under the applicable laws. To this end, they must provide easily accessible reporting channels.
Providers are not obliged to delete or block content solely on the basis of a report. They must decide for themselves whether the content actually violates the law. As a rule, they also check whether it breaches the provider's terms of use. Content that is not illegal may nevertheless be blocked or removed by a platform, forum operator or hoster, with reference to its own terms and conditions, i.e. the contract with the user.
Some providers are suspected of handling these decisions in whichever way costs them the least: if they assume that a user will not push back anyway, they block quickly. However, blocking content that is actually legal may then constitute a breach of the contract between provider and user.
What rights do users have?
Every user enters into a contractual relationship with the respective provider. The DSA expressly improves the legal position of users: arbitrary blocking is explicitly prohibited. Surprising clauses in terms and conditions, which might otherwise serve as a basis for blocking, have always been prohibited under consumer law; this would include absurd clauses such as a general ban on posting cat pictures or on expressing opinions. Under the DSA, users of large platforms even have an express right to lodge a complaint against a block or deletion, and they can demand a justification.
For disputes, the DSA also provides for a kind of arbitration board. These out-of-court dispute settlement bodies must be recognized by the Federal Network Agency, and one already exists in Germany, although it currently works only with selected platforms. Its decisions are not legally binding and do not deprive the person concerned of the option of going to court anyway. Allegedly unjustified deletions can also be challenged in civil court.
In the case of the particularly large online platforms, the so-called VLOPs, the EU Commission also treats potential overblocking as a "systemic risk", just as it does the ignoring of illegal content.
In addition, the EU has introduced another hurdle that critics have hardly noticed: national criminal provisions must be compatible with EU law. In other words, if Germany's "traffic light" coalition were to decide tomorrow to add the offense of altering traffic light signals to the criminal code, that provision would have to be measured against European law in substance. Providers would be allowed to ignore a criminal provision that fails this test, along with attempts to enforce it, but would have to be prepared for legal disputes.
What happens to reports from trusted flaggers?
Trusted flaggers, or "trustworthy reporting entities" in legalese, are organizations considered particularly qualified that have previously undergone a corresponding review by the national DSA supervisory authorities, such as the Federal Network Agency in Germany. They must demonstrate that they check content thoroughly before passing reports on to platform operators. If they attract negative attention, their status can be revoked.
Trusted flaggers fulfill an important function for users who do not know how the process works, but also from the platforms' perspective: they draft reports more professionally, because they know how the process runs, what the platforms need in order to make a decision, and they can establish in advance whether a report would be futile.
When a "trusted flagger" submits a report, the platform provider is obliged to review it as quickly as possible; the law also requires such reports to be given priority over other reports. Moreover, trusted flaggers, such as the first body recognized under the DSA in Germany, are not a new invention but long-standing practice, for example in youth protection on YouTube.
In fact, trusted flaggers are just one of many mechanisms for reporting potentially illegal content. Other legal frameworks are applied far more intensively, such as the youth media protection powers of the state media authorities or the so-called Terrorist Content Online Regulation. Under this regulation, law enforcement authorities (in Germany, the BKA) send reports or specific removal orders with a short processing deadline to the operators.
Is this denunciation?
In fact, with most reports only one thing happens: the platform operator makes a moderation decision. Article 18 of the DSA expressly stipulates when investigating authorities must be notified: on suspicion of a "criminal offense that poses a threat to the life or safety of a person or persons." All other criminal content only reaches the authorities if users, operators, trusted flaggers or other parties bring it to their attention. In other words: prohibited hate speech on TikTok, YouTube, X or Instagram does not mean that the operator has to report it to the BKA along with a user ID or other data, even if it considers the report justified.
(nen)