Meta and X in the EU: what the DSA regulates

The Digital Services Act (DSA) does not define what is illegal. It requires mechanisms for dealing with illegal content. Fact-checkers are not mandatory for that.


While Meta is abolishing fact-checking in the USA, this does not apply to the EU for the time being. In Germany, fact-checkers from Correctiv and dpa continue to work for the company. Asked by heise online whether changes are also planned here, Meta said that there are currently no such plans. In principle, however, Meta could also "get rid of" the checkers here, as Zuckerberg puts it. It would then have to find other ways to ensure that it complies with the DSA.

Elon Musk is already at loggerheads with the EU to some extent: the Commission is threatening to fine X. However, this is less about freedom of expression and moderation and more about breaches of transparency obligations, data access for researchers, and dark patterns. The DSA regulates these issues as well.

The law does not define which content is illegal; that is set out in other regulations at national or EU level. The DSA instead determines how such illegal content must be dealt with once it comes to light: it contains an EU-wide framework for the detection, labeling and removal of illegal content, as well as risk assessment obligations.


This also means that Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) such as Meta and X must identify risks, and then systematically combat illegal content, disinformation and risks to minors. How providers do this is up to them: fact-checkers are just one option, and automated filters, for example, can also be sufficient to minimize risks within the meaning of the DSA. To demonstrate this, providers must compile corresponding reports and submit them to the EU. In principle, it would therefore also be possible for Meta to end its cooperation with the fact-checkers in Germany.

Enforcement of the DSA is monitored by the EU Commission, but there are also national coordinators – in Germany, this is the Federal Network Agency. Incidentally, proceedings are already underway against almost all major platforms, from LinkedIn and Google through Apple and Alibaba to Amazon and Booking. So far, however, these have been limited to requests for information and exchanges. Before penalties are imposed, providers can defend themselves and make changes to their services.

The obligations of large platform providers also include ensuring that users have fair and transparent complaint channels. Everyone has the right to formally appeal a decision, for example if their account has been blocked. Platforms must not block accounts arbitrarily.

Nor may content, i.e. posts, simply be deleted. The DSA requires defined criteria for the deletion of posts, which must be communicated transparently. Shadow banning is also only permitted if the affected user is informed and can lodge an objection. People whose posts receive less visibility, i.e. are downranked by the algorithms, must therefore be told why this is happening to them.

Incidentally, it is not fact-checkers who are responsible for blocked accounts or downgraded posts and people, but other mechanisms. These include reports from other users and automated filter systems. Community Notes are one form of reporting on X: users can add opinions, ratings or context to other people's posts. Meta is also planning to introduce such a function.

It has also become known that the internal rules for automated moderation have been changed in the USA. Numerous statements that were previously prohibited or undesirable are now permitted, affecting topics such as LGBTQA+ people, Jews and minorities. In the USA, freedom of speech is a broader concept than in Germany, for example, where it more quickly runs up against offenses such as insult. Within the legal framework, the platform operator determines what additional content is permitted or prohibited.

The DSA also contains a number of provisions on the integrity of elections. Here, too, large platforms must adapt their recommendation systems, label political advertising, and assess and counter systemic risks.

(emw)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.