re:publica: DSA "no magic wand" – supervision gets up to speed
The Digital Services Act is beginning to have a noticeable impact. In Berlin, the EU official responsible has now given a first interim assessment.
The Digital Services Act is slowly taking effect. The EU regulation, which has applied to all providers in the EU since mid-February, sets out the basic rules for how platform operators, hosting services and access providers must handle content. It also includes rules for online marketplaces. But how is enforcement going after the first few months?
The first few months of the DSA were marked by several events at once – barely had the rules for the largest providers come into force when a storm of hate postings and violent content was already raging across some platforms in the context of Hamas' attack on Israel and Israel's subsequent military response in the Gaza Strip.
No law for individual content
Prabhat Agarwal, head of the responsible unit at the Directorate-General Connect, drew a positive interim conclusion and explained at re:publica how the DSA works – and what it is not: a law for individual pieces of content. Instead, the DSA regulates at a more abstract level how providers deal with content that users distribute via their systems.
A basic distinction is made between normal providers and the largest providers, those with more than 45 million users per month in the EU. The latter must comply with special obligations – for example regarding content moderation or the handling of so-called systemic risks such as attempts to manipulate elections. In the run-up to the European elections, this is currently a contentious issue between providers and supervisory authorities.
For the largest providers, the supervisory authority is the EU Commission itself. Its intervention ensured, for example, that Meta put the CrowdTangle analysis tool back online on Monday for the European elections, even though it had already been shut down.
German supervisory authority now operational
After some delay, the German supervisory authority has also been operational for a few days now: following the entry into force of the German Digital Services Act, an independent body was set up for this purpose at the Federal Network Agency. Although the national supervisory authorities are primarily responsible for the smaller providers, they cooperate with the EU Commission and the other competent authorities.
In order for enforcement to work at all, companies must provide information and respond to inquiries. As a rule, this works, the head of the DSA unit explained in Berlin. "Even the porn platforms have generally sent us reasonable documents," Agarwal reported. "What we see is that platforms often know what risks there are – and also that their measures have not been sufficient so far."
Far-reaching supervisory powers
Under the DSA, the supervisory authorities' powers are very far-reaching – they extend to internal databases, algorithms and correspondence, as well as interviews with witnesses, including service providers and other potential respondents. The DSA stipulates that access must, if necessary, even be ensured through coercive measures – in an extreme case, the Irish police could seize servers by way of administrative assistance and hand them over to the DSA supervisory authorities, many large platforms having their EU headquarters in Ireland.
In contrast to other regulated sectors, such as banking, the services subject to the DSA are very diverse – which is why the so-called transparency reports also differ greatly. Besides social media platforms, the DSA covers entirely different offerings: online marketplaces such as Amazon, Zalando and AliExpress, for example, while Shein and Temu have since also exceeded the critical user threshold and count as major providers within the meaning of the DSA. New guidelines are intended to ensure greater uniformity.
Blocking only as a last resort
At present, the major social media platforms in particular are concerned primarily with so-called systemic risks, for example around illegal user content. The reporting procedures are very similar to those of the old German Network Enforcement Act, but without the 24-hour processing deadline, Agarwal explained. The DSA also serves companies as an internal argument when management has so far seen no need to listen to the teams responsible. According to Agarwal, those teams are often aware of the problems; the threat of penalties under the DSA raises their priority. Nevertheless, the DSA is not a magic wand – not every problem can be solved with it, or in one fell swoop.
As a last resort, if all other means have failed, the DSA does expressly provide for a judicial blocking order. Blocking an entire platform, as is being discussed in the USA with regard to TikTok, would be a very far-reaching measure – though the US debate is primarily about security concerns, which are not at the heart of the DSA.
X has not yet convinced supervisors
"At the end of a procedure, there is usually a fine", explained Agarwal, using the example of the EU Commission's action against X, formerly Twitter. The company is the focus of regulators due to problems with content moderation of illegal content, its approach to so-called community notes, data access for researchers and ad transparency. Despite complaints from the Commission, X had further reduced the number of content moderators with national language skills in the EU before the European elections.
"We are all convinced that things are not working well," said Agarwal, adding that X had also recognized this. Elon Musk's X could now commit itself to measures to address the concerns. However, if these are not complied with, massive fines could follow in the next step. The proceedings are currently proceeding constructively, but it is clear that there are major problems with the systemic risks.
TikTok Lite proceedings: an example of the ability to act quickly
As an example of how the DSA can work, Agarwal cited the EU Commission's intervention, as supervisory authority, over TikTok Lite. A few days after a request from the Commission, TikTok suspended the reward function of the controversial app, which had been introduced only in Spain and France and was suspected of having a high addiction potential. The DSA also brings new transparency for users, according to Agarwal – for example for those affected by content blocks. Here too, the providers still have some work to do: in such cases, they must give users information about the block and about possible appeals against it.
Proceedings are currently underway against TikTok, AliExpress, X and Meta. Moderator Markus Beckedahl asked the head of the DSA enforcement unit how long it would take to conclude them: "I don't think we're talking about five years," said Agarwal, alluding to the long duration of GDPR proceedings, for example. "Thierry Breton is a very, very impatient person." Nevertheless, the necessary procedural steps must be observed.
The role of researchers and civil society still unclear
Incidentally, the companies themselves will have to finance the supervision: as with banking supervision, they pay fees based on their size and turnover. Like the EU Commission, the supervisory authorities can also bring in external experts for analyses if necessary. According to Agarwal, another key element of enforcement is that researchers can register with their national regulatory authority to gain access to companies' data and conduct their own research on internal company data. In Germany, researchers will be able to apply to the Federal Network Agency in a few weeks.
Another open question is how civil society, which is actually meant to play a strong role in the DSA, could be strengthened. In the area of porn platforms, for example, there are only a few active organizations, the Commission official explained. Among other things, the DSA provides for civil society actors to act as so-called "trusted flaggers" – trustworthy reporting entities whose notices the platforms must handle with priority. However, there is still no funding model for this, and the DSA contains no specifications, so actors such as the German NGO HateAid see a number of problems in fulfilling this role.
(olb)