Right 50 million times: how the DSA curbs platform arbitrariness
Almost every third complaint in the EU about deletions and account bans was successful. How the DSA has shaped freedom of expression and platform oversight after two years.
(Image: Ivan Marc / Shutterstock.com)
In the digital world, an algorithm often decides in milliseconds whether an opinion remains visible or a digital identity continues to exist. But the image of the powerless user standing before closed support gates has changed. Two years after the Digital Services Act (DSA) took effect, the EU Commission presents an impressive tally: in almost 50 million cases, online platforms had to correct their original decisions and reinstate content or accounts. The Brussels authority sees this as proof that the era of unchecked platform arbitrariness in Europe is coming to an end.
Users are asserting their rights on this scale thanks to the new, legally mandated complaint procedures. Previously, objections to a ban often went unheard. Today, platforms must explain every moderation decision transparently and offer an internal review process.
Of the 165 million cases in which users took this route, platforms reversed their decision in around 30 percent, the Commission reports. This corrects a stark imbalance: a significant share of what automated systems flag as violations turns out, on closer inspection, to be legitimate content. Remarkably, in the first half of 2025 alone, almost all interventions (99 percent) were based not on illegal content or DSA requirements, but on alleged violations of the corporations' often vaguely worded house rules.
Arbitration bodies as a corrective
For particularly persistent cases, out-of-court dispute settlement bodies have proven to be a turning point. In the first half of 2025 alone, over 1,800 conflicts with giants such as TikTok, Instagram, and Facebook were heard there. That users prevailed in more than half of the concluded proceedings (52 percent) shows, according to the Commission, how necessary these independent bodies are. Instead of years of expensive court proceedings, users obtain quick and often free redress for restrictions on their digital accounts.
These mediation bodies function as a much-needed counterweight to the corporations' automated systems, the Commission notes approvingly: those systems now make half of all moderation decisions without human oversight.
The DSA intervenes deeply in the architecture of business models and prioritizes the protection of vulnerable groups. The Commission celebrates the strict ban on targeted advertising to minors, in effect throughout the EU since 2024, as one of its most significant achievements. Young people are no longer the target of data-hungry algorithms that analyze their behavior to deliver tailor-made purchase incentives. This measure is considered a building block for the digital well-being of a generation that was previously far more exposed to the commercial pressure of the platform economy.
Safety on marketplaces and protection against illegal goods
In parallel, the legislation has tamed the Wild West of online retail. Online marketplaces are now obliged to actively combat the distribution of illegal or dangerous goods, and the traceability of traders has been improved, making it harder for dubious sellers to operate under the radar. Should a user nevertheless have purchased an illegal product, the platform operator is obliged to inform them as quickly as possible and offer concrete options for redress. Responsibility thus shifts to where the profits are made: to the operators of the infrastructure.
Another pillar of the DSA is data access for research. For the first time, researchers and civil society gain insight into the internal processes and moderation practices of Big Tech companies such as Elon Musk's X. This transparency is not an end in itself: it is considered a prerequisite for holding platforms effectively accountable at all. With access to data that operators previously guarded as trade secrets, independent experts can examine how algorithms shape public discourse and where systematic errors occur.
Transparency: Three billion justifications
This verifiability is technically underpinned by the central DSA transparency database. Hosting providers must justify every single moderation measure and feed it into the system. With over 3.6 billion statements of reasons stored to date, a huge archive has emerged that documents the practices of 266 active platforms almost in real time. Analysts can track precisely which violations – such as fraud or the sale of non-compliant products – are sanctioned most frequently; most measures end in deletion or blocked access. The sheer volume of data also makes it possible, for example, to uncover sources of error in AI moderation.
The DSA, often dubbed a "basic law for platforms", has thus set in motion a development that reaches far beyond Europe. In the Commission's reading, it proves that democratic control of the digital space is possible without stifling innovation. The Trump administration accuses the law of being an instrument of state censorship. The numbers suggest the opposite: the DSA appears to be an effective remedy against private censorship by tech companies.
(mki)