3 years of DSA: researchers launch "mass request" to Facebook, X & Co.
The Digital Services Act (DSA) turns three. Civil society organisations are now demanding a daily overview of viral posts from large platforms.
(Image: Cristian Storto/Shutterstock.com)
4 October 2025 marked the third anniversary of the EU Council's formal approval of the Digital Services Act (DSA). The civil society organisation AlgorithmWatch regards the law, which imposes stricter rules on online platforms, as a groundbreaking step, but considers its implementation in practice inadequate. To mark the anniversary, the NGO has therefore joined forces with the Mozilla Foundation and the DSA40 Data Access Collaboratory to launch a coordinated "mass request" to the operators of very large online platforms, including Facebook, Instagram, X, TikTok, YouTube, and LinkedIn.
According to a statement from the initiators, the aim of the campaign is to obtain a daily overview of the most viral posts in each EU member state. This data is meant to enable civil society organisations to quickly identify which content – with a focus on disinformation and harmful narratives – is being amplified most strongly by platform algorithms and therefore potentially has the greatest impact on public discourse.
The legal basis for these requests lies in the DSA itself: the regulation obliges very large platforms to make publicly accessible data available for research "without undue delay". Despite this clear wording, there is a gap between aspiration and reality, says Oliver Marsh, who heads technology research at AlgorithmWatch. The platform law entered into force in November 2022, although transitional rules initially delayed its full applicability.
Requesters emphasise willingness to sue
The NGOs involved point to three main problems. First, many large tech companies, above all X, have regularly refused to provide the requested information. This affected, for example, a project on non-consensual sexualisation tools (NSTs), also known as "nudifying apps": software that uses AI to create realistic, sexualised or revealing images of a person, usually without their consent.
Second, other operators such as Meta (Facebook and Instagram) and TikTok have in the past often provided only low-quality data or put up enormous hurdles. Third, the initiators consider the platforms' regular evaluation reports, which the DSA likewise prescribes and which are actually meant to identify systemic risks, to be useless.
Another point of criticism: recent developments such as AI summaries in search engines like Google are not yet covered by the DSA. These overviews, displayed directly above the result lists, draw traffic away from news sources and thus jeopardise the business model of quality journalism.
The organisations emphasise that they are prepared to challenge any rejection of their data requests by the tech companies in court without delay. In general, they argue, it matters that the DSA exists as an instrument of democratic control; the regulation nevertheless remains a work in progress. Marsh hopes that its potential can be put to better use by its next anniversary. Effective application of the DSA is crucial, he argues, in view of the growing risks posed by opaque algorithms and the alignment of tech CEOs with anti-democratic forces in Europe and the USA.
(uma)