The Russmedia ruling of the ECJ: Towards a “Cleannet”?

A change to the liability privilege for online providers will lead to a “cleaner” but also more rigid and more closely monitored internet, says Joerg Heidrich.

Justitia in front of a bookshelf. (Image: nepool/Shutterstock.com)

By Joerg Heidrich

The Russmedia ruling of the ECJ (C-492/23) of December 2, 2025, has far-reaching consequences for online platforms in Europe. What initially appears to be a technical decision on data protection responsibility could fundamentally change the liability privilege for online providers in the area of user-generated content. It may even herald the end of anonymous communication on the net.

At the heart of the legal dispute is a person whose life was massively affected by a malicious advertisement. Back in 2018, an unknown third party published an advertisement on www.publi24.ro, an online marketplace where users can offer goods and services. The advertisement falsely claimed that the person concerned was offering sexual services and used real photographs as well as their private telephone number. Legally, the published information constitutes so-called special categories of personal data under the GDPR, which are particularly sensitive and therefore enjoy particular protection.

The provider reacted immediately to the request for deletion and took the content offline within an hour. The person concerned was not satisfied with mere deletion, however, and took the case to the Romanian courts. The court of first instance followed the plaintiff's argument and ordered the provider to pay 7,000 euros in non-material damages. On appeal, the tables turned: the court overturned the judgment and absolved the operator of all liability. The case eventually landed before the Cluj Court of Appeal. This court was confronted with a conflict of norms that matters for the entire European digital economy: how does the liability privilege for providers of third-party content, now regulated in the Digital Services Act (DSA), relate to the strict responsibility for data processing under the GDPR? To clarify this question, the court suspended the proceedings and referred several questions to the ECJ for a preliminary ruling.


With the DSA and, before it, the E-Commerce Directive, the European legislator codified, among other things, the prohibition of general monitoring obligations (Art. 8 DSA). The political message was clear: Europe wants secure platforms, but not censorship machines. This, however, runs up against a long-overlooked provision of the GDPR (Article 2(4)), which states that the GDPR does not affect the application of the E-Commerce Directive, in particular its rules on the liability of intermediaries. In its decision, the ECJ resolves this conflict of values very one-sidedly in favor of the GDPR, and it does so without regard for the collateral damage.

The core legal statement is that it is not only the person who posted the advertisement who is responsible for its content. Under the doctrine of so-called joint responsibility (joint controllership), the ECJ also holds the website operator responsible. The Court's reasoning: the advertisement is published on the internet, and thus made accessible to internet users, only thanks to the online marketplace. Russmedia not only provided storage space but also organized, stored, and distributed the data. On this logic, any host can be held responsible for data, completely unknown to them, that third parties have uploaded to the platform. The previous dividing line between host and content provider, on which liability turns, is thus largely dissolved.

From this finding and the resulting potential liability of website operators, the ECJ derives three newly formulated obligations for providers of third-party content:

Pre-screening: To the extent that published content may contain sensitive data within the meaning of Art. 9 GDPR, there is a proactive obligation to screen it before publication. This applies to information about sexual orientation as well as political or religious views, ethnic origin, or illnesses. In practice, every new publication will therefore likely have to be checked, if only to determine whether it contains such sensitive data at all (see the sketch below).

Identity Requirement: For publications that may contain sensitive data, the operator must ascertain the identity of the user and their contact details. Anonymous publications are therefore only possible to a limited extent. How long this information must be stored is not stated in the decision.

Technical Protection: There is an active obligation to technically impede the copying of sensitive data by third parties. How this should look in practice is left open by the ECJ.
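What such a pre-publication screen could look like is, of course, not spelled out in the ruling. Purely as an illustration, and under the assumption that an operator starts with a crude keyword filter, a minimal sketch in Python might look like this; all category names, keyword patterns, and function names are invented for this example:

# Purely illustrative sketch of a pre-publication screen for indicators of
# special categories of data (Art. 9 GDPR). The category names, keyword
# patterns and function names are invented for this example; a real system
# would need far more sophisticated checks.
import re

SENSITIVE_PATTERNS = {
    "health": re.compile(r"\b(depression|diagnos\w*|hiv|cancer)\b", re.IGNORECASE),
    "political_opinion": re.compile(r"\b(party member|voted for|campaigned for)\b", re.IGNORECASE),
    "religion": re.compile(r"\b(catholic|muslim|jewish|atheist)\b", re.IGNORECASE),
    "sex_life": re.compile(r"\b(sexual services|escort)\b", re.IGNORECASE),
}

def screen_before_publication(text: str) -> list[str]:
    """Return the Art. 9 categories a submission appears to touch."""
    return [category for category, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(text)]

def may_publish_directly(text: str) -> bool:
    """True only if no sensitive-data indicators were found; otherwise the
    post would have to be held back for identity and consent checks."""
    return not screen_before_publication(text)

Even this toy version shows the dilemma: a keyword filter either misses most sensitive content or flags large numbers of harmless posts, and every flagged post then triggers the identity and consent checks described above.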

Perhaps the most serious social consequence of the ruling is the de facto end of anonymous forum use and classified-ad posting. The ECJ formulates a clear directive: in the case of sensitive data, which is anything but rare, the operator must check whether the advertiser is actually the person whose data is contained in the advertisement. If not, proof of the affected person's "express consent" must be provided. To determine whether the advertiser is identical to the person in the content, the operator necessarily has to know and verify the true identity of the poster. Anonymous or pseudonymous use is incompatible with this duty to check.

In practice, this leads to absurd consequences, for example in the case of a forum: if a user wants to write about their depression, that constitutes sensitive health data. The forum operator must now ensure that the user is actually writing about themselves and not about a third party. To do this, they must know who the user is and, ideally, request proof of identity. In the extreme case, they would even need proof that the user actually suffers from depression, since the post could just as well be about someone else.

However, this information about the user itself constitutes sensitive data within the meaning of the GDPR. To prevent possible data misuse of the kind at issue in the original case, the provider must therefore in turn collect ever larger amounts of sensitive data, which is likely to do data protection more harm than good.

The ECJ argues that Article 2(4) of the GDPR is to be interpreted as meaning that the liability privileges of the DSA apply in other areas of law, such as copyright or trademark law, but not in data protection. For platform operators, this creates a situation that is difficult to implement: if a user posts an insult, for example, under the DSA the operator only has to delete the post after becoming aware of it, for instance through a report. They do not have to filter in advance.

If the otherwise identical comment contains the name and political views of the insulted person, that is, sensitive data under Art. 9 GDPR, the operator must identify this comment before publication, check the identity of the poster and, in the absence of the victim's consent, prevent publication. If they fail to do so, they face liability of their own, regardless of whether they knew of the post.
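Put as a decision flow, and again only as a hypothetical sketch whose field and function names are invented for this illustration, the split regime looks roughly like this:

# Hypothetical decision flow for the split regime described above. All field
# and function names are assumptions made purely for this illustration.
from dataclasses import dataclass

@dataclass
class Submission:
    text: str
    contains_art9_data: bool         # e.g. the result of a pre-publication screen as sketched earlier
    poster_identity_verified: bool   # the operator has verified who the poster is
    data_subject_consent: bool       # proof of "express consent" where the post concerns a third party

def handle_submission(post: Submission) -> str:
    if not post.contains_art9_data:
        # No special-category data: the DSA's reactive regime applies.
        # Publish, and remove only after becoming aware of illegality,
        # for example through a user report.
        return "publish; act only on notice"
    # Special-category data present: according to the ruling, the operator
    # is jointly responsible before publication.
    if post.poster_identity_verified and post.data_subject_consent:
        return "publish"
    return "withhold; request identity proof and express consent"

The same wording thus ends up in two entirely different liability regimes, depending solely on whether an automated screen believes it has spotted Art. 9 data.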

And that is not all: the ruling contains yet another obligation for service providers. The ECJ also requires operators to take measures to prevent third parties from copying advertisements.

The C-492/23 ruling is a Pyrrhic victory for data protection. Above all, however, it is a defeat for the free, open internet. The ECJ has set clear priorities: the protection of personality rights weighs more heavily than the general public's freedom of communication. The result is a paradox: in order to protect users' data, platforms must collect more data about users and monitor and filter their content more intensively than ever before.

For hosts of user-generated content, this means the end of innocence. The consequence will be a market shakeout: small, open forums, over which the liability risk now looms, will disappear or retreat into closed groups. Large platforms will raise their walls and strictly control identities. The internet will become "cleaner", but also more rigid, more heavily monitored, and less anonymous.

(mack)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.