Missing Link: The privilege check
How freedom from state intervention and freedom from the law came to be confused with each other – and are now causing great confusion, from Trump to Zuckerberg
The fact that private actors enforce the law on the internet has a long history: after all, who actually bears responsibility in a privately organized space? For a long time in the history of the Internet, the motto was: "Just not the state, because it has its own interests." But entrepreneurs like Elon Musk or Mark Zuckerberg don't want to bear that responsibility either. Is it time to think again?
The freedom that was meant
The Internet is not a law-free space. This sentence, a favorite of domestic and security politicians, has always been true – and false at the same time. Historically, "law-free space" actually referred to areas not regulated by law. Under the 1952 law on inventions by employees and civil servants, for example, it meant that a ministry was allowed to publish non-binding standards of comparison even without a concrete basis in a law or regulation: in other words, what is not regulated as official action but is nevertheless permissible.
At some point, however, the meaning shifted, and the term increasingly came to denote lawless spaces rather than law-free spaces. Completely law-free spaces do not exist on this planet – at most lawless spaces or spaces of injustice. But the term caught on and from then on took on a life of its own.
It has been documented on heise online since the 1990s; the then Federal Minister of Education Jürgen Rüttgers (CDU) applied it to "virtual spaces" in 1996. At stake was nothing less than the so-called Multimedia Act, which was to come into force on August 1, 1997 and, among other things, defined responsibilities for content – including for "telegames", "telebanking" and "offers of goods and services in electronically retrievable databases with interactive access and direct ordering options".
My content, your content, our server
Even back then, when the German government still sat in Bonn, the question was: Who should actually be liable for what? And according to whose rules? The explanatory memorandum to the law stated that the aim was, among other things, to "remove obstacles to the free development of market forces in the area of new information and communication services".
Section 5 IuKDG accordingly stated: "Service providers are only responsible for third-party content that they make available for use if they are aware of this content and it is technically possible and reasonable for them to prevent its use." At least in a first step, this put service providers on the same footing as those who merely pass on user data without looking at it – Internet providers, DNS server operators and the like: looking the other way protects you from your own liability.
From multimedia law to e-commerce law
One year after the IuKDG was passed in Bonn, things were also moving in Brussels: the EU Commission, busy re-regulating the telecommunications market after the dismantling of the state monopolies, submitted a "Proposal for a Directive of the European Parliament and of the Council on certain legal aspects of electronic commerce in the internal market".
It was adopted in 2000: the E-Commerce Directive was born – European law that the EU member states had to transpose into national law. It stipulated that "in the case of an information society service consisting of the storage of information provided by a user, the service provider is not responsible for the information stored on behalf of a user". In other words, anyone who hosts third-party data, texts and images is liable for them differently than for their own content. This is a so-called liability privilege.
Even then, however, this came with conditions: for example, that the provider had "no actual knowledge of the illegal activity or information". Once informed, providers were obliged to take "immediate action" to "remove the information or disable access to it." The old directive did not set out more precise requirements – but hosts and platforms were now the ones to decide what could stay and what had to go. The directive was adapted somewhat over the years, but the basic idea always remained the same: on the privately organized Internet, private actors are only responsible for keeping their storage clean once they have been put on notice.
Exemption from liability in the case of automation
Other measures, however, were introduced via copyright law: automatic content filters that scan uploads for the signatures of illegal copies, for example. UK and US initiatives have also built up databases of hash values of depictions of sexual abuse, which are used to identify and filter out known material. Such filters have always been controversial: in Europe, a temporary exemption allows them to be used without the liability privilege expiring.
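How such filtering works can be illustrated in a few lines. The following Python sketch is illustrative only, not any real system's implementation: it assumes a simple exact-match blocklist of cryptographic hashes, whereas production systems such as Microsoft's PhotoDNA rely on perceptual hashes that also catch re-encoded or slightly altered copies. All names in the sketch are invented.

```python
# Minimal sketch of hash-based upload filtering: fingerprint each upload
# and compare it against a blocklist of known illegal material.
# Caveat: an exact hash changes completely if a single byte of the file
# changes, which is precisely why real systems use perceptual hashing.
import hashlib

# Hypothetical blocklist: hex digests of files already identified as
# illegal, e.g. supplied by a hotline or an industry database.
KNOWN_ILLEGAL_HASHES: set[str] = {
    # SHA-256 of b"test", used here as a stand-in entry
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Return the SHA-256 hex digest of the uploaded bytes."""
    return hashlib.sha256(data).hexdigest()

def screen_upload(data: bytes) -> bool:
    """Return True if the upload matches known illegal material."""
    return sha256_of(data) in KNOWN_ILLEGAL_HASHES

if __name__ == "__main__":
    print("blocked" if screen_upload(b"test") else "passed")  # -> blocked
```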
The Digital Services Act: everything was supposed to get better
The Digital Services Act was the major update to the liability rules, 20 years after the first E-Commerce Directive: anyone wishing to claim this privileged legal status must comply with the rules of this EU regulation. And this is where a genuine conflict of objectives begins: under the DSA, the liability privilege only applies if its requirements are complied with.
However, to ensure that the privilege does not simply lapse whenever an operator falls short – leaving it fully liable – the regulation also contains provisions for enforcing the requirements vis-à-vis providers. These are varied, but at their core is the obligation to take action after being notified of potentially illegal content: looking the other way is prohibited as soon as there is a tip-off. What counts as illegal is defined by the laws of the member states. Operators must then examine the report and reach a decision. In doing so, they have to weigh some complicated questions, such as whether an otherwise unlawful insult is perhaps covered by freedom of the press or freedom of satire. And because online content often spreads within hours, this has to happen as quickly as possible. How quickly? That is a regular point of contention: the DSA does not set concrete deadlines like those contained in the German Network Enforcement Act.
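Reduced to its skeleton, this notice-and-action duty looks like a small decision routine. The following Python sketch is purely illustrative – every name is invented, and the two boolean inputs stand in for the actual legal assessment, which is exactly the part no platform can automate away:

```python
# Toy model of a DSA-style notice-and-action workflow: a notice arrives,
# the operator reviews it and records a reasoned decision. The legal
# judgment itself (illegal under member-state law? protected as satire
# or press freedom?) is passed in as plain booleans because in reality
# it requires human review.
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum

class Decision(Enum):
    REMOVE = "remove"            # take the content down
    DISABLE = "disable access"   # the directive's alternative to removal
    KEEP = "keep"                # notice rejected, content stays up

@dataclass
class Notice:
    content_id: str
    reporter: str
    claimed_violation: str       # e.g. "insult" under national law
    received_at: datetime

@dataclass
class ReasonedDecision:
    notice: Notice
    decision: Decision
    reasons: str                 # operators must explain their decisions
    decided_at: datetime

def handle_notice(notice: Notice, is_illegal: bool,
                  protected_speech: bool) -> ReasonedDecision:
    """Turn a notice plus a (human) legal assessment into a reasoned decision."""
    if is_illegal and not protected_speech:
        decision, reasons = Decision.REMOVE, "Unlawful under member-state law."
    elif is_illegal and protected_speech:
        decision, reasons = Decision.KEEP, "Prima facie unlawful, but covered by press freedom or satire."
    else:
        decision, reasons = Decision.KEEP, "No violation found."
    return ReasonedDecision(notice, decision, reasons, datetime.now(timezone.utc))

if __name__ == "__main__":
    n = Notice("post-123", "user-456", "insult", datetime.now(timezone.utc))
    r = handle_notice(n, is_illegal=True, protected_speech=True)
    print(r.decision.value, "-", r.reasons)
```

Even this toy version shows where the conflict sits: the hard judgment calls are delegated to the operator, and the clock is running while it makes them.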
Another phenomenon typically cited as a problem is overblocking: providers blocking content even though it is not legally objectionable at all. The creators of the DSA were aware of this too, which is why they restricted providers' freedom to simply declare content inadmissible in their general terms and conditions, for example: these must now also explicitly take fundamental rights such as freedom of expression into account. On the major platforms, users must also be able to appeal against a block.
System of responsibility distribution
But it is always the operators who have to decide in the first instance. Their decision need not be final: users can go to court over an unjustly blocked post or, where available, call in an arbitration board – and the same route is open when a platform has declined to block content. In practice, however, very few do, above all because the legal process usually takes too long. That is why the service operators have become de facto substitute judges – just one of the many problems with the system of responsibility distribution that has evolved.
After all, anyone who regularly fails to comply with the EU rules must fear fines. In the event of persistent disregard of the identified abuses, orders or even blocks at infrastructure level – i.e. network blocks – can be issued. However, such restrictions on access to the online interface of a provider of intermediary services are possible only as an absolute exception and only temporarily, namely if a provider has already caused serious harm and the infringement constitutes a criminal offense threatening the life or safety of persons. Live videos inciting pogroms or videos of terrorists could be such a case. Thierry Breton, the former EU Commissioner for the Internal Market, raised this possibility during protests in France, for instance. It is hard to imagine it actually being carried out.
After all, enforcing EU law does not happen that quickly. In the interests of legal certainty, DSA supervisory procedures are above all thorough: information is requested from operators before formal proceedings are even initiated. What seems obvious is first carefully examined, because it has to hold up legally afterwards. And in a case less clear-cut than that of TikTok Lite, it can take months or even years before an investigation is completed.