Bundestag election: Regulators take X, TikTok, Meta & Co. to task
The Federal Network Agency invited platform operators to a round table with the EU Commission in Berlin. The aim was to protect the integrity of the election.
In Germany, there is great concern about foreign influence and targeted disinformation via social networks ahead of the general election. This is due not only to further reports of Russian propaganda campaigns such as Operation Doppelgänger, but also to the new moderation policy that Meta has aligned with US President Donald Trump for its platforms Facebook and Instagram. The Federal Network Agency therefore invited major operators to a round table in Berlin on Friday, together with the EU Commission. During the discussion, the hosts said they "emphasized the responsibility of the very large online platforms and search engines in the run-up to elections and the corresponding obligations under the Digital Services Act (DSA)".
In Germany, the regulatory authority acts as the Digital Services Coordinator (DSC), responsible for national enforcement of the DSA. Under the platform law, services with more than 45 million active European users must, among other things, analyze, assess and, if necessary, mitigate systemic online risks relating to the integrity of elections. The aim of the meeting was to discuss with all stakeholders involved possible DSA violations that could occur in connection with the federal election in February 2025, as well as risk-mitigation measures. Representatives from Google (YouTube), LinkedIn and its parent company Microsoft, Meta, Snapchat, TikTok and X as well as national authorities and civil society organizations (NGOs) were present.
"We take the very large online platforms at their word that they want to and will implement the requirements of the Digital Services Act with commitment", emphasized Klaus Müller, President of the Federal Network Agency and acting head of the DSC, after the round. "We are working closely with the EU Commission and are monitoring developments very closely in the run-up to the federal elections – together with other national authorities." Any violations of the DSA are forwarded to the responsible executive authority in Brussels. This is the case, for example, if "illegal content is not deleted or accounts are wrongly blocked".
EU Commission has published guidelines
Last year, the EU Commission issued recommendations for measures that service providers covered by the DSA should take to mitigate systemic online risks to the trustworthiness and reliability of elections. According to the guidelines, operators must also respect fundamental rights – including the right to freedom of expression. One aspect relates to risks associated with generative artificial intelligence (AI) such as ChatGPT. Accordingly, the platforms concerned must clearly label or otherwise conspicuously mark "artificial or manipulated images, audio or video content that markedly resemble existing persons, objects, places, entities or events".
During an ongoing election, operators are required to give users access to "reliable, up-to-date and understandable information from official sources about the election and the voting process". This is intended to help reduce the potential damage caused by serious problems such as manipulated images, voices or deepfakes, spread for example by political actors. Attempts to "use disinformation and information manipulation to suppress voters" must be prevented.
In the USA, the course under Trump is heading in the opposite direction. Meta has followed in the footsteps of Elon Musk and his service X and dropped its fact-checkers. In Europe, however, they are to remain in place for the time being. Trump himself equates the fight against disinformation with censorship, and Meta boss Mark Zuckerberg has jumped on the bandwagon. This week, however, a search for the hashtag #democrats on Instagram apparently returned no results for a while.
"Stress test" with operators also for the federal election
At the end of April, the EU Commission carried out a "stress test" with very large platform operators for the first time, in order to assess the instruments and cooperation mechanisms in place with a focus on the then upcoming European elections. A repeat is now to focus on the German parliamentary election. Microsoft, TikTok, LinkedIn, Google, Snap, Meta and X have been asked to take part in this exercise on January 31, a Commission spokesperson explained on Friday. Potential scenarios "in which the DSA comes into play" are to be examined together with German authorities, with the focus on how the operators react to these specific scenarios. According to Reuters, TikTok was the first service to confirm its participation in the test.
On Wednesday, Federal Minister of the Interior Nancy Faeser (SPD) had already met with representatives of large social media platforms and digital companies, the responsible federal ministries and security authorities, as well as NGOs. Federal Returning Officer Ruth Brand also took part in the meeting. The focus was on measures against targeted disinformation campaigns and hate speech such as death threats, as well as on the labeling of political advertising and deepfakes.
Faeser: X & Co. must comply with law and order
"Lies and propaganda are instruments that Russia in particular uses to attack our democracy," Faeser reported afterward. It was also important to "protect candidates from criminal acts online, including death threats". Such statements could lead to real violence. The Minister emphasized: "When people are threatened, democratic debates are no longer possible."
According to Faeser, it was important "in view of the current discussions" to remind the operators of their duty to "abide by the law, which was democratically decided in Europe". In addition to the DSA, the regulation on removing terrorist content online and the regulation on political advertising are also relevant. The review of criminal content "must be strengthened and must not be restricted", demanded the Social Democrat. What is also needed is "more transparency about the algorithms so that they do not fuel dangerous radicalization processes, especially among young people".
(nie)