"The fact check works as badly as the video assistant in soccer"

Professor Christian Schwarzenegger talks in an interview about the supposed omnipotence of social networks and the power of interpretation.


Christian Schwarzenegger is Professor of Media Studies at the University of Bremen. There, he studies media change and the public sphere, in particular how it manifests itself in new and alternative public spheres and media. He heads the research project "Alternative Media – Alternative Public Spheres – Alternative Realities?". heise online spoke to Schwarzenegger about the impact of social networks and asked him about the possible effects of Mark Zuckerberg's decision to remove fact checkers in the USA.

Prof. Christian Schwarzenegger (Image: Schwarzenegger)

Mark Zuckerberg is abolishing fact checkers in the USA. What immediately came to your mind when you heard the news?

I think you have to see this in a wider context. Many American companies are also abolishing their diversity departments. It's a zeitgeist that follows a cold calculus, so to speak, in terms of both social returns and financial factors. These things take effort, they cost money. Fact-checking had its chance and, from this perspective, has not proven itself. So it goes away. Diversity criteria had their chance, and we are getting rid of them too. It's a mental attitude and, I think, a zeitgeist. Trump can also be seen as a consequence of it.

People want a bit more of the Wild West again. But that probably won't work either. And then there will be more sheriffs again. A pendulum movement.

Can you generalize that or does it relate to the internet or social networks?

We experienced it after the pandemic: there is a general desire for more freedom. There is a perceived form of state encroachment that leads people to want to push back regulations. Away with the rules, finally being allowed to do things again. This applies at all levels, not just in the USA. But we see it particularly strongly in the USA, and that is where the big platforms come from.

We are very ambivalent about these platforms. Most people use them, enjoy using them, and at the same time we are constantly demonizing them and blaming them for so much bad in the world. How much power do social networks have anyway?

This is indeed a big issue, and it is exciting to see how, in what is a short space of time in terms of historical processes, a decade and a half, the platforms have become incredibly central to so many processes in our lives. We are still negotiating the role they play. The importance of platforms is strongly linked to the fact that other media are losing importance. Young people in particular often no longer have this traditional media biography: they have not grown up with newspapers at home and have not learned to demand this type of journalism.

When you say that we also see social media as the culprit when something goes wrong, then you have to say that this is partly a defensive discourse, a defensive stance on the part of the traditional media, which have always campaigned on precisely this point: we have fact-checking, we have objective reporting, we have professional criteria. We are the life insurance against everything that is manipulation and conspiracy, and the others are full of exactly that.

People have certainly tried to create alliances through fact-checking. Who actually holds the power of interpretation in society, and who gets to decide how we conduct certain debates? This is where the old and new worlds of the information environment meet. You can see this very clearly with Elon Musk, when he now shouts to his people, "you are the media now". That is an expression of a larger movement.

On the question of the power of platforms: in this context, I always think of a sentence by Melvin Kranzberg, a historian of technology. His first law of technology states that media or technologies are neither good nor evil, but they are not neutral either. This means they can always be used for good and for bad purposes, but the way we communicate is also shaped by the technology. Certain formats, for example, favor quick, hasty reactions, because that is what gives us visibility and reach. So the way we communicate is amplified and influenced by the platforms, and ultimately the platforms also decide, through changes to their algorithms, by which logic content is made visible to someone. They are in the driver's seat.

So the person in the driver's seat also has power?

They decide: more AfD, more populism, more right-wing extremism, less gender, less diversity. As long as we feel that the right people are at the controls, that the right content is filtered out and the right voices are strengthened, we tend to be satisfied with that. But you always have to remember that the logic is ultimately the same, namely that we are being controlled. We just don't always get upset about it to the same degree, because sometimes we are satisfied with what we see.

Does that also mean that the participation that we all dreamed of, that the internet was supposed to bring, is actually quite limited?

I think that's the great disillusionment that has set in. We can all express ourselves on social platforms, but that doesn't mean we will be seen. Others continue to decide that: power, capital and interests.

There are now fears that we could be manipulated by these filters and content on social networks. But there are also concerns about hatred and hate speech. What can we do?

That is an important point, because we do have laws. But these laws have always been difficult to enforce. To a certain extent we have left it to the platform operators to decide which content violates our laws. And that raises the question of whether automated filter systems and fact-checkers amount to sufficiently qualified personnel and sufficient measures. To a certain extent this has been done, and it must be done.

The big question that also arises when it comes to fact-checking is: what good does it do in a situation where people have lost trust in institutions? If I no longer believe in journalism, no longer believe in the state, no longer believe in science, then someone comes along and says that what the AfD says is wrong, and I still won't believe it.

So it doesn't matter anyway whether you do fact checks or moderate?

As long as content stays within the legal requirements (and in international comparison the legal guidelines vary in how tight or loose they are), as long as it stays within what is permissible under criminal law, you actually have to put up with a lot, even if you reject it.

The next question, however, is whether content that polarizes or provokes particularly strong reactions should also be highlighted by the platforms' logic, because that reinforces negative comments. If these posts weren't rewarded with visibility, it would often just be someone saying something, and people talk nonsense everywhere all the time. So the reward from the platforms is the bigger problem.


In my view, there's a big difference between saying I don't filter it out because it's part of our reality that these opinions exist – I think that's legitimate. The responsibility begins where it comes to amplifying it, reinforcing it, making a profit on that basis.

Are facts always facts?

Particularly in a political context, facts are really only the beginning of a debate. The question is then: what do we conclude from them, what do they mean for our social goals, for our values and orientations? In other words, in my view it is a false logic, a false idea of objectivity, to say that once the facts have been verified, the debate is over. It's a bit like the video assistant in soccer: there is a decision, but there, too, we know how wrong it can be and that interpretations can still differ. And for the public, the fact check works just as badly as the video assistant does in soccer.

In Australia, young people under the age of 16 are no longer allowed to use social networks. Does that make sense?

It reminds me of those stickers on CDs and records back in the day: Parental Advisory, Explicit Lyrics. That's exactly what made things interesting. In the end, of course, it also feeds this narrative that the platforms have a kind of omnipotence, which is great advertising for them. Back then, when Facebook had the Cambridge Analytica scandal, that was also the best possible confirmation of "look what we are capable of".

You also have to ask what happens when these people turn 17. Of course, you can always say that different stages of development can handle it better. But we also see, especially in contexts of radicalization, where people renounce democracy and believe a lot of nonsense on social media, that it is not the young who do this. It's all the older generations. I see it as combating symptoms rather than really attacking the root of the problems, which have a lot to do with social factors outside of the media. For example, the dissolution of clear biographical orientations, of predictable biographical trajectories. There is a lot of uncertainty, what we call epistemic uncertainty. This means that you can know and believe so much in the digital world that you no longer know what you can know and believe. And the traditional institutions, the church, the state and journalism, are losing credibility. People are left behind and have to find their way somewhere.

Now Zuckerberg wants to introduce community notes along the lines of X, which ultimately means we should report to each other and comment on content. Couldn't that lead to even more arguments?

It imitates this positive, participatory ideal of knowing more together. That all of this can also be used instrumentally to silence certain voices, to drown them out with noise, so to speak, and that the community notes are sometimes a farce in terms of what is or isn't written as a correction, is then another problem.

I believe that if it's done well, if there's a willingness to engage in discourse and it's not just confirmation battles or coordinated, polarizing disputes, it can be a good way forward. It's just often not done that way, or the practice looks different. It is a charming idea that a post is not simply deleted, or reported so that others have to deal with the content, but that you have to engage with it in an argumentative debate. The idea behind it is to get people talking to each other more across ideological divides. But whether it is handled that way is another matter.

Many people switch to Bluesky or the Fediverse and look for ways out ...

Those who have left have perhaps initially found their feel-good oasis, the image of Twitter as they ideally saw it. That probably won't remain the case. Conflicts will arise there too, and then you have to see how they react to them.

Has the idea of participation through social networks then failed? Is there still a public sphere in social networks?

The question is to what extent it existed at all. We once had the illusion that we might have achieved something like that. As far as Twitter in German-speaking countries is concerned, we always knew that it was an absolute minority that really participated actively. But they were very vocal minorities, people who had an influence on public opinion processes, very many amplifiers of opinions. Facebook may have been the mass gathering place where everyone used to be, but even there you saw very different versions.

Even when there was only the Tagesschau and everyone read a newspaper, they were perceived differently. In other words, there are always phases in which we realize or are reminded that things are not as uniform as we sometimes think. Or that what we took for granted, the majority opinion or the only possible view of the phenomenon, no longer holds.

These are always moments in which conflicts arise – when we are confronted with the fact that others can see things very differently from the way we live our lives and believe reality works. And that is sometimes so challenging and overwhelming that a "shut the fuck up" is perhaps the most appropriate response to deal with this unexpected overwhelm.

(emw)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.