How AI summaries limit civil society diversity
When chatbots and Google determine the answers, NGOs lose visibility. The concentration of power on a few platforms threatens democratic diversity.
The popularity of chatbots built on large AI models is changing how people search online: the supposedly most important information is delivered in direct dialogue with the “AI” or in automated summaries on the search results page. This keeps people on the platform. The actual experts on a topic are relegated to a footnote or remain completely invisible. This development threatens civil society diversity.
When was the last time you searched for something without clicking one of the hits on the results page and visiting one of the listed websites? Your answer is “today”? You're in good company: according to one study, more than half of all searches now end without a click. The “zero-click search” phenomenon is growing rapidly.
Or are you on Google far less often because getting answers from a chatbot such as ChatGPT or Copilot is more convenient, or somehow more fun? You are not alone there either: Google is still the global market leader, but its share has slipped below 90 percent for the first time in many years.
The introduction of “AI” summaries must also be understood in this context: Google places “AI”-generated answers directly at the top of the results page, in an attempt to demonstrate that the corporation can do “AI” too.
Search engine optimization (SEO) experts and publishers who depend financially on page views are watching the trend with concern. They are optimizing texts to at least be mentioned as a source in the sidebar of the “AI” summaries, or are turning to so-called Generative Engine Optimization (GEO), which aims, for example, to ensure that language models at least cite the sources they draw from the web anyway.
However, this does not change the fact that the number of clicks on websites following search queries has fallen drastically.
Search engines have been contributing to this trend for some time by placing much more detailed information directly on the results page. Info boxes at the top are now standard. They, too, reduce the number of clicks, because the question a person searched for has already been answered on Google itself. This saves users time.
Platforms and concentration of power
The power of large platforms and their regulation has been the subject of increasing debate for years. The focus has tended to be on social networks, above all Facebook and Instagram, but also TikTok, X, and LinkedIn.
There, too, people should stay on the platform. The common tip is to avoid links to increase the reach of a post. The algorithm does not like the prospect of someone leaving the platform, even for a short time.
There is also debate about whether it is socially acceptable for posts to be evaluated and displayed according to the rules of the respective platform. Its judgement alone determines whether, for example, sex education work counts as offensive, or where the line runs between freedom of expression and hostility toward people.
Another aspect, however, has barely registered with the public: Google is itself a kind of social network, in that user signals help decide which search results are listed near the top. If most people return to Google a few seconds after visiting a page, this is often taken as a bad sign from a technical perspective, and the page loses visibility.
Google alone decides what is considered the most important in the “AI” summary.
Google search engine is itself a social network
Google follows the same logic as the social networks: people should increasingly stay on the platform. That is why clicks on external pages are not encouraged but replaced, first by info boxes and now by AI summaries.
Google has long been more than just a search engine. It functions like a social network: the behavior of users determines which content is displayed at the top, which is pushed aside, and which disappears completely. Only content that fits into the ecosystem and increases dwell time has a chance of visibility.
Google therefore also performs an editorial function, without transparency and without democratic control. What appears relevant in the AI summary has no neutral truth value; it is an excerpt filtered by algorithms and shaped by business interests.
What is shown is political
This influence becomes concrete when you look at content. If you google “safer sex,” the AI summary lists “condoms,” “dental dams,” “avoiding bodily fluids,” and “clean sex toys” as strategies.
Two other methods, however, are missing, although both provide reliable protection against HIV: PrEP, the preventive use of HIV medication, and the protective effect of HIV treatment itself. Deutsche Aidshilfe sees both as key to tackling the HIV pandemic.
Tone is often decisive in sex education: how strongly do you focus on possible diseases? And how do you simultaneously convey how important it is for everyone involved to feel comfortable during sex?
With illegalized substances such as heroin, cocaine, or cannabis, Google largely omits “AI” summaries altogether rather than actively pointing to harm-reduction and safer-use measures.
What is displayed, and how texts are worded, is no coincidence but the result of probabilities and algorithms developed largely without public oversight. With every single search query and every single chat, Google, ChatGPT, and the like exert influence over content, and that influence is often political.
Why do civil society organizations need visibility?
Civil society organizations look at the issue from a broader perspective. NGOs depend on public or private funding, and their work is evaluated against success indicators, including website reach. Why fund something if supposedly nobody reads the content or uses the services? A visit to the website is often the first step people seeking advice take to get to know an organization and find help.
Counseling services are found through topic-related search queries, and they often go far deeper than chatbots or “AI” summaries ever could. The actual issue sometimes only emerges after a while on the phone or in writing, through empathy, patience, and sensitive questions. For that to happen, however, people first have to find the services.
Some organizations also depend on donations, especially given the tighter budgets of public funders. People who want to donate go first to the organization's website. Newsletter sign-ups happen not via Google or ChatGPT but on the organization's own online presence. The end-of-year fundraising campaign needs visibility to have an impact.
Larger civil society organizations, in particular, have a good chance of meeting these challenges: their communications teams vie for the remaining attention, they court donors with advertising budgets, and their domains are considered to have greater authority, which still gives them a better chance of reach than new or small NGOs will ever have.
New networking is crucial
The even greater concentration of power in a few platforms through AI threatens civil society diversity. Platforms must be obliged to make their sources visible and clickable in AI summaries, and that includes smaller organizations. For sensitive topics, separate summaries should be omitted or at least coordinated with relevant civil society and scientific actors. Transparency about the selection criteria and independent oversight are also needed to safeguard democratic diversity.
Manuel Hofmann is a consultant for digitalization at Deutsche Aidshilfe.
(mki)