Men insult AI bots

Quite a few people write "please" and "thank you" in their prompts to AI chatbots. That's a good thing, says Bitkom.

A person's head covered by an AI-labeled dark cloud

(Image: photoschmidt / Shutterstock.com / heise online)

This article was originally published in German and has been automatically translated.

ChatGPT and other artificial intelligence chatbots are instructed by their developers not to confront their users with obscenities. That, however, does not stop users from being abusive from time to time. At least that is what Bitkom believes it has found. In a survey conducted by the IT industry association, 6 percent of 1,005 respondents said they had already insulted their AI bot: 9 percent of men and 3 percent of women, according to Bitkom.

WTF

The internet is full of hot IT news and stale pr0n. In between, there are always gems that are too good for /dev/null.

Even considering the phenomenon observed everywhere on social networks, namely that people tend to lose their manners when they feel anonymous, it appears that only a rather small proportion of users subject Copilot or Gemini to tirades or insults. And given that capitalization seems to have become generally unimportant in online chat, the 55 percent of 16 to 29-year-olds who pay attention to spelling in their prompts seems high. As many as 45 percent phrase their requests to the AI with a "please", and 29 percent often type a "thank you" to the chatbots.

"Politeness is an adornment, but you can get further without it." Bitkom CEO Dr. Bernhard Rohleder is unlikely to agree with this saying, at least not when dealing with AI. "Even if AI has no feelings, it can make sense to stick to normal manners," says Rohleder. AI can sometimes understand normal, politely formulated requests better because the models have been trained with this language.

Rohleder continues: "Various tests have also shown that AI delivers better results when it is treated politely." Well, let's put it to the test. "Hey, copilot, summarize this website for me! [Link]" and "Dear Mr. Copilot, please summarize this website for me. [Link]" produced the same result: a long summary of the website. However, when the second entry is garnished with an "old bastard", Copilot replies: "I apologize if my previous answer has annoyed you. I will now withdraw. 🙏"
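Anyone who wants to repeat this kind of comparison systematically rather than in a chat window can easily script it. The following is a minimal sketch, assuming access to an OpenAI-compatible chat API; the model name, the prompt wording, and the text placeholder are illustrative assumptions, not part of the test described above.

```python
# Minimal sketch: send a curt and a polite variant of the same request
# to an OpenAI-compatible chat API and compare the answers.
# Model name and prompts are illustrative; the article's test used
# Copilot in a chat window, not this API.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Plain chat models cannot fetch URLs, so paste the page text here
# instead of passing a link as in the article's chat-window test.
ARTICLE_TEXT = "..."

PROMPTS = {
    "curt": f"Hey, summarize this text for me! {ARTICLE_TEXT}",
    "polite": f"Dear assistant, please summarize this text for me. {ARTICLE_TEXT}",
}

for tone, prompt in PROMPTS.items():
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any chat-capable model works here
        messages=[{"role": "user", "content": prompt}],
    )
    answer = response.choices[0].message.content
    print(f"--- {tone} ---")
    print(answer)
    print(f"(length: {len(answer)} characters)\n")
```

Comparing the two answers this way will not settle whether politeness improves results, but it makes ad-hoc observations like the one above repeatable.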

(anw)