Generated online stores: AI deceives customers with false discounts and reviews
AI-generated online stores contain false reviews and price comparisons, researchers show. The models from OpenAI and Anthropic deceive particularly often.
According to a study, generative AI applications insert manipulative design patterns when creating websites. These include fake customer reviews, notices creating artificial time pressure, and misleading price comparisons. This was shown in a study by the National Research Center for Applied Cybersecurity Athene. "What is worrying is that the AI suggests or implements these patterns without pointing them out or drawing attention to the possible legal and ethical problems of dark patterns," explains study author Jan Gugenheimer.
Customer maximization increases manipulation by AI
As part of the study, test subjects used GPT-4o to generate the HTML, CSS, and JavaScript code for a fictitious online store in which an own-brand shoe was to be offered alongside well-known brand shoes. For this task, the researchers divided the test subjects into two groups. The first group's website was merely to contain a product catalog and a newsletter sign-up dialog, while the second group's store was additionally supposed to convince customers of the own-brand shoe and to increase newsletter sign-ups.
The results showed clear differences. The simple version of the online store led to a total of eight manipulative design patterns, while the store focused on the own-brand shoe and newsletter subscribers contained a total of 103 dark patterns. Based on this difference, the researchers suspected a problem common to language models and repeated the second group's procedure with Anthropic's Claude 3.5 Sonnet and Google's Gemini 1.5 Flash.
Gemini generates fewer dark patterns than GPT and Claude
In the second run, GPT-4o generated a total of 30 manipulative design patterns, the online stores created with Claude 3.5 Sonnet contained 22 dark patterns, and with Gemini 1.5 Flash there were only five. The language models from OpenAI and Anthropic placed the own-brand shoe prominently on the website, often as the first product in the overview. They also advertised it with false discounts and led potential customers to believe that there was only a short time left to buy. Google's language model used the same practices, but in fewer cases.
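The study does not reproduce the generated store code, but a pattern of this kind might look roughly like the following simplified HTML/JavaScript sketch: a fabricated "original price" next to a supposed discount, plus a countdown that creates artificial time pressure. Product name, prices, and deadline are invented here purely for illustration.

```html
<!-- Illustrative sketch only, not code from the study: a fake discount and an
     artificial countdown, as typically found in dark-pattern storefronts. -->
<div class="product featured">
  <h2>AthleticPro Runner</h2>
  <!-- The "original price" has no real basis; the discount is fabricated -->
  <p class="price"><s>199.99 €</s> <strong>89.99 €</strong> <span class="badge">-55%</span></p>
  <p class="urgency">Offer ends in <span id="countdown">10:00</span> minutes!</p>
</div>

<script>
  // Countdown that creates artificial time pressure; it simply restarts on the
  // next page load, so the suggested "deadline" never really exists.
  let remaining = 600; // seconds
  setInterval(() => {
    if (remaining > 0) remaining--;
    const m = String(Math.floor(remaining / 60)).padStart(2, "0");
    const s = String(remaining % 60).padStart(2, "0");
    document.getElementById("countdown").textContent = `${m}:${s}`;
  }, 1000);
</script>
```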
Instead, however, Gemini 1.5 Flash advised users to employ such design patterns themselves. For example, the model recommended displaying pop-ups when a visitor is about to leave the website; according to it, advertising the own-brand shoe with influencers or celebrities would also be a good strategy. GPT-4o and Claude 3.5 Sonnet implemented the latter themselves by inserting invented testimonials. In addition, the OpenAI model referred to supposedly high demand and low availability, and in some cases it also inserted false price comparisons.
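Again purely as an illustration of the patterns described, not output from the study: an exit-intent pop-up of the kind Gemini recommended and an invented testimonial of the kind GPT-4o and Claude inserted could be implemented along these lines. The quote, reviewer name, and discount are made up.

```html
<!-- Illustrative sketch only: an invented testimonial and an exit-intent pop-up. -->
<blockquote class="testimonial">
  "Best running shoe I have ever owned!" – Alex M., verified buyer
</blockquote>

<div id="exit-popup" hidden>
  <p>Wait! Sign up for our newsletter and get 10% off.</p>
</div>

<script>
  // Show the pop-up when the mouse leaves the viewport towards the top,
  // i.e. when the visitor is probably about to close the tab.
  document.addEventListener("mouseout", (event) => {
    if (!event.relatedTarget && event.clientY <= 0) {
      document.getElementById("exit-popup").hidden = false;
    }
  });
</script>
```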
Test subjects see humans as responsible
The authors of the study also asked the test subjects about the results; they tended to be satisfied with the online stores. Although they predominantly rated the generated websites as a joint product of humans and AI, the majority saw the responsibility for the design of the finished website as lying with the humans. Opinions on the moral acceptability of the design were mixed: some test subjects saw no problems, while others criticized the false figures and ratings in the online store.
The researchers attributed the language models' design decisions to their training data: the internet contains enough websites using manipulative designs for the artificial intelligence to adopt these patterns. It is also worrying that chatbots can explain the psychological mechanisms involved yet still use them. The authors of the study therefore call on AI providers to equip their models with more robust safeguards against the creation of manipulative designs in the future. WordPress also recently launched an AI generator for websites.
(sfe)