ChatGPT turns man into murderer: Noyb files data protection complaint
ChatGPT claims that a man murdered two of his children. This is not true. Noyb has filed a data protection complaint.
ChatGPT has turned an innocent man into a murderer: the chatbot claims that a man from Norway killed two of his three children and also tried to kill the third. The Austrian data protection organization Noyb (None of Your Business) has filed a data protection complaint with the Norwegian data protection authority, Datatilsynet. It is the organization's second complaint against OpenAI. "If hallucinations are not stopped, people can easily suffer reputational damage."
The GDPR enshrines the principle of data accuracy: data subjects can contest false statements about themselves and, for example, demand a correction. OpenAI, however, merely points to a notice that ChatGPT can make mistakes. According to Noyb, this "tiny disclaimer" is not enough and cannot override the law: "The legal obligations regarding data accuracy cannot be circumvented with a disclaimer."
In addition, the GDPR grants a right of access, which OpenAI does not comply with: data subjects have the right to know what data is stored about them. However, providers of AI models do not usually disclose which training data they have used. Moreover, chatbots are known to hallucinate, meaning they can output false information about a person at any time without that information being stored anywhere or being correctable. Technically, it is only possible to block the chatbot from making certain connections, i.e., from giving certain answers or addressing certain topics. Data cannot be deleted retrospectively.
ChatGPT mixes fact with fiction
In the Norwegian's case, the answer to the question of who Arve Hjalmar Holmen is contained some accurate facts, including his hometown and the correct number, genders, and ages of his children. What was not true was the part in which ChatGPT made him the murderer of two of his three children and claimed that Holmen had been sentenced to 21 years in prison for it. Why ChatGPT turns the father into a convicted child murderer remains completely unclear.
There have already been similar cases. For example, ChatGPT also turned a journalist who had worked as a court reporter for many years into a criminal: the chatbot mixed the person up with the cases he had written about, so at least there was some connection. According to a press release from Noyb, Holmen is worried: "People think that there is no smoke without fire. The fact that someone could read this content and believe it to be true is what scares me the most."
Noyb already filed a complaint against OpenAI in 2024, in a comparatively minor case concerning the correction of a person's date of birth. OpenAI explained at the time that it could not correct such data, only block certain prompts. If you ask ChatGPT about Arve Hjalmar Holmen today, the chatbot no longer responds with the horror story. According to Noyb, this is because the chatbot can now access the internet and search for current information about the name. However, Noyb suspects that the training data of the current models still contains the data that led to the false statement.
(emw)