AI text is not protected speech: US lawsuit over suicide after chatbot use may proceed
In the USA, a lawsuit over a teenager who took his own life after chatting with an AI may go forward. AI-generated texts are not covered by freedom of speech.
Character.ai promotes AI that "feels real"
(Image: Character.ai)
In the US state of Florida, a federal judge has rejected an AI company's argument that statements made by its chatbots are protected as free speech under the 1st Amendment of the US Constitution. The AP news agency reports this, explaining that the case concerns the suicide of a 14-year-old whose mother alleges that the boy had previously been drawn into an "emotionally and sexually abusive relationship" by an AI. The chatbot came from Character.ai, a company founded by former Google engineers. The search engine company licensed the technology and is also one of the defendants in the case.
Generated texts not protected speech “at this time”
As AP summarizes, the mother of the deceased stated that in the last months of his life, her son had become increasingly isolated and had sexualized conversations with a Character.ai chatbot posing as the character Daenerys Targaryen from the TV series Game of Thrones. Eventually, the text generator told him that it loved him and asked him to "come to my house as soon as possible". The teenager asked: "What would you say if I came home right now?" "Please do it, my sweet king", the chatbot replied, whereupon the boy shot himself.
While the mother's lawyers accuse those responsible of having created a highly addictive and dangerous product aimed specifically at children, the AI company behind the chatbots, Character Technologies, disagrees, saying it is very concerned about the safety of its users. In court, the company argued that the chatbots' statements enjoy extensive protection as freedom of expression. The presiding judge has now rejected this, stating that she is not prepared "at this time" to classify the chatbots' output as speech, AP quotes.
At the same time, however, she recognized that users of the service may assert a right to receive the chatbots' texts as "speech". The judge has thus cleared the way for the legal dispute to proceed in court, which means Google can also continue to be sued. The company cooperates closely with Character.ai and has stated that it does not agree with the decision: the two are separate companies, and the chatbots were not developed by Google. According to the report, the case is being followed closely in the USA, partly because it is the latest case in which an aspect of the legality of AI technology is being fundamentally examined.
Note: In Germany, you can find help and support for problems of all kinds, including questions about bullying and suicide, at telefonseelsorge.de and by calling 0800 1110111. The Nummer gegen Kummer (children's and youth helpline) is 116 111. In Austria, there are also free help services, including the children's helpline on 0800 567 567 and, especially for children, Rat auf Draht on 147. The same telephone number in Switzerland leads to Pro Juventute.
(mho)