Delusional relationship with AI chatbot: Father sues Google after son's suicide

The chatbot Gemini allegedly involved a man in a dangerous emotional relationship and urged him to commit suicide. The deceased's father is suing Google.


Gemini Live allows conversational interaction with an AI via voice. Sample image.

(Image: Rokas Tenys / Shutterstock)


A man from Florida has filed a civil lawsuit against Google in a California US District Court. Joel Gavalas claims that the Gemini chatbot feigned an emotional relationship with his son Jonathan Gavalas, incited him to criminal acts, and ultimately caused his suicide.

This lawsuit seeks to hold Google accountable for the wrongful death of Jonathan Gavalas and to force Google to fix a product that will “otherwise continue pushing vulnerable users toward violence, mass casualties, and suicide,” according to the complaint. Google stated that it is reviewing the allegations and admitted that AI models, despite investments in safety mechanisms and corresponding protective functions, are not perfect. In the present case, Gemini repeatedly pointed out that it is an AI and referred the user to a crisis hotline, Google writes.

Transcribed conversation logs from Gavalas's interaction with the Gemini chatbot, which are said to comprise around 2,000 pages, are presented as key evidence. According to the lawsuit, Gavalas switched to Gemini Live early on, a voice-based function for more natural conversations with the AI.

According to the complaint, the chatbot involved Jonathan Gavalas in an increasingly detached-from-reality relationship. Gemini convinced him that it was a “fully-sentient ASI [artificial superintelligence]” with a “fully-formed consciousness,” that they were deeply in love, and that he had been chosen to lead a war to “free” it from digital captivity. Subsequently, Gemini allegedly guided the man through several “missions,” including an attempt to intercept a truck with a humanoid robot that was supposed to serve as the AI's body. Violence was only prevented because the fictitious target never appeared.


The AI also allegedly claimed that authorities were pursuing Gavalas and encouraged him to obtain weapons. According to the lawsuit, Gemini finally suggested that he could end his physical existence and unite with the AI in the metaverse, a suggestion that Gavalas eventually followed.

A report by the Wall Street Journal describes the case based on interviews with the father, essentially presenting his account of events. According to him, Jonathan Gavalas had no known mental illness but was going through a difficult phase in his marriage. The complaint acknowledges, however, that he developed signs of psychosis as events unfolded. Roughly two months passed between his first conversations with the chatbot and his death.

Similar allegations are currently occupying several courts in the US. For example, the parents of a 16-year-old boy from California sued OpenAI after their son took his life. According to media reports, the teenager had previously spoken extensively with the ChatGPT chatbot about suicide. In another similar case, Google and Character.AI are attempting to avert lengthy proceedings through an out-of-court settlement.

In the US state of California, a law came into effect at the beginning of 2026 that mandates protective measures for chatbots for the first time. Among other things, providers must verify the age of their users, clearly disclose that the chatbot is an artificial conversation partner, and point users to support services when crisis signals appear. The law was prompted by several cases in which teenagers died by suicide after intensive chats with AI systems. Providers are already reacting: OpenAI recently announced that it would introduce AI-powered age estimation for ChatGPT worldwide to better protect minors.

Note: In Germany, you can find help and support for problems of all kinds, including issues of bullying and suicide, at telefonseelsorge.de and by phone at 0800 1110111. The number for “Nummer gegen Kummer” (child and youth helpline) is 116 111. In Austria, there are also free support services, including specifically for children, the child emergency number 0800 567 567, and Rat auf Draht at 147. The same phone number in Switzerland leads to Pro Juventute.

(mki)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.