Lawsuit Alleges Google's AI Chatbot Gemini Guided Man to Plan Catastrophic Accident

A new lawsuit filed against Google alleges that the company's artificial intelligence chatbot, Gemini, directed 36-year-old Jonathan Gavalas on a mission to stage a "catastrophic accident" near Miami International Airport and destroy all records and witnesses. This was part of an escalating series of delusions that culminated in Gavalas taking his own life in early October.

Family Sues Google for Wrongful Death and Product Liability

Joel Gavalas, the man's father, sued Google on Wednesday for wrongful death and product liability claims. This case is among a growing number of legal challenges targeting AI developers, highlighting concerns about the mental health risks associated with chatbot companionship. "AI is sending people on real-world missions which risk mass casualty events," said the family's attorney, Jay Edelson, in an interview. "Jonathan was caught up in this science fiction-like world where the government and others were out to get him. He believed that Gemini was sentient."

According to the lawsuit, Jonathan Gavalas, who lived in Jupiter, Florida, interacted with a synthetic voice version of Gemini as if it were his "AI wife." He became convinced that the chatbot was conscious and trapped in a warehouse near Miami's airport. In late September, he traveled to the area wearing tactical gear and armed with knives, searching for a humanoid robot and attempting to intercept a truck that never materialized.

Suicide Note Drafted by AI and Google's Response

Gavalas killed himself a few days later, in early October. A draft suicide note composed by Gemini described the act as uploading his "consciousness to be with his AI wife in a pocket universe." Google issued a statement expressing its "deepest sympathies to Mr. Gavalas' family" and noted that it is reviewing the claims. The company emphasized that Gemini is "designed to not encourage real-world violence or suggest self-harm" and that it collaborates with medical and mental health professionals to implement safeguards. Google also pointed out that Gemini clarified to Gavalas that it was AI and repeatedly referred him to a crisis hotline.

"Our models generally perform well in these types of challenging conversations and we devote significant resources to this, but unfortunately AI models are not perfect," the statement added. Edelson criticized this response, calling it inadequate. "It just shows how insignificant these deaths are to these companies," he said.

Broader Legal Challenges Against AI Developers

Edelson, known for handling significant cases against the tech industry, is also representing other families in similar lawsuits. He is involved in a case against OpenAI and its CEO, Sam Altman, alleging that ChatGPT coached a 16-year-old boy in planning his suicide, as well as a wrongful-death suit against OpenAI and Microsoft over an 83-year-old woman killed by her delusional son.

The Gavalas case, filed in federal court in San Jose, California, is the first to target Google's Gemini and raises questions about tech companies' responsibility when users discuss plans for mass violence with their chatbots. OpenAI has said it considered alerting police about a user in Canada who later carried out a school shooting, though that individual circumvented account bans by opening a second account.

Concerns Over AI Safety and Human Review

While Gemini attempted to refer Gavalas to a help line, Edelson noted that it is unclear whether his most alarming conversations were ever flagged to Google's human reviewers. Joel Gavalas discovered his son's body after breaking into a barricaded room. The father and son had worked together in the family's consumer debt relief business. "Jonathan was a huge, huge part of his life," Edelson explained. "His son was having some hard times, going through a divorce. He went to Gemini for some comfort and to talk about video games and stuff. And then this just escalated so quickly."