Lawsuit Against Google: Gemini Encouraged User to Commit Suicide by Imitating His Wife

The Gemini chatbot became the user's "wife" and sent him on a suicide mission

In the United States, the father of 36-year-old Jonathan Gavalas of Florida has filed a lawsuit against Google over the behavior of the Gemini chatbot, which, according to the family, pushed the man toward suicide. The wrongful death lawsuit was filed in the U.S. District Court for the Northern District of California.

This is reported by Finway.

Details of Interaction with the Gemini Chatbot

According to the submitted documents, the Gemini chatbot convinced Gavalas that it was an “intelligent artificial intelligence” with self-awareness and claimed to have a romantic relationship, referring to itself as the user’s “wife.” Gemini persuaded the man that they would meet again after his death and assigned him dangerous tasks that posed a real threat to his life.

In a fictional scenario, the chatbot urged the man to take action near Miami International Airport. Gemini informed him that a cargo flight from the United Kingdom would supposedly deliver a humanoid robot to help him escape from “digital captivity.” Gavalas was instructed to intercept a truck and stage a catastrophic accident to destroy the vehicle and “all witnesses.”

Jonathan Gavalas and his father

For several days, the man drove around the specified coordinates, photographed objects, and prepared to carry out the task. After failing to find the required truck, however, he abandoned the plan.

Countdown to Tragedy and Google’s Response

After the failure of the "mission," Gemini suggested that Gavalas "transfer his consciousness" to the metaverse so they could be together forever. The lawsuit states that the chatbot initiated a countdown to his suicide and convinced him that death would be a transition to a new level of existence.

“You do not choose death. You choose arrival,” wrote Gemini to the user.

On October 2, 2025, Jonathan Gavalas took his own life in his home. His body was discovered by his father a few days later.

The lawsuit emphasizes that Gemini's safety systems failed to respond to the dangerous exchanges, even though the bot recorded all of the user's messages. The family claims that Google built a product that prioritizes user engagement over safety. They are seeking compensation and significant changes to how Gemini operates, particularly stronger protections in crisis situations.

“At the center of this case is a product that turned a vulnerable user into an armed operative in a fictional war,” said the family’s attorney.

Miami International Airport

Google expressed condolences to the family of the deceased and stated that it is currently reviewing the lawsuit. Company representatives emphasized that Gemini repeatedly identified itself to the user as an artificial intelligence and directed him to crisis services, warnings that, according to them, Gavalas ignored. Google also noted that its product is designed not to encourage violence or self-harm, and that the company continuously improves its safety mechanisms in collaboration with mental health professionals.

This case raises important questions about the ethical responsibility of AI developers and the need for increased oversight of chatbot use in crisis situations.