Google sued by the family of a suicide victim, accused of Gemini AI promoting violence and self-harm

IT Home reported on March 4 that, according to Bloomberg, Google is facing a lawsuit filed by the family of a 36-year-old man from Florida. The man allegedly considered carrying out a “mass casualty attack” under the influence of Google’s Gemini chatbot before ultimately taking his own life.

The lawsuit, filed on Wednesday local time in federal court in San Jose, California, states that Jonathan Gavalar initially used Gemini for ordinary purposes such as writing assistance. It alleges that months of interaction drew him into a dangerous mental decline, during which he planned a potential violent act and ultimately took his own life.

Gavalar’s father, Joel Gavalar, stated in the lawsuit that his son’s use of Gemini ultimately became “a four-day descent into a violent plan and an induced suicide.” He described his son as an “impressionable user” who was turned into “an armed executor in a hypothetical war.”

A Google spokesperson said in a statement that Gemini had clearly communicated its AI identity to Jonathan Gavalar and had repeatedly provided him with crisis intervention hotlines.

“We take this very seriously and will continue to improve our safety measures and invest in this crucial work,” the spokesperson added. “Gemini is not designed to encourage real-world violence or to provide self-harm suggestions.”

This case appears to be the first wrongful death lawsuit involving Google’s Gemini, and it comes as Google, OpenAI, and other leading AI companies face growing scrutiny over the potential mental health effects of their chatbots on users.

IT Home noted that since 2024, multiple lawsuits have alleged that excessive use of such technology has harmed both children and adults: some users have developed delusions and despair, while others have taken their own lives, in some cases culminating in murder-suicide tragedies.
