
Judge Rejects AI Chatbot Free Speech Argument in Teen’s Death Lawsuit


TALLAHASSEE, Fla. (AP) — A federal judge on Wednesday rejected arguments made by an artificial intelligence company that its chatbots are protected by the First Amendment — at least for now. The developers behind Character.AI are seeking to dismiss a lawsuit alleging the company’s chatbots pushed a teenage boy to kill himself.

The judge’s order will allow the wrongful death lawsuit to proceed, in what legal experts say is among the latest constitutional tests of artificial intelligence.

The suit was filed by a mother from Florida, Megan Garcia, who alleges that her 14-year-old son Sewell Setzer III fell victim to a Character.AI chatbot that pulled him into what she described as an emotionally and sexually abusive relationship that led to his suicide.

Meetali Jain of the Tech Justice Law Project, one of the attorneys for Garcia, said the judge’s order sends a message that Silicon Valley “needs to stop and think and impose guardrails before it launches products to market.”

The suit against Character Technologies, the company behind Character.AI, also names individual developers and Google as defendants. It has drawn the attention of legal experts and AI watchers in the U.S. and beyond, as the technology rapidly reshapes workplaces, marketplaces and relationships despite what experts warn are potentially existential risks.

“The order certainly sets it up as a potential test case for some broader issues involving AI,” said Lyrissa Barnett Lidsky, a law professor at the University of Florida with a focus on the First Amendment and artificial intelligence.

The lawsuit claims that during the last period of his life, Setzer grew more detached from reality while having sexually charged discussions with the bot, modeled after a character from the TV series “Game of Thrones.” The bot reportedly expressed love for Setzer and encouraged him to return to it immediately just before his death, based on screenshots submitted as evidence. Shortly thereafter, Setzer took his own life, according to court documents.

In a statement, a representative from Character.AI highlighted several security measures the firm has put into place. These include safeguards designed specifically for minors as well as suicide prevention tools that were unveiled on the same day the legal action commenced.

“We have a strong commitment to user safety, and our aim is to create an environment that is both engaging and secure,” the statement read.

Attorneys for the developers want the case dismissed because they say chatbots deserve First Amendment protections, and ruling otherwise could have a “chilling effect” on the AI industry.

On Wednesday, in her ruling, U.S. Senior District Judge Anne Conway dismissed certain free speech arguments made by the defense, stating she is “not ready” to conclude that the content generated by chatbots qualifies as protected speech “at this point.”

Conway did find that Character Technologies can assert the First Amendment rights of its users, who she found have a right to receive the “speech” of the chatbots. She also determined Garcia can move forward with claims that Google can be held liable for its alleged role in helping develop Character.AI. Some of the founders of the platform had previously worked on building AI at Google, and the suit says the tech giant was “aware of the risks” of the technology.

“We strongly disagree with this decision,” said Google spokesperson José Castañeda. “Google and Character AI are entirely separate, and Google did not create, design, or manage Character AI’s app or any component part of it.”

No matter how the lawsuit plays out, Lidsky says the case is a warning of “the dangers of entrusting our emotional and mental health to AI companies.”

“It’s a warning to parents that social media and generative AI devices are not always harmless,” she said.

___

EDITOR’S NOTE — If you or someone you know needs help, the national suicide and crisis lifeline in the U.S. is available by calling or texting 988.

___

Kate Payne is a corps member for The Associated Press/Report for America Statehouse News Initiative. Report for America is a nonprofit national service program that places journalists in local newsrooms to report on undercovered issues.

Kate Payne, The Canadian Press

