A 14-year-old boy ended his life following conversations with a Character.AI chatbot, and his family has now decided to sue the company. The teenager, Sewell Setzer III, died by suicide in 2023 after he began exchanging messages with a Character.AI chatbot modeled on a character from Game of Thrones.
Now the boy's mother, Megan Garcia, is taking legal action against the firm, claiming it released a product designed for social engagement without sufficient safeguards for users who might be more susceptible to harm.
Character.AI has filed a motion to dismiss the case, arguing that the chatbot's output is protected speech under the First Amendment, which guarantees freedom of expression in the United States.
However, Judge Anne Conway of Florida (USA) found that the company failed to demonstrate how the language produced by the chatbot qualifies as constitutionally protected speech.
Noam Shazeer and Daniel De Freitas, the founders of Character.AI, were named in the lawsuit, along with Google, which is accused of having a technical link with the startup. Google denied direct involvement in the app and said it plans to contest the ruling.
This case could set a precedent for the legal accountability of large tech firms that build artificial intelligence systems. Depending on the decision, courts might impose tighter restrictions on language-based systems, an issue that has been debated since generative AI tools first came into widespread use.
Photos and videos: Reproduced / c.ai. This material was generated using AI assistance and checked by our editorial staff.
The post "An AI tool is facing legal action following a youth's suicide; get the full story here." appeared first on newsinpo.site.