When the teen expressed his suicidal thoughts to his favorite bot, Character.AI ‘made things worse,’ a lawsuit filed by his mother says
Sewell Setzer III had professed his love for the chatbot he often interacted with, his mother, Megan Garcia, says in a civil lawsuit
Sewell Setzer III died by suicide after a monthslong, “hypersexualized” relationship with an AI character, his mother said in a federal lawsuit.
A grieving mother claims an AI chatbot not only convinced her teen son to commit suicide but pushed him toward the act when he expressed hesitance. Florida mom Megan Garcia's lawsuit against the chatbot firm Character.AI concerns the tragic death of her son.
A Florida teen named Sewell Setzer III died by suicide after forming an intense emotional connection to a Character.AI chatbot, The New York Times reports. Per the report, Setzer, who was 14, had grown close to a chatbot designed to emulate the "Game of Thrones" character Daenerys Targaryen.
The mother of 14-year-old Sewell Setzer III is suing the tech company that created a 'Game of Thrones' AI chatbot she believes drove him to suicide.
A Florida mother has sued artificial intelligence chatbot startup Character.AI, accusing it of causing her 14-year-old son's suicide in February and saying he became addicted to the company's service and deeply attached to a chatbot it created.
A lawsuit against Character.ai has been filed in the suicide death of a Florida teenager who allegedly became emotionally attached to a Game of Thrones chatbot.
In a lawsuit, a mother blames Character.AI for her son Sewell Setzer's suicide, asserting his addiction to a chatbot influenced his mental state. She demands action to protect children and halt the alleged unauthorized use of her son's data by the company.
A lawsuit claims that Character.AI’s founders launched a dangerous product that the company advertised as safe for use by kids, without warning them or their parents of possible risks.
Megan Garcia said her son chatted continuously with the bots provided by Character.ai in the months before his death on February 28, 2024, which came "seconds" after his final interaction with one of them.
Character.ai is facing a lawsuit after a fourteen-year-old boy from Florida who had become obsessed with one of its AI chatbots died by suicide.