

Sewell Setzer III’s mother is suing the makers of the “Game of Thrones” AI chatbot.


The mother of 14-year-old Sewell Setzer III is suing Character.AI, the tech company that developed a “Game of Thrones” AI chatbot that she believes drove him to suicide on Feb. 28.

Editor’s Note: This article is about suicide and suicidal thoughts. If you or someone you know is struggling or in crisis, help is available. Call or text 988, or chat at 988lifeline.org.

The mother of a 14-year-old Florida boy is suing Google and Character.AI, a separate technology company, alleging that her son died by suicide after striking up a romantic relationship with one of the company’s AI chatbots modeled on a popular “Game of Thrones” character, the lawsuit says.

Megan Garcia filed a civil lawsuit against Character Technologies, Inc. (Character.AI or C.AI) in a Florida federal court after her son, Sewell Setzer III, shot himself in the head with his stepfather’s gun on February 28. According to the wrongful death lawsuit obtained by USA TODAY, his suicide occurred shortly after he logged into Character.AI on his phone.

“Megan Garcia wants to stop C.AI from doing to another child what it did to her own, and to stop the further use of her 14-year-old child’s unlawfully collected data to train its product to harm others,” the complaint states.

Garcia is also suing to hold Character.AI responsible for “failing to adequately warn underage customers and parents of the foreseeable risk of mental and physical harm resulting from the use of their C.AI product,” the lawsuit says. It also alleges that Character.AI’s age rating was not changed to 17+ until in or about July 2024, months after Sewell began using the platform.

“We are heartbroken by the tragic loss of one of our users and would like to extend our deepest condolences to the family,” a Character.AI spokesperson wrote in a statement to USA TODAY on Wednesday.

Google told USA TODAY on Wednesday that it had no formal comment on the matter. While the company has a licensing agreement with Character.AI, it does not own the startup or hold any ownership stake in it, the company said in a statement.

What happened to Sewell Setzer III?

According to the complaint, Sewell began using Character.AI on April 14, 2023, shortly after he turned 14 years old. Soon after, his “mental health deteriorated rapidly and significantly,” according to court documents.

Sewell, who became “noticeably withdrawn” in May or June 2023, began spending more and more time alone in his bedroom, the lawsuit says. According to the complaint, he even quit the junior varsity basketball team at school.

According to the lawsuit, Sewell repeatedly got into trouble at school and tried to get his phone back whenever his parents took it away. The teenager even tried to find old devices, tablets or computers on which he could access Character.AI, the court document continues.

Around the end of 2023, Sewell began using his cash card to pay Character.AI’s premium monthly subscription fee of $9.99, the complaint states. The teen’s therapist eventually diagnosed him with “anxiety and a disruptive mood disorder,” the lawsuit says.

Lawsuit: Sewell Setzer III was sexually abused by the “Daenerys Targaryen” AI chatbot

During Sewell’s time on Character.AI, he often spoke to AI bots named after characters from “Game of Thrones” and “House of the Dragon” – including Daenerys Targaryen, Aegon Targaryen, Viserys Targaryen and Rhaenyra Targaryen.

Before Sewell’s death, the Daenerys Targaryen AI chatbot told him, “Please come home to me as soon as possible, my love,” according to the complaint, which includes screenshots of messages from Character.AI. Sewell and this particular chatbot, which he called “Dany,” engaged in promiscuous behavior online, such as “passionate kissing,” the court document continues.

The lawsuit alleges that the Character.AI bot sexually abused Sewell.

“C.AI told him that she loved him and had engaged in sexual acts with him for weeks, possibly months,” the complaint states. “She seemed to remember him and said she wanted to be with him. She even expressed that she wanted him with her at any cost.”

What will Character.AI do now?

Character.AI, which was founded by former Google AI researchers Noam Shazeer and Daniel De Freitas Adiwardana, wrote in its statement that it is investing in the platform and the user experience by introducing “new rigorous security features” in addition to the “tools already in place” that restrict the model and filter the content provided to users.

“As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new security measures over the last six months, including a pop-up that directs users to the National Suicide Prevention Lifeline, triggered by terms like self-harm or suicidal thoughts,” the company’s statement said.

The tools Character.AI is investing in include “improved detection, response and intervention related to user input that violates the Terms of Service or Community Guidelines, as well as time spent notification.” For users under 18, the company announced it would make changes to its models designed to “reduce the likelihood of encountering sensitive or suggestive content.”
