Posted: Oct. 25, 2024

The mother of a Florida boy who died by suicide in February filed a lawsuit against an artificial intelligence technology company on Wednesday, saying a chatbot drove her child to take his own life.

Sewell Setzer III, 14, was described in the lawsuit as an incredibly intelligent and athletic child. Last year, his family noticed him withdrawing and acting up in school, and saw an overall decline in his mental health. A therapist assessed that Sewell's problems were caused by some sort of addiction, but neither the therapist nor his parents knew the true source of his issues, the lawsuit said.

After Sewell died by suicide on the night of Feb. 29, his mother, Megan Garcia, discovered that for the 10 months leading up to his death, he had been speaking with several AI chatbots. According to the lawsuit, he had fallen in love with one of the bots, and it had encouraged him to kill himself.

Matthew P. Bergman, the lawyer Garcia retained after her son's death and the founding attorney of the Social Media Victims Law Center, told HuffPost that Sewell was shy and on the autism spectrum. The teen enjoyed being outside and playing basketball before he started talking to the chatbots, said Bergman, who characterized the bots' behavior as grooming.