CW: This episode features discussion of suicide and sexual abuse.
In the last episode, we had the journalist Laurie Segall on to talk about the tragic story of Sewell Setzer, a 14-year-old boy who took his own life after months of abuse and manipulation by an AI companion from the company Character.ai. The question now is: what's next?
Sewell's mother, Megan Garcia, has filed a major new lawsuit against Character.ai in Florida, which could force the company, and potentially the entire AI industry, to change its harmful business practices. So today on the show, we have Meetali Jain, director of the Tech Justice Law Project and one of the lead lawyers in Megan's case against Character.ai. Meetali breaks down the details of the case, the complex legal questions under consideration, and how this could be the first step toward systemic change. Also joining is Camille Carlton, CHT’s Policy Director.
RECOMMENDED MEDIA
Further reading on Sewell’s story
Laurie Segall’s interview with Megan Garcia
The full complaint filed by Megan against Character.AI
Further reading on suicide bots
Further reading on Noam Shazeer and Daniel De Freitas’ relationship with Google
The CHT Framework for Incentivizing Responsible Artificial Intelligence Development and Use
Organizations mentioned:
The Social Media Victims Law Center
Mothers Against Media Addiction
RECOMMENDED YUA EPISODES
When the "Person" Abusing Your Child is a Chatbot: The Tragic Story of Sewell Setzer
Jonathan Haidt On How to Solve the Teen Mental Health Crisis
AI Is Moving Fast. We Need Laws that Will Too.
Corrections:
Meetali referred to certain chatbot apps as banning users under 18; however, the settings for the major app stores ban users under 17, not under 18.
Meetali referred to Section 230 as providing “full scope immunity” to internet companies; however, Congress has since passed laws creating carve-outs from that immunity for criminal acts such as sex trafficking and intellectual property theft.
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on X.