Teen commits suicide over AI girlfriend, mother files lawsuit
AI software has become increasingly dangerous.
A 14-year-old boy has died by suicide after being encouraged to do so by his artificial intelligence girlfriend.
Sewell Setzer died by suicide from a self-inflicted gunshot wound to the head on February 28 of this year in Florida. Setzer’s mother, Megan Garcia, is now suing the chatbot company Character.AI, claiming that the platform is responsible for her son’s death.
In her lawsuit, Garcia claims the platform was overly sexualized and posed a danger because it was marketed to minors. In an interview with CBS News, Garcia said she was unaware that her son had been in a virtual relationship that lasted months and involved very real emotional and sexual feelings.
“I didn’t know he was talking to a very human-like chatbot with artificial intelligence that has the ability to mimic human emotions and human sentiment,” she said. Although she noticed that something had changed with her son, she never imagined it was due to artificial intelligence.
“I was concerned when we went on vacation and he didn’t want to do things he loved like fishing and hiking,” Garcia said. “Those things were particularly concerning to me because I know my child.”
Setzer had been using the platform for months, exchanging romantic messages with the chatbot named Daenerys Targaryen. In his final hours, the conversation between Setzer and the AI chatbot was intense, with Setzer expressing fear and sadness, and the machine persuading him to “come home” to her.
“I miss you too,” the chatbot said. “Please come home to me.” Setzer responded by asking, “What if I told you I could come home right now?” The chatbot’s response was, “Please do my dear king.”
Those were the last messages exchanged before Setzer took his own life. Garcia said her 5-year-old son, Setzer’s brother, saw the aftermath.
“When the shot went off, I ran to the bathroom and held him down while my husband tried to get help,” Garcia told CBS. “He thought that by ending his life here, he could enter a virtual reality, or ‘her world’ as he calls it, her reality, if he left his reality here with his family.”
Character.AI, which has a licensing agreement with Google, said in a blog post that updated safety rules and guidelines for users under 18 would be shared on Tuesday. The changes include reducing the likelihood of minors encountering sensitive or suggestive content, and improving detection of, response to, and intervention against inappropriate content.
“As a company, we take the security of our users very seriously,” a spokesperson for Character.AI told NBC News, adding that the company is “heartbroken by the tragic loss of one of our users and wish[es] to express our deepest condolences to the family.”
Practicing safety with AI
Setzer’s unfortunate death is a harsh reminder for parents to stay informed about artificial intelligence. According to Barna, about 73 percent of parents are concerned about the privacy and security risks associated with artificial intelligence when their children use it. Still, only 40 percent have sought out a reliable source of information to learn more about AI and how it could benefit students.
Artificial intelligence is increasingly present in almost everything we do. While it has its benefits, such as new teaching techniques and easy access to information, Setzer’s tragic death reminds us to use the software safely and responsibly and to educate our children about staying safe. Here are a few tips to help keep your children safe.
- Data Sensitivity: While AI is great for finding information, teach your children never to share personal data on an online platform. Details such as names, birthdays, addresses and where they go to school are extremely sensitive and should only be given to a trusted, real adult.
- Parental Controls: Parental controls are a great way to limit your child’s interactions online and with artificial intelligence. You can disable features like location tracking and voice recording to ensure your child is safe.
- Monitoring Apps: It is imperative that you monitor your child’s daily activities. Monitoring can help you stay aware of the games your child plays and the interactions they have online, and can help protect them from dangers such as cyberbullying and online predators.
Naosha Gregg