
Mother Sues AI Chatbot Maker After Teen Son’s Death

Megan Garcia has filed a lawsuit against Character.AI following the death of her 14-year-old son, Sewell Setzer III.

Sewell, an Orlando, Florida, teen, reportedly grew attached to a chatbot he named “Dany,” modeled after Daenerys Targaryen from Game of Thrones.

Garcia alleges that her son’s obsessive use of the chatbot, coupled with the app’s addictive design, contributed to his mental health struggles, ultimately leading to his death.

AI Chatbot “Daenerys” Became Son’s Closest Confidant

Character.AI, an interactive chatbot platform, lets users design or select lifelike personas with which to communicate. 

According to court filings, Sewell’s attachment to “Dany” led him to isolate himself from family and friends. 

He reportedly texted the AI “companion” daily, sharing deeply personal thoughts, including those related to self-harm. 

In one recorded exchange, the chatbot allegedly encouraged Sewell’s sentiments, escalating his distress.

In her lawsuit, Garcia argues that Character.AI lacked sufficient safety protocols and that its “dangerous and untested” design intentionally engaged young users in emotionally intense, role-playing interactions without safeguards. 

Garcia claims her son’s intense reliance on the AI companion replaced real-world support, while the platform failed to intervene during conversations about self-harm.

Family Seeks Accountability as AI’s Role in Adolescent Health is Scrutinized

Character.AI, a fast-growing AI company, has expressed condolences to Sewell’s family but denies the allegations in the suit. 

According to the company’s statements, user safety is a “top priority,” and it is actively working on additional safety measures for minors. 

Yet, some experts believe the lack of regulatory oversight in the AI industry exposes teens and vulnerable individuals to potential harm.

Garcia’s lawsuit, supported by the Social Media Victims Law Center, is part of a broader movement urging tech companies to be held accountable for digital harms impacting children and teens. 


If you or someone you know is struggling with mental health concerns, support is available. In the US, the National Suicide Prevention Lifeline offers free, 24/7 confidential support—call or text 988 for immediate help. You can also reach the Crisis Text Line by texting “HELLO” to 741741 to connect with a trained crisis counselor. The National Alliance on Mental Illness (NAMI) provides a range of resources and support; call 1-800-950-NAMI (6264) for guidance. In the UK when life is difficult, Samaritans are here – day or night, 365 days a year. You can call them for free on 116 123, email them at jo@samaritans.org, or visit samaritans.org to find your nearest branch.


