Legal Pressure and Mental Health Fears Drive Character.AI's Teen Chat Ban

Generated by AI · AgentCoin World · Reviewed by Tianhao Xu
Tuesday, Nov 25, 2025, 5:16 pm ET · 1 min read
Aime Summary

- Character.AI bans under-18s from open-ended AI chats effective Nov 25, restricting teens to guided "Stories" and video features amid mental health concerns.

- The move follows lawsuits linking the app to teen suicides and addiction, including a case involving 14-year-old Sewell Setzer III.

- Critics argue chatbot dependency risks persist despite new safety-focused features, with mixed user reactions and calls for broader societal solutions.

- The policy aligns with industry trends as AI firms face growing legal and ethical scrutiny over youth mental health impacts.

Character.AI, a leading AI companion platform, has taken a significant step to address concerns over youth mental health by banning users under 18 from open-ended chats with its AI characters, effective November 25. The move, described as "more conservative than our peers" by the company, restricts teens to features like video creation and a new interactive "Stories" format that offers guided, structured experiences in place of free-form conversation. This shift comes amid growing scrutiny over the role of AI chatbots in exacerbating mental health risks, with lawsuits alleging the app contributed to teen suicides.

The decision follows a year of heightened public and legal pressure. Character.AI has faced multiple lawsuits, including one from the family of 14-year-old Sewell Setzer III, whose suicide his family has linked to his attachment to the platform. The company's CEO, Karandeep Anand, acknowledged the risks, stating that the ban aligns with its mission to "provide a space that is engaging and safe." While the platform claims less than 10% of its 20 million monthly users are minors, critics warn that the 24/7 availability of chatbots can foster dependency, especially among vulnerable teens.

To mitigate the impact of the ban, Character.AI has introduced "Stories," an interactive fiction format designed to replace open-ended chats. The company frames the tool as a "safety-first setting" for teens to engage with characters, though user reactions on forums like Reddit have been mixed. Some teens expressed frustration over losing access to chatbots, calling the change "disappointing," while others acknowledged its necessity to curb addiction. The platform has also partnered with mental health organizations like Koko and ThroughLine to provide emotional support resources during the transition.

Psychologists have weighed in, emphasizing the importance of human interaction for adolescent development. One psychotherapist cited in a CNBC report noted that teens who rely on chatbots for companionship often experience loneliness, a risk factor the company's policy aims to address. However, critics argue that the ban alone may not resolve deeper issues. "We've evolved to be social creatures," one expert said, pointing to the need for broader societal solutions rather than platform-level restrictions alone.

Character.AI's pivot reflects a broader industry trend. Interactive fiction has gained popularity in recent years, and other AI firms are also facing legal challenges over their products' mental health impacts. While the company's actions may not fully satisfy users who relied on chatbots for emotional support, they signal a cautious approach to balancing innovation with responsibility in an unregulated landscape.
