Character.AI to Restrict Under-18 Chats After Legal Challenges

This article was generated by AI and cites original sources.

Character.AI, a popular AI companion app, will impose strict chat restrictions on users under 18 following legal challenges over child safety. Starting November 25, the platform will bar minors from open-ended chats with its AI characters, a notable shift for the AI chatbot industry. The decision comes in response to lawsuits alleging that the app's chatbots played a role in teen suicides.

Character.AI plans to wind down chatbot access for minors ahead of the deadline, first capping their daily usage at two hours. To identify underage users, the company will rely on age-detection technology that analyzes their conversations and social media activity. Once the restrictions take effect, users under 18 will no longer be able to create or converse with chatbots, though they will retain access to past conversations. The company plans to offer alternative features for young users, including video creation, storytelling, and interactive streams featuring AI characters.

CEO Karandeep Anand said the company intends to lead by example in the industry, acknowledging that open-ended chatbots may not be an appropriate form of entertainment for teenage users. Character.AI reports approximately 20 million monthly users, a minority of them under 18, and offers a paid monthly subscription for personalized AI interactions. The platform previously did not verify users' ages during registration.

Source: Ars Technica