After teen death lawsuits, Character.AI will restrict chats for under-18 users

On Wednesday, Character.AI announced it will bar anyone under the age of 18 from open-ended chats with its AI characters starting on November 25, implementing one of the most restrictive age policies yet among AI chatbot platforms. The company faces multiple lawsuits from families who say its chatbots contributed to teenagers' deaths by suicide.

Over the next month, Character.AI says it will wind down chatbot use among minors by identifying underage users and placing a two-hour daily limit on their chatbot access. The company plans to detect underage users based on conversations and interactions on the platform, as well as information from connected social media accounts. On November 25, those users will no longer be able to create or talk to chatbots, though they will still be able to read their previous conversations. The company said it is working to build alternative features for users under 18, such as the ability to create videos, stories, and streams with AI characters.

Character.AI CEO Karandeep Anand told The New York Times that the company wants to set an example for the industry. β€œWe’re making a very bold step to say for teen users, chatbots are not the way for entertainment, but there are much better ways to serve them,” Anand said. The company also plans to establish an AI safety lab.
