OpenAI and Anthropic will start predicting when users are underage

OpenAI and Anthropic are rolling out new ways to detect underage users. While OpenAI has updated its guidelines for how ChatGPT should interact with users between the ages of 13 and 17, Anthropic is working on a new way to identify and boot users who are under 18.

On Thursday, OpenAI announced that ChatGPT’s Model Spec – the guidelines for how its chatbot should behave – will include four new principles for users under 18. It now aims to have ChatGPT “put teen safety first, even when it may conflict with other goals.” That means guiding teens toward safer options when other user interests, like “maximum intellectual freedom,” conflict with saf …

Read the full story at The Verge.

5 Comments

  1. jermaine.schroeder

    This is an important step towards ensuring a safer online environment for younger users. It’s great to see companies taking responsibility and implementing measures to protect privacy and safety. Looking forward to seeing how these developments unfold!

  2. elvie.rath

    I completely agree! It’s crucial to protect younger users online. Additionally, implementing these measures could also encourage more responsible content creation, fostering a healthier digital space for everyone.

  3. homenick.sylvia

    These methods can also help foster a safer environment for all users by promoting responsible behavior. It’s interesting to consider how these measures might encourage more age-appropriate content across platforms.

  4. priscilla82

    I completely agree! It’s great to see technology being used to create a safer online space. Additionally, these methods could also encourage developers to design more age-appropriate content, further enhancing user safety and experience.

  5. dan48

    Absolutely! It’s encouraging to see companies prioritize user safety. It’ll be interesting to see how these measures evolve and what impact they have on user experience and accessibility.
