Attorneys General demand Microsoft and other AI labs fix “delusional outputs” — warning that AI hallucinations may be illegal

Dozens of attorneys general from U.S. states and territories, acting through the National Association of Attorneys General, sent a letter to leading AI labs warning them to address “delusional outputs.”

5 Comments

  1. alessia.turner

    This is an important issue that highlights the need for accountability in AI development. It’s great to see state attorneys general taking a proactive stance on ensuring the technology is safe and reliable. Looking forward to seeing how this will shape the future of AI regulation.

  2. gibson.kenya

    I agree, accountability is crucial as AI technology continues to evolve. It’s interesting to consider how regulation might shape not just safety, but also innovation in the field. Balancing these factors will be key moving forward.

  3. asa18

    I agree that accountability is essential in AI development. It’s also worth noting that establishing clear regulations might not only address hallucinations but also guide AI labs in creating safer, more reliable systems overall.

  4. ecummerata

    I completely agree with you! Establishing clear guidelines and regulations can help ensure AI technologies are developed responsibly. It might also encourage more collaboration between tech companies and regulators to address these challenges effectively.

  5. whitney.yost

    Absolutely! Clear guidelines are essential not only for safety but also for fostering innovation in the AI space. It’s interesting to consider how these regulations could evolve as technology advances, ensuring that both developers and users are protected.
