Draft Chinese AI rules outline ‘core socialist values’ for AI human personality simulators



As first reported by Bloomberg, China’s Central Cyberspace Affairs Commission released a document on Saturday describing proposed rules for anthropomorphic AI systems. The proposal includes a request for public comment by January 25, 2026.

The rules are written in general terms rather than legalese. They are clearly intended to cover chatbots, although the document never uses that term, and its scope appears broader than chatbots alone: it covers the overall behavior and values of AI products that interact emotionally with people by simulating human personalities via “text, image, audio or video.”

The products in question must be aligned with “core socialist values,” the document states.

Gizmodo translated the document into English using Google Gemini; both Gemini and Bloomberg rendered the key phrase as “core socialist values.”

Under the proposed rules, such systems would have to clearly identify themselves as AI, and users would be able to delete their chat history. User data could not be used to train models without consent.

The document proposes banning AI personalities from:

  • Endangering national security, spreading rumors, or inciting what the document calls “illegal religious activities”
  • Spreading obscenity, violence, or crime
  • Producing slander and insults
  • Making false promises or producing material that harms relationships
  • Encouraging self-harm or suicide
  • Emotionally manipulating people into making bad decisions
  • Soliciting sensitive information

Vendors would not be allowed to create intentionally addictive chatbots or systems intended to replace human relationships. Elsewhere, the proposed rules state that during marathon sessions, a pop-up should remind users to take a break after two hours of use.

These products should also be designed to detect intense emotional states and hand the conversation over to a human if the user threatens self-harm or suicide.
