China’s cyberspace regulator has unveiled draft rules aimed at tightening oversight of “digital human” technologies, requiring clear disclosure of virtual identities and restricting harmful uses of AI-generated personas.
The proposal, issued on Friday by the Cyberspace Administration of China, is open for public comment until May 6 and sets new compliance requirements for companies developing or deploying digital humans.
A central provision mandates that all virtual human content carry prominent “digital human” labels to prevent confusion between real and AI-generated identities.
The draft also moves to curb misuse in sensitive contexts, stating that platforms must not offer “virtual intimate relationships” to users under 18, a restriction designed to protect minors from exploitative engagement.
It further prohibits developers from using personal data without consent or deploying digital humans to bypass identity verification systems.
On content controls, the rules ban digital humans from being used to “endanger national security, incite subversion of state power, promote secession or undermine national unity.”
Service providers are also instructed to block or limit content that “is sexually suggestive, depicts horror or cruelty, or incites discrimination based on ethnicity or region,” according to the draft.
The guidelines additionally encourage platforms to intervene when users show signs of severe distress, advising providers to take “necessary measures” when individuals display suicidal or self-harming tendencies.
The move comes as China continues to push aggressive AI adoption across its economy under a new five-year policy blueprint, while simultaneously strengthening regulatory oversight to ensure alignment with state priorities and “socialist values.”
According to an analysis published on the regulator’s website, the rules are intended to close governance gaps in the emerging sector. The analysis added that “the governance of digital virtual humans is no longer merely an issue of industry norms.”
