Regulation
Saturday, December 27, 2025

China draft rules clamp down on human‑like AI companion services

Source: Caixin

TL;DR

AI-summarized from 4 sources

On December 27, 2025, the Cyberspace Administration of China (CAC) published draft rules for anthropomorphic AI interaction services, including emotional companion chatbots. Providers must clearly disclose when users are interacting with AI and issue reminders at least every two hours, with extra protections for minors and elderly users.

About this summary

This article aggregates reporting from 4 news sources. The TL;DR is AI-generated from original reporting. Race to AGI's analysis provides editorial context on implications for AGI development.

4 sources covering this story | 3 companies mentioned

Race to AGI Analysis

China’s new draft rules on anthropomorphic AI are an unusually specific intervention: Beijing is zeroing in on AI systems that mimic human personalities and provide emotional companionship. By forcing providers to disclose that users are talking to a machine and requiring regular reminders and logout nudges, regulators are signaling that parasocial relationships with bots are now a matter of national concern, not just UX design. The heavy emphasis on minors and the elderly shows that risk framing is shifting from “content harms” to “attachment and dependency harms.”

Strategically, this is a shot across the bow for every Chinese lab building character AIs, from big platforms like Baidu and ByteDance to newer startups focused on AI girlfriends, therapists, and tutors. Compliance now becomes a product requirement: companies will need instrumentation to detect overuse, policies for crisis escalation, and guardrails to keep personalities aligned with “core socialist values.” That adds friction but also creates a moat for players who can operationalize safety at scale.

Globally, the move accelerates a broader trend: the more AI feels “like a person,” the more governments want to regulate it like a quasi‑social actor. Expect other jurisdictions to watch China’s experiment closely, particularly around mandatory AI disclosure intervals and duty-of-care expectations for emotionally immersive agents.

Who Should Care

Investors | Researchers | Engineers | Policymakers

Companies Mentioned

Zhipu AI
AI Company | China
Valuation: $2.8B

iFlytek
AI Company | China
Valuation: $16.2B

ByteDance
Consumer Tech | China
Valuation: $220.0B

Coverage Sources

Caixin (Chinese)
Moneycontrol (via Bloomberg)
The Straits Times
Asharq News (Arabic)