
On December 17, 2025, Xiaomi unveiled MiMo‑V2‑Flash, its new open large model, at its “Human‑Car‑Home” ecosystem partner conference in Beijing. The 309‑billion‑parameter model activates only 15 billion parameters at inference, is released under the MIT license, and, according to Xiaomi, ranks among the top open models on code and agent benchmarks.
This article aggregates reporting from a single news source. The TL;DR is AI-generated from the original reporting; Race to AGI's analysis provides editorial context on the implications for AGI development.
MiMo‑V2‑Flash shows how aggressively Chinese consumer electronics players are moving up the stack into frontier‑scale, open‑licensed models. A 309B‑parameter architecture with only 15B active parameters is a clear attempt to reconcile huge capacity with inference efficiency—exactly the trade‑off that will matter for AI‑native devices across phones, cars and homes. By shipping it under an MIT license and pitching strong code and agent performance, Xiaomi is courting the global open‑source developer community, not just its own ecosystem.
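To make that capacity‑versus‑cost trade‑off concrete, here is a rough, illustrative calculation. Only the 309B total and 15B active parameter counts come from Xiaomi's announcement; the rest is our own back‑of‑envelope arithmetic using the common approximation of roughly two FLOPs per active parameter per generated token.

```python
# Illustrative back-of-envelope arithmetic (assumptions, not figures from the source):
# compare per-token inference compute for dense activation vs. the sparse activation
# Xiaomi describes, using the ~2 FLOPs per active parameter per token rule of thumb.

TOTAL_PARAMS = 309e9   # total parameters reported for MiMo-V2-Flash
ACTIVE_PARAMS = 15e9   # parameters activated at inference, per the announcement

active_fraction = ACTIVE_PARAMS / TOTAL_PARAMS
dense_flops_per_token = 2 * TOTAL_PARAMS    # if every parameter were used per token
sparse_flops_per_token = 2 * ACTIVE_PARAMS  # with only the active subset

print(f"Active fraction: {active_fraction:.1%}")                        # ~4.9%
print(f"Dense compute per token:  {dense_flops_per_token:.2e} FLOPs")   # ~6.2e11
print(f"Sparse compute per token: {sparse_flops_per_token:.2e} FLOPs")  # ~3.0e10
print(f"Rough compute saving: {dense_flops_per_token / sparse_flops_per_token:.0f}x")
```

On these assumptions, activating roughly 5% of the parameters per token cuts per‑token compute by about a factor of twenty versus a dense model of the same total size, which is why this design is attractive for latency‑ and power‑constrained devices.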
Strategically, this positions Xiaomi to embed a homegrown reasoning engine tightly into its “Human‑Car‑Home” platform, from smartphones to EVs to smart appliances. If MiMo‑V2‑Flash really can outperform larger peers such as DeepSeek variants on agent benchmarks while running faster, it becomes a compelling brain for embodied assistants and multi‑device orchestration. That is exactly the kind of distributed, sensor‑rich environment in which lessons about embodied and agentic AI will be learned.
For the AGI race, the message is that open, high‑end models are no longer the domain of Western labs and a handful of Chinese internet giants. Hardware OEMs with deep integration into the physical world are now building their own frontier‑class models, which could accelerate experimentation with real‑world agents and push the frontier in directions cloud‑only players don’t prioritize.



