Xiaomi's new open-source AI model takes on OpenAI, DeepSeek
The model is available globally

Dec 17, 2025, 07:57 pm

What's the story

Xiaomi has launched an open-source artificial intelligence (AI) model, MiMo-V2-Flash. The new system is designed to compete with the latest offerings from DeepSeek, Moonshot AI, Anthropic, and OpenAI. The model is available globally on Xiaomi's developer platform MiMo Studio, on Hugging Face, and through its API platform. According to Xiaomi, MiMo-V2-Flash excels in reasoning, coding, and agentic scenarios while also serving as a capable general-purpose assistant for daily tasks.

AGI advancement

MiMo-V2-Flash: A step toward artificial general intelligence

Chinese AI expert Luo Fuli, who recently joined Xiaomi's MiMo team from DeepSeek, said in a post on X that MiMo-V2-Flash is "step 2 on our AGI road map." AGI, or artificial general intelligence, refers to a theoretical form of AI that can match or exceed human cognitive abilities. The remark positions the new model as part of a longer-term research roadmap rather than a one-off release.

Efficiency claims

A cost-effective high-performance AI model

Xiaomi has claimed that MiMo-V2-Flash was built for maximum efficiency. The company said it offers "blazing-fast inference at 150 tokens per second," with an ultra-low cost of $0.10 per million input tokens and $0.30 per million output tokens. This makes it one of the most cost-effective high-performance models on the market today, according to Xiaomi.
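To put the quoted rates in perspective, here is a minimal sketch of how a per-request cost works out under Xiaomi's stated pricing. The function name and example token counts are illustrative assumptions, not part of any official SDK, and actual billing may vary by platform and region.

```python
def mimo_v2_flash_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate a request's cost in USD from the rates quoted in the article:
    $0.10 per million input tokens, $0.30 per million output tokens.
    Illustrative only; real billing may differ."""
    INPUT_RATE = 0.10 / 1_000_000   # USD per input token
    OUTPUT_RATE = 0.30 / 1_000_000  # USD per output token
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# A hypothetical chat turn: 2,000 input tokens, 500 output tokens
print(f"${mimo_v2_flash_cost(2_000, 500):.6f}")  # a fraction of a cent
```

At these rates, even a million such requests would cost only a few hundred dollars, which is the basis of Xiaomi's cost-effectiveness claim.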

Benchmark achievements

MiMo-V2-Flash's performance surpasses competitors

Xiaomi has claimed that MiMo-V2-Flash matches the performance of Moonshot AI's Kimi K2 Thinking and DeepSeek V3.2 Thinking on most reasoning benchmarks. The company also said its latest model outperforms Kimi K2 Thinking in long-context evaluations. On agentic tasks, MiMo-V2-Flash scored 73.4% on SWE-Bench Verified, beating all open-source competitors and approaching the performance of OpenAI's GPT-5-High.