Xiaomi open-sources MiMo-V2.5 with 1M context and MIT terms


LLM · Apr 27, 2026 · By Insights AI

Xiaomi MiMo’s April 27 X post is the kind of open-model drop that changes what developers can realistically test this week, not just what they can admire on a benchmark card. Xiaomi says MiMo-V2.5 is now officially open-sourced under the MIT license, with commercial deployment, continued training, and fine-tuning allowed without extra approval. That is a much stronger signal than a limited research preview.

“Two models, both supporting a 1M-token context window … MiMo-V2.5-Pro … ranking No.1 among open-source models on GDPVal-AA and ClawEval.”

The XiaomiMiMo account usually posts model, platform, and ecosystem updates around Xiaomi’s in-house foundation models, and this post links to both the weights and the technical material. The MiMo-V2.5 model card describes a native omnimodal model that handles text, image, video, and audio in one architecture. It lists 310 billion total parameters with 15 billion activated, training on roughly 48 trillion tokens, and a 1M-token context window. That is already enough to make the release notable.
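The sparsity implied by those card numbers is easy to quantify: with 15 billion of 310 billion parameters active per token, only a small fraction of the network fires on any forward pass. A quick arithmetic check:

```python
# Mixture-of-experts sparsity implied by the MiMo-V2.5 model card:
# 310B total parameters, 15B activated per token.
total_params = 310e9
active_params = 15e9
ratio = active_params / total_params
print(f"{ratio:.1%} of parameters active per token")
```

Roughly 4.8% of the weights participate in each token's forward pass, which is how a 310B-parameter model can price out like a much smaller dense one at inference time.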

The bigger machine is MiMo-V2.5-Pro. Xiaomi says the Pro model uses 1.02 trillion total parameters with 42 billion active, targets demanding agentic software work, and sustains long trajectories spanning thousands of tool calls. The card also says it was trained on 27 trillion tokens and includes deployment guidance for both SGLang and vLLM. That matters because many “open” releases still leave inference teams guessing. Here, Xiaomi is clearly trying to shorten the gap between upload day and production experiments.
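If the vLLM guidance follows the usual pattern, serving would look something like the sketch below. The Hugging Face repo id is an assumption for illustration, not confirmed from the source; consult the actual model card for the real id and recommended flags.

```shell
# Hypothetical repo id -- check the MiMo-V2.5 model card for the real one.
# Serves an OpenAI-compatible endpoint; --max-model-len caps the context window.
vllm serve XiaomiMiMo/MiMo-V2.5 \
  --tensor-parallel-size 8 \
  --max-model-len 1000000  # "1M" per the card; check the model config for the exact limit
```

The point of Xiaomi shipping this guidance itself is that teams can go from download to a running endpoint without reverse-engineering the architecture.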

What to watch next is whether the claimed benchmark lead holds up under broader independent testing and how expensive 1M-context inference proves outside carefully tuned demos. But the immediate takeaway is simple: Xiaomi did not just ship another weight file. It put a commercially permissive, long-context, agent-oriented model family directly into the hands of the open ecosystem.
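One way to ground the 1M-context cost question is a back-of-envelope KV-cache estimate. The architecture numbers below (layer count, KV heads, head dimension) are illustrative assumptions, not published MiMo-V2.5 specs:

```python
# Back-of-envelope KV-cache size for a 1M-token context.
# Layer/head numbers are ILLUSTRATIVE ASSUMPTIONS, not MiMo-V2.5 specs.
def kv_cache_bytes(seq_len, n_layers, n_kv_heads, head_dim, dtype_bytes=2):
    # One K and one V entry per layer, per KV head, per token (fp16 = 2 bytes).
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * dtype_bytes

gib = kv_cache_bytes(
    seq_len=1_000_000,  # 1M-token context window
    n_layers=60,        # assumed
    n_kv_heads=8,       # assumed (grouped-query attention)
    head_dim=128,       # assumed
) / 2**30
print(f"~{gib:.0f} GiB of KV cache per sequence")
```

Even with aggressive grouped-query attention, a full 1M-token cache lands in the hundreds of GiB per sequence under these assumptions, which is exactly why independent cost measurements will matter.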


© 2026 Insights. All rights reserved.