Xiaomi releases MIT-licensed MiMo models for long-running AI agents
Briefly

"On ClawEval, V2.5-Pro lands at 64% Pass^3 using only ~70K tokens per trajectory - roughly 40-60% fewer tokens than Claude Opus 4.6, Gemini 3.1 Pro, and GPT-5.4 at comparable capability levels."
"The 310-billion-parameter MiMo-V2.5 activates only 15 billion parameters per request, while the 1.02-trillion-parameter Pro version activates 42 billion."
"The Pro model's hybrid attention design can reduce KV-cache storage by nearly seven times during long-context tasks."
"The MIT License allows enterprises to freely modify, deploy, and commercialize the model without restrictions, which is rare in today's AI landscape."
Xiaomi has launched MiMo-V2.5 and MiMo-V2.5-Pro, open-sourced under the MIT License, targeting developers of autonomous coding and workflow agents. Both models feature a 1-million-token context window, with MiMo-V2.5-Pro aimed at complex tasks. The models utilize a sparse mixture-of-experts design to optimize compute costs. The MIT License allows for unrestricted commercial deployment and modification. MiMo-V2.5-Pro demonstrates efficiency, using fewer tokens than competitors while maintaining comparable capabilities, making it a viable option for enterprises facing budget constraints in AI workloads.
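The efficiency claim rests on the sparse mixture-of-experts design: only a small slice of the total weights participates in any single request. A quick back-of-envelope sketch of that ratio, using only the parameter counts quoted above (the function name is illustrative, not part of any MiMo tooling):

```python
# Back-of-envelope arithmetic for the parameter counts quoted in the article.
# These are the published totals, not measured benchmarks.

def active_fraction(total_params: float, active_params: float) -> float:
    """Fraction of a sparse MoE model's weights activated per request."""
    return active_params / total_params

# MiMo-V2.5: 310 billion total parameters, 15 billion active per request
mimo = active_fraction(310e9, 15e9)

# MiMo-V2.5-Pro: 1.02 trillion total, 42 billion active per request
mimo_pro = active_fraction(1.02e12, 42e9)

print(f"MiMo-V2.5 activates ~{mimo:.1%} of its weights per request")      # ~4.8%
print(f"MiMo-V2.5-Pro activates ~{mimo_pro:.1%} of its weights")          # ~4.1%
```

In other words, both models run inference through roughly 4-5% of their weights at a time, which is where the compute-cost advantage over comparably capable dense models comes from.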
Read at InfoWorld