Ollama adds MiniMax-M2.7:cloud for coding and agent workflows

Original: MiniMax-M2.7 is now available on Ollama's cloud, made for coding and agentic tasks. Try it inside Claude Code: `ollama launch claude --model minimax-m2.7:cloud`. Use it with OpenClaw: `ollama launch openclaw --model minimax-m2.7:cloud`.

LLM · Mar 21, 2026 · By Insights AI · 2 min read

What Ollama announced on X

On March 18, 2026, Ollama said MiniMax-M2.7 was available on Ollama’s cloud and pointed developers to two concrete entry points: launching the model inside Claude Code and launching it with OpenClaw. That made the post notable not only because a new model appeared in the catalog, but because Ollama immediately positioned it inside agent-oriented developer workflows.
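For quick reference, these are the two launch commands quoted in the announcement. They assume a current Ollama install with access to Ollama's cloud; the commands themselves come from the post, not from independent testing.

```shell
# Run MiniMax-M2.7 (cloud) inside Claude Code:
ollama launch claude --model minimax-m2.7:cloud

# Run it with OpenClaw:
ollama launch openclaw --model minimax-m2.7:cloud
```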

What the Ollama library page says

The Ollama library page describes MiniMax’s M2-series as a model family for coding, agentic workflows, and professional productivity. It says M2.7 is the first model in the series to deeply participate in its own evolution, with support for complex agent harnesses, agent teams, complex skills, and dynamic tool search. In other words, the pitch is not just “a strong model,” but “a model intended to sit inside compound tool-using systems.”

  • On the Ollama library page, M2.7 scores 56.22% on SWE-Pro, 55.6% on VIBE-Pro, and 57.0% on Terminal Bench 2.
  • The same page says M2.7 reaches an Elo score of 1495 on GDPval-AA, the highest among open-source models in that evaluation.
  • Ollama also highlights 46.3% on Toolathon and a 97% skill-adherence rate across 40 complex skills.
  • The model page further points to strong results in office-style editing tasks and machine-learning competitions, suggesting Ollama is marketing M2.7 as more than a narrow coding assistant.
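The "dynamic tool search" and agent-harness positioning above can be illustrated with Ollama's local REST API, whose `/api/chat` endpoint accepts OpenAI-style tool schemas. The sketch below only assembles the request body; the `search_docs` tool is an invented example, and actually sending the request would require a running Ollama instance with the model available.

```python
import json

# Ollama's local REST endpoint for chat requests (default port).
OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"


def build_chat_request(model: str, prompt: str, tools=None) -> dict:
    """Assemble the JSON body for Ollama's /api/chat endpoint."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    if tools:
        # OpenAI-style function-calling schema, as accepted by /api/chat.
        body["tools"] = tools
    return body


# An invented "search_docs" tool, to illustrate how an agent harness
# might expose tools to the model.
search_tool = {
    "type": "function",
    "function": {
        "name": "search_docs",
        "description": "Search project documentation",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}

payload = build_chat_request(
    "minimax-m2.7:cloud",
    "Find where the retry logic is implemented.",
    tools=[search_tool],
)

# In practice you would POST this body to OLLAMA_CHAT_URL, e.g. with
# urllib.request or the `ollama` Python client.
print(json.dumps(payload, indent=2))
```

The point of the sketch is the shape of the request, not the transport: a tool-using agent loop would send this payload, inspect the response for `tool_calls`, execute the named tool, and append the result as a follow-up message.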

Why this matters

The combination of the X post and the library page shows Ollama pushing deeper into hybrid developer workflows. Ollama’s brand was built around local model execution, but the cloud variant lowers the hardware barrier for teams that want to plug a model into existing agent tooling quickly. That matters because the tweet specifically routes users toward Claude Code and OpenClaw rather than toward a standalone chat interface.

It also signals how competitive the coding-agent market has become. MiniMax-M2.7 is being presented as a model that can reason across software engineering, office workflows, tool use, and multi-step agent systems. If the reported benchmark profile holds in practice, the cloud path gives Ollama a faster way to distribute those capabilities to developers who care more about workflow fit than about fully local deployment.

Sources: Ollama X post · Ollama MiniMax-M2.7:cloud page




© 2026 Insights. All rights reserved.