Ollama models arrive in VS Code's GitHub Copilot Chat picker
Original: Visual Studio Code now integrates with Ollama via GitHub Copilot. If you have Ollama installed, any local or cloud model from Ollama can be selected for use within Visual Studio Code.
What Ollama said on X
On March 26, 2026, Ollama said Visual Studio Code now integrates with Ollama through GitHub Copilot, so any local or cloud model available in Ollama can be selected directly inside VS Code. That matters because the integration is not framed as a side extension or a separate chat panel. It moves Ollama models into the same model picker many developers already use for Copilot-powered workflows.
What the docs add
Ollama’s VS Code integration page says this path requires Ollama v0.18.3 or newer, VS Code 1.113 or newer, and the GitHub Copilot Chat extension 0.41.0 or newer. It also says users still need to sign in to GitHub to use the model selector, but a paid GitHub Copilot subscription is not required; the free GitHub Copilot tier is enough to enable custom model selection.
Setup is deliberately lightweight. Ollama documents a one-command flow, ollama launch vscode, plus a manual path where users open Copilot Chat, bring up the Language Models window, add Ollama as a provider, and unhide the models they want. The docs also explicitly support cloud models, not just local weights, which means the same VS Code surface can bridge laptop-resident models and larger hosted ones.
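Since the integration gates on three specific minimum versions, a quick local check can save a failed setup attempt. The sketch below compares installed version strings against the minimums from Ollama's docs; the installed values shown are hypothetical placeholders, and the simple dotted-number parsing is an assumption (real version strings may carry suffixes like "-insiders").

```python
# Sketch: compare locally installed tool versions against the minimums
# Ollama's docs list for this integration. The installed versions below
# are hypothetical; substitute the output of `ollama --version` etc.

# Documented minimums (from Ollama's VS Code integration page).
REQUIRED = {
    "Ollama": "0.18.3",
    "VS Code": "1.113.0",
    "GitHub Copilot Chat": "0.41.0",
}

def parse(version: str) -> tuple[int, ...]:
    """Turn a dotted version like '0.18.3' into (0, 18, 3)."""
    return tuple(int(part) for part in version.split("."))

def meets_minimum(installed: str, required: str) -> bool:
    """True if the installed version is at or above the documented minimum."""
    inst, req = parse(installed), parse(required)
    # Pad the shorter tuple with zeros so '1.113' compares as (1, 113, 0).
    width = max(len(inst), len(req))
    inst += (0,) * (width - len(inst))
    req += (0,) * (width - len(req))
    return inst >= req

if __name__ == "__main__":
    installed = {  # placeholder values for illustration
        "Ollama": "0.18.4",
        "VS Code": "1.113",
        "GitHub Copilot Chat": "0.40.2",
    }
    for tool, minimum in REQUIRED.items():
        status = "ok" if meets_minimum(installed[tool], minimum) else "upgrade"
        print(f"{tool}: {installed[tool]} (need >= {minimum}) -> {status}")
```

With the placeholder values above, Ollama and VS Code pass and Copilot Chat is flagged for upgrade; swap in real version output before trusting the result.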
Why it matters
The practical shift is that local and self-directed model choice is becoming a first-class workflow in a mainstream editor instead of an enthusiast setup. Developers can stay inside GitHub Copilot Chat while choosing Ollama-served models for privacy, cost control, or open-model experimentation, then fall back to hosted models when they need more scale. That lowers switching cost between closed and open ecosystems and gives local AI tooling a much more normal path into day-to-day software work.
Source: Ollama X post · Ollama VS Code docs
Related Articles
Ollama said on March 18, 2026 that MiniMax-M2.7 was available through its cloud path and could be launched from Claude Code and OpenClaw. The Ollama library page describes the M2-series model as a coding- and productivity-focused system with strong results on SWE-Pro, VIBE-Pro, Terminal Bench 2, GDPval-AA, and Toolathon.
GitHub said on March 10, 2026 that GitHub Copilot, VS Code, and Figma now form a continuous loop through the bidirectional Figma MCP server. GitHub’s March 6 changelog says users can pull design context into code and send rendered UI back to Figma as editable frames.
GitHub said on March 20, 2026 that Copilot code review has surpassed 60 million reviews. The company’s March 5 blog says usage is up 10x since launch, now covers more than one in five code reviews on GitHub, and relies on an agentic architecture tuned for higher-signal feedback.