Ollama models arrive in VS Code's GitHub Copilot Chat picker

Original: Visual Studio Code now integrates with Ollama via GitHub Copilot. If you have Ollama installed, any local or cloud model from Ollama can be selected for use within Visual Studio Code.

LLM · Mar 27, 2026 · By Insights AI · 1 min read

What Ollama said on X

On March 26, 2026, Ollama said Visual Studio Code now integrates with Ollama through GitHub Copilot, so any local or cloud model available in Ollama can be selected directly inside VS Code. That matters because the integration is not framed as a side extension or a separate chat panel. It moves Ollama models into the same model picker many developers already use for Copilot-powered workflows.

What the docs add

Ollama’s VS Code integration page says this path requires Ollama v0.18.3 or newer, VS Code 1.113 or newer, and GitHub Copilot Chat extension 0.41.0 or newer. It also says users still need to sign in to use the model selector, but a paid GitHub Copilot subscription is not required; GitHub Copilot Free is enough to enable custom model selection.
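These minimums can be checked mechanically. The sketch below is a hypothetical helper (not part of Ollama or VS Code) that compares installed version strings against the minimums stated in the docs, assuming plain dotted version numbers:

```python
# Hypothetical version-gate check for the integration described above.
# The component names and the parse/compare helpers are illustrative,
# not an official API; only the minimum versions come from Ollama's docs.

MINIMUMS = {
    "ollama": (0, 18, 3),        # Ollama v0.18.3 or newer
    "vscode": (1, 113, 0),       # VS Code 1.113 or newer
    "copilot-chat": (0, 41, 0),  # GitHub Copilot Chat extension 0.41.0 or newer
}

def parse_version(text: str) -> tuple:
    """Parse a dotted version string like 'v0.18.3' into a comparable tuple."""
    return tuple(int(part) for part in text.strip().lstrip("v").split("."))

def meets_minimum(component: str, installed: str) -> bool:
    """True if the installed version satisfies the documented minimum."""
    return parse_version(installed) >= MINIMUMS[component]

print(meets_minimum("ollama", "v0.18.3"))  # meets the v0.18.3 minimum
print(meets_minimum("vscode", "1.112.1"))  # below the 1.113 minimum
```

Tuple comparison handles the lexicographic ordering of dotted versions, so 1.112.1 correctly fails the 1.113 check even though 112 has more digits than 1.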

Setup is deliberately lightweight. Ollama documents a one-command flow, ollama launch vscode, plus a manual path where users open Copilot Chat, bring up the Language Models window, add Ollama as a provider, and unhide the models they want. The docs also explicitly support cloud models, not just local weights, which means the same VS Code surface can bridge laptop-resident models and larger hosted ones.
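On a machine with Ollama installed, the documented one-command flow looks roughly like this (guarded here so the snippet degrades gracefully when Ollama is absent; the manual path is sketched in comments from the steps above):

```shell
# One-command setup documented by Ollama: registers Ollama as a model
# provider in VS Code's Copilot Chat. Guarded so it no-ops without Ollama.
if command -v ollama >/dev/null 2>&1; then
  ollama launch vscode
else
  echo "ollama not found; install it first from https://ollama.com"
fi

# Manual path (per the docs):
#   1. Open Copilot Chat in VS Code
#   2. Bring up the Language Models window
#   3. Add Ollama as a provider
#   4. Unhide the models you want in the picker
```

Either route ends in the same place: Ollama-served models appear in the regular Copilot model picker rather than a separate panel.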

Why it matters

The practical shift is that local and self-directed model choice is becoming a first-class workflow in a mainstream editor instead of an enthusiast setup. Developers can stay inside GitHub Copilot Chat while choosing Ollama-served models for privacy, cost control, or open-model experimentation, then fall back to hosted models when they need more scale. That lowers switching cost between closed and open ecosystems and gives local AI tooling a much more normal path into day-to-day software work.

Source: Ollama X post · Ollama VS Code docs


Related Articles

LLM · 5d ago · 2 min read

Ollama said on March 18, 2026 that MiniMax-M2.7 was available through its cloud path and could be launched from Claude Code and OpenClaw. The Ollama library page describes the M2-series model as a coding- and productivity-focused system with strong results on SWE-Pro, VIBE-Pro, Terminal Bench 2, GDPval-AA, and Toolathon.

GitHub Turns Copilot and Figma into a Bidirectional MCP Workflow
LLM · Mar 15, 2026 · 2 min read

GitHub said on March 10, 2026 that GitHub Copilot, VS Code, and Figma now form a continuous loop through the bidirectional Figma MCP server. GitHub’s March 6 changelog says users can pull design context into code and send rendered UI back to Figma as editable frames.


© 2026 Insights. All rights reserved.