OpenCode
// integrations hub

OpenCode integrations: editors, models, and adapters.

OpenCode is editor-agnostic and model-agnostic by construction. Every editor extension talks to the same CLI, and every model adapter speaks the same wire protocol. Pick the editor your team already uses, pick the model your security policy allows, and OpenCode slots into both without forcing a migration.

Integration registry.

The table below is the canonical registry of editor extensions and model adapters that ship with OpenCode or are certified by the project. Status flags track whether an integration is officially maintained, community-maintained with a signed manifest, or in preview. Each entry links to a dedicated page with configuration and troubleshooting notes.

Zero-click summary. Two editor extensions are official. Four editor extensions are community-certified. Seven model adapters ship in the reference config. Any OpenAI-compatible endpoint is supported without a new adapter.

Integration | Type | Install | Docs | Status
OpenCode VSCode | Editor extension | VSCode marketplace | VSCode guide | Official
OpenCode JetBrains | Editor extension | JetBrains plugin repo | JetBrains guide | Official
OpenCode Neovim | Editor extension | Plugin manager | Extension hub | Community certified
OpenCode Helix | Editor extension | Config snippet | Extension hub | Community certified
OpenCode Zed | Editor extension | Zed extensions | Extension hub | Community certified
OpenCode Sublime | Editor extension | Package Control | Extension hub | Community certified
Ollama | Local adapter | Built in | Ollama setup | Official
OpenAI-compatible | Hosted adapter | Built in | Custom API | Official
Anthropic | Hosted adapter | Built in | Custom API | Official
Azure OpenAI | Hosted adapter | Built in | Custom API | Official
vLLM | Self-hosted | OpenAI-compatible | Custom API | Official
TGI | Self-hosted | OpenAI-compatible | Custom API | Official
LM Studio | Local adapter | Built in | Custom API | Official

How the editor extensions share the wire protocol.

OpenCode editor extensions do not embed model logic. They open a local Unix socket, hand a session token to the OpenCode CLI, and stream plan/apply events over a line-delimited JSON protocol. That design keeps every extension light — the VSCode extension is under a megabyte, the JetBrains plugin is a single JAR, and the Neovim plugin is a Lua file with no native dependencies. When you upgrade the CLI, every extension upgrades with it, because the heavy lifting lives in one binary.
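The extension-to-CLI link can be pictured with a short sketch. This is an illustration of the transport only, not the real OpenCode implementation: it uses a socketpair to stand in for the CLI's local Unix socket, and the event field names are assumptions.

```python
import json
import socket

# Simulate the extension <-> CLI link with a socketpair; in the real setup
# the extension would connect to the CLI's local Unix socket instead.
extension_side, cli_side = socket.socketpair()

# The CLI streams events as line-delimited JSON; field names here
# are illustrative, not the actual OpenCode schema.
event = {"type": "status", "state": "ready"}
cli_side.sendall((json.dumps(event) + "\n").encode())

# The extension reads one line and decodes one event.
line = extension_side.makefile().readline()
received = json.loads(line)
```

Because each message is one line of JSON, an extension needs nothing heavier than a socket read loop and a JSON parser, which is why the plugins stay small.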

The wire protocol is deliberately small. There are four message types: plan, apply, tool-call, and status. An extension sends a plan request with the user prompt and the active selection, the CLI returns a plan the user can approve, and then the apply message streams diffs back to the extension to render inline. Tool calls — reading files, running tests, shell commands — flow through the CLI so one policy applies whether you drove the agent from the terminal or from an inline diff button.
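The four message types above can be sketched as a tiny encoder and decoder. The type names come from the protocol description; the payload fields (prompt, selection) are illustrative assumptions, not the published schema.

```python
import json

# The four OpenCode message types described above.
MESSAGE_TYPES = {"plan", "apply", "tool-call", "status"}

def encode_message(msg_type: str, payload: dict) -> bytes:
    """Serialize one protocol message as a single line of JSON."""
    if msg_type not in MESSAGE_TYPES:
        raise ValueError(f"unknown message type: {msg_type}")
    return (json.dumps({"type": msg_type, **payload}) + "\n").encode()

def decode_stream(data: bytes) -> list:
    """Split a buffer of line-delimited JSON back into messages."""
    return [json.loads(line) for line in data.splitlines() if line]

# An extension might open a plan request like this (fields hypothetical):
request = encode_message("plan", {
    "prompt": "rename this function",
    "selection": {"file": "main.go", "start": 10, "end": 24},
})
```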

Zero-click summary. Every editor extension is a thin UI. The CLI is the only component that speaks to models. One policy covers all surfaces.

How the model adapters negotiate capabilities.

OpenCode treats model adapters as a capability negotiation. At startup the CLI asks the adapter which features are available: streaming, native tool calls, structured output, vision input, and long context. The agent then picks the cheapest tool-call schema the model supports — native function calling when available, a JSON tool-call fallback when not — and routes prompts accordingly. That means a GPT-class model, a Claude-class model, and a small local Ollama model all work with the same agent loop, even though the wire formats differ.
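The negotiation can be reduced to a sketch. The capability names mirror the list above; the struct and function names are invented for illustration and do not appear in the OpenCode codebase.

```python
from dataclasses import dataclass

@dataclass
class Capabilities:
    # Flags the CLI queries from an adapter at startup (names illustrative).
    streaming: bool = False
    native_tool_calls: bool = False
    structured_output: bool = False
    vision: bool = False
    long_context: bool = False

def pick_tool_call_schema(caps: Capabilities) -> str:
    """Choose the cheapest schema the model supports: native function
    calling when available, a JSON tool-call fallback when not."""
    return "native" if caps.native_tool_calls else "json-fallback"

# A frontier hosted model and a small local model share one agent loop;
# only the negotiated schema differs.
hosted = Capabilities(streaming=True, native_tool_calls=True, long_context=True)
local = Capabilities(streaming=True)  # e.g. a small Ollama model
```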

The adapters themselves are short. The OpenAI-compatible adapter is under 300 lines and covers any endpoint that speaks the chat completions schema, which includes Azure OpenAI, vLLM, TGI, LM Studio, and most in-house gateways. The Anthropic adapter is similarly compact and maps OpenCode tool calls onto the Claude messages API. The Ollama adapter speaks the native Ollama HTTP API directly so it can pick up local model metadata — quantization, context length, embeddings — without an extra round trip.
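The mapping an OpenAI-compatible adapter performs can be shown in miniature. The outer function is a hypothetical stand-in for the adapter's request builder; the output fields (model, messages, tools, stream) are the public chat completions schema that Azure OpenAI, vLLM, TGI, and LM Studio all accept.

```python
def to_chat_completions(prompt: str, tools: list, model: str) -> dict:
    """Build a chat-completions request body from an OpenCode-style
    prompt and tool list (function name is illustrative)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        # Tool definitions use the standard function-calling shape.
        "tools": [{"type": "function", "function": t} for t in tools],
        "stream": True,
    }

req = to_chat_completions(
    "run the tests",
    [{"name": "run_shell", "parameters": {"type": "object"}}],
    model="local-model",
)
```

Because the target schema is this uniform, one short adapter covers every endpoint in the registry's OpenAI-compatible column.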

Zero-click summary. Adapters negotiate capabilities at startup. One agent loop, many wire formats. Fallbacks cover models without native tool calls.

Editor extensions: official, community, and preview.

The integration registry distinguishes three tiers so teams know what to expect. Official extensions are built and signed by the OpenCode maintainers, pass the integration test suite on every release, and ship on the same cadence as the CLI. Community-certified extensions are written by contributors outside the core team, but they pass the integration test suite and carry a signed manifest so enterprise teams can verify provenance. Preview extensions are new work that has not yet met the certification bar — they are welcome in the registry but are flagged so no one is surprised.

The VSCode and JetBrains extensions are the only two in the official tier today. The Neovim, Helix, Zed, and Sublime extensions are community-certified, which in practice means the maintainers contribute upstream patches when the wire protocol adds a capability and the extensions track those changes on a predictable cadence. We expect the community tier to grow — that is the point of a wire-protocol-first design — but the bar for official is signing, test coverage, and a maintenance SLA.

VSCode extension.

The OpenCode VSCode extension is the most feature-complete editor integration. It adds a side panel, inline diff review, selection-scoped prompts, a status-bar transcript, and a shortcut map that covers every CLI verb. The extension is distributed through the Visual Studio Code marketplace and is signed with the same key used for the CLI binaries so endpoint management teams can verify both from one attestation.

JetBrains plugin.

The OpenCode JetBrains plugin covers IntelliJ IDEA, PyCharm, GoLand, WebStorm, Rider, and CLion. It exposes OpenCode as a tool window with the same plan/apply flow as the CLI, plus an action registry so shortcuts can be rebound to match local conventions. The plugin ships through the JetBrains plugin repository and supports every currently maintained IDE major version.

Community editor plugins.

The OpenCode extension hub tracks the community-certified plugins for Neovim, Helix, Zed, and Sublime Text. Each plugin exposes the same plan/apply flow as the CLI, with UX choices that respect the host editor — Neovim uses the quickfix list for diffs, Helix uses the picker UI, Zed uses its action palette, and Sublime uses the command palette and output panel.

Model adapters: hosted, self-hosted, and local.

OpenCode groups model adapters into three categories so a security architect can reason about each separately. Hosted adapters talk to a vendor API — OpenAI-compatible, Anthropic, Azure OpenAI — and send prompts off your network. Self-hosted adapters talk to an inference server you run inside your perimeter, such as vLLM or TGI, and the only network traffic is inside your VPC. Local adapters talk to an inference server on the same machine as the developer, such as Ollama or LM Studio, and do not emit a packet off the box. The custom API page has full configuration examples for each.

Teams in regulated industries usually start with a hosted adapter, graduate to a self-hosted deployment once they have a budget for GPUs, and add a local adapter for offline work. OpenCode is the same binary across all three — you change one config block to switch tiers. The agent does not assume the model is frontier, so smaller local models run the same plan/apply flow, just with shorter horizons per task.
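A tier switch might look like the following config sketch. The key names and file shape are assumptions for illustration — consult the custom API page for the real provider block — but the point stands: only this one block changes when moving between tiers.

```json
{
  "provider": {
    "kind": "openai-compatible",
    "base_url": "https://vllm.internal.example:8000/v1",
    "model": "meta-llama/Llama-3.1-70B-Instruct"
  }
}
```

Moving to the local tier would mean editing the same block, for example swapping the kind to the Ollama adapter and pointing the base URL at localhost; nothing else in the setup changes.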

Our release engineering pipeline follows the NIST SSDF 1.1 controls for signed distributions, which is why the integration registry lists signing status per entry. For supply-chain posture on third-party adapters we reference guidance from the NIST C-SCRM program when reviewing preview integrations.

Zero-click summary. Three adapter tiers: hosted, self-hosted, local. Change one config block to move tiers. Signing status is tracked per entry.

Rolling out OpenCode across a mixed editor team.

Most teams we talk to are not monolithic. The backend engineers live in Neovim, the frontend team is on VSCode, the data science team runs PyCharm, and the platform team toggles between all three. A plugin-locked coding agent forces those teams to pick a side, which is rarely politically viable. OpenCode avoids that problem by design — the CLI is the canonical surface and each editor just wraps it.

The rollout script most teams use: install the CLI on every developer workstation through the endpoint management system, pin a default model adapter in a shared config, and let each engineer install the editor extension they prefer. The CLI handles logging, auditing, and policy enforcement uniformly, so the platform team does not care which editor the patch came from. Any compliance report reads the same way whether the engineer drove OpenCode from VSCode, JetBrains, or a Neovim buffer in tmux.

The JetBrains plugin and the VSCode extension share a transcript format, so our review workflow does not branch by editor. That is the detail that sold the platform team.

— Sibylle D. Achterberg, VP Platform, Oakcatalyst

Swapping from a hosted adapter to our self-hosted vLLM cluster was a four-line config change. The agent behavior was indistinguishable.

— Yusra E. Rahmati, Principal Developer, Montevale Digital


Frequently asked

Integration questions developers ask first.

Five short answers covering editor extensions, model adapters, certification status, and mixed-editor rollouts. Follow the link at the end of each answer for full detail.

Which OpenCode editor extensions are officially maintained?
The VSCode extension and the JetBrains plugin are the two official OpenCode editor integrations. Both are built by the core maintainers, signed with the same key as the CLI, and released on the same cadence. The extension hub lists community-certified options for Neovim, Helix, Zed, and Sublime Text.
How does OpenCode decide which model adapter to use?
OpenCode reads the provider block in your config file and instantiates the matching adapter at startup. The CLI then asks the adapter which capabilities the target model supports — streaming, native tool calls, long context — and routes prompts accordingly. The custom API page documents every provider key and the capability negotiation in detail.
Can OpenCode run with only local models?
Yes. Point the Ollama adapter at a local socket — the default install covers this — and the agent will not emit a packet off the box. Recommended local models and RAM budgets are on the Ollama setup page. Teams with regulatory constraints often run a local model for everyday edits and escalate to a hosted model only for hard refactors.
How are community extensions certified for enterprise use?
Community plugins enter the extension hub at the community-certified tier once they pass the integration test suite and ship a signed release manifest. The OpenCode maintainers do not audit line-by-line, but the signing requirement gives procurement teams a provenance trail they can verify against the CLI release keys.
Do I need a JetBrains license to use the OpenCode JetBrains plugin?
You need a license for the JetBrains IDE itself — IntelliJ, PyCharm, GoLand — but the OpenCode JetBrains plugin is free under the same permissive license as the rest of the project. The plugin page lists minimum supported IDE versions and the feature parity table.