r/LocalLLaMA Questions OpenCode’s Local Story After Finding `serve` Proxies the UI to app.opencode.ai
Original Reddit post: “OpenCode concerns (not truely local)”
The community argument was about architecture, not model quality
On March 16, 2026, a r/LocalLLaMA post criticizing OpenCode’s local behavior reached 389 points and 154 comments. The complaint was not that the tool performs badly. It was that users may assume `opencode serve` gives them a fully local web UI when the product boundary is more complicated. The Reddit author argued that the UI path is still proxied through app.opencode.ai, which matters for people running behind firewalls, on intranets, or under stricter privacy expectations.
The claim is backed by a specific code reference. In the linked commit, packages/opencode/src/server/server.ts contains a proxy call that forwards requests to https://app.opencode.ai and rewrites the host header accordingly. That does not by itself prove any broader data-handling claim beyond the architecture shown in code, but it does establish that local server access and fully local UI hosting are not the same thing in the referenced implementation.
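To make the pattern concrete, here is a minimal sketch of what “local server, remote UI” looks like: a pure function that maps an incoming local request onto the hosted app origin and rewrites the host header. This is an illustration of the proxy pattern described above, not OpenCode’s actual code; the function and type names are invented for this example, and only the upstream origin mirrors what the referenced commit shows.

```typescript
// Illustrative sketch of the proxy pattern, not OpenCode's implementation.
// APP_ORIGIN mirrors the hard-coded upstream seen in the referenced commit.
const APP_ORIGIN = "https://app.opencode.ai";

interface UpstreamRequest {
  url: string;
  headers: Record<string, string>;
}

// Map an incoming local request (e.g. http://localhost:4096/some/page)
// onto the remote app origin, rewriting the host header so the upstream
// serves it as if it had been addressed directly.
function toUpstream(
  localUrl: string,
  headers: Record<string, string>,
): UpstreamRequest {
  const incoming = new URL(localUrl);
  const upstream = new URL(incoming.pathname + incoming.search, APP_ORIGIN);
  return {
    url: upstream.toString(),
    headers: { ...headers, host: upstream.host },
  };
}

const out = toUpstream("http://localhost:4096/index.html?v=1", {
  host: "localhost:4096",
});
console.log(out.url); // https://app.opencode.ai/index.html?v=1
console.log(out.headers.host); // app.opencode.ai
```

The point of the sketch is the boundary: even though the request enters on localhost, every UI asset round-trips through the remote origin, which is exactly why a pure intranet deployment breaks.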
Why the distinction matters for local AI users
Developers increasingly separate three promises: local inference, local server control, and local interface hosting. A tool can satisfy the first two while still depending on a remote frontend. For some teams that is acceptable. For enterprises on restricted networks, or for users choosing local software because they want explicit control over the entire path, the difference has to be documented and configurable.
The linked GitHub activity shows this is not a one-off Reddit misunderstanding. Open issue #12083 describes an intranet setup where the local page cannot connect, and open PR #12446 proposes an OPENCODE_APP_URL option so organizations can point the web UI proxy to a custom internal location instead of app.opencode.ai. In the referenced state, that PR remains open.
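The shape of the fix proposed in PR #12446 can be pictured as a small resolution step: an environment override wins, otherwise the hosted default applies. This is a hedged sketch of the idea only; aside from the `OPENCODE_APP_URL` name taken from the PR description, the function and behavior here are illustrative, not the PR’s actual implementation.

```typescript
// Sketch of the configurable-upstream idea from PR #12446.
// Only the OPENCODE_APP_URL name comes from the PR; the rest is illustrative.
const DEFAULT_APP_URL = "https://app.opencode.ai";

// Resolve the web-UI upstream: an explicit override wins,
// otherwise fall back to the hosted default.
function resolveAppUrl(env: Record<string, string | undefined>): string {
  const override = env["OPENCODE_APP_URL"];
  if (override) {
    // new URL() both validates the override and normalizes it to an origin.
    return new URL(override).origin;
  }
  return DEFAULT_APP_URL;
}

console.log(resolveAppUrl({})); // https://app.opencode.ai
console.log(resolveAppUrl({ OPENCODE_APP_URL: "http://ui.corp.internal:8080/" }));
// http://ui.corp.internal:8080
```

For an intranet deployment like the one in issue #12083, the override would point the proxy at an internally reachable host instead of the public app origin.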
- The Reddit thread focused on whether the OpenCode web UI is fully local in practice.
- The linked code path proxies requests to https://app.opencode.ai and rewrites the host header.
- Issue #12083 documents a pure intranet failure case for the current web path.
- PR #12446 proposes a configurable app URL for self-hosted or internal web deployments.
The thread landed because it turned a vague “not truly local” feeling into a code-level product boundary that users can inspect and debate. As AI developer tools mature, those deployment boundaries are becoming part of the product itself.
Sources: Reddit discussion, linked code path, issue #12083, PR #12446