@@ -1,8 +1,8 @@
 ---
 title: IDE and tool integrations
 description: Configure popular AI coding assistants and tools to use Docker Model Runner as their backend.
 weight: 40
-keywords: Docker, ai, model runner, cline, continue, cursor, vscode, ide, integration, openai, ollama
+keywords: Docker, ai, model runner, cline, continue, cursor, vscode, ide, integration, openai, ollama, claude, anthropic, claude-code
 ---
 
 Docker Model Runner can serve as a local backend for popular AI coding assistants
@@ -258,6 +258,37 @@ print(response.text) |
 
 You can find more details in [this Docker Blog post](https://www.docker.com/blog/opencode-docker-model-runner-private-ai-coding/)
 
+## Claude Code
+
+[Claude Code](https://claude.com/product/claude-code) is [Anthropic's](https://www.anthropic.com/) command-line tool for agentic coding. It lives in your terminal, understands your codebase, and can execute routine tasks, explain complex code, and handle Git workflows through natural language commands.
+
+### Configuration
+
+1. Install Claude Code (see the [quickstart docs](https://code.claude.com/docs/en/quickstart#step-1-install-claude-code)).
+2. Use the `ANTHROPIC_BASE_URL` environment variable to point Claude Code at DMR. On macOS or Linux, for example, to use the `gpt-oss:32k` model:
+   ```bash
+   ANTHROPIC_BASE_URL=http://localhost:12434 claude --model gpt-oss:32k
+   ```
+   On Windows (PowerShell):
+   ```powershell
+   $env:ANTHROPIC_BASE_URL="http://localhost:12434"
+   claude --model gpt-oss:32k
+   ```
+
+> [!TIP]
+>
+> To avoid setting the variable each time, add it to your shell profile (`~/.bashrc`, `~/.zshrc`, or equivalent):
+>
+> ```shell
+> export ANTHROPIC_BASE_URL=http://localhost:12434
+> ```
+
+You can find more details in [this Docker Blog post](https://www.docker.com/blog/run-claude-code-locally-docker-model-runner/).
+
+> [!NOTE]
+>
+> While the other integrations on this page use the [OpenAI-compatible API](/ai/model-runner/api-reference/#openai-compatible-api), Claude Code uses the [Anthropic-compatible API](/ai/model-runner/api-reference/#anthropic-compatible-api) that DMR also exposes.
+
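+To check the endpoint without Claude Code, you can query DMR's Anthropic-compatible Messages API directly. The sketch below is illustrative, not definitive: it assumes DMR is listening on the default port `12434`, that the `gpt-oss:32k` model is already pulled, and that the Anthropic-style `/v1/messages` path is served under the same base URL; check the [API reference](/ai/model-runner/api-reference/#anthropic-compatible-api) for the exact path and required headers:
+
+```bash
+# Assumption: DMR serves the Anthropic Messages shape at /v1/messages
+# under the same base URL that ANTHROPIC_BASE_URL points at.
+curl http://localhost:12434/v1/messages \
+  -H "Content-Type: application/json" \
+  -d '{
+    "model": "gpt-oss:32k",
+    "max_tokens": 128,
+    "messages": [{"role": "user", "content": "Say hello"}]
+  }'
+```
+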
 ## Common issues
 
 ### "Connection refused" errors