# Claude Code + LM Studio + Qwen3-VL-30B

Run Claude Code locally using LM Studio as an OpenAI-compatible backend with Qwen3-VL-30B, bridged via claude-code-router.

## 1. Install LM Studio

```bash
brew install --cask lm-studio
```

The `lms` CLI ships with the LM Studio app rather than as a separate Homebrew formula; add it to your PATH with:

```bash
npx lmstudio install-cli
```

Verify:

```bash
lms version
```

## 2. Download Model

```bash
lms get qwen/qwen3-vl-30b
```

## 3. Verify Downloaded Models

```bash
lms ls
```

## 4. Start LM Studio Server

```bash
lms server start --port 1234
```

## 5. Load Model

```bash
lms load qwen/qwen3-vl-30b --context-length 32768
```

Verify:

```bash
lms ps
```

## 6. Test OpenAI-Compatible Endpoint

```bash
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen/qwen3-vl-30b",
    "messages": [
      {
        "role": "user",
        "content": "Say OK"
      }
    ],
    "max_tokens": 64
  }'
```

Expected (abridged — the full response also includes `id`, `model`, and `usage` fields):

```json
{
  "choices": [
    {
      "message": {
        "content": "OK"
      }
    }
  ]
}
```
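The same request can be issued from Python with only the standard library. This is a sketch against the endpoint above; `build_payload`, `extract_reply`, and `chat` are hypothetical helper names, not part of LM Studio.

```python
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"

def build_payload(prompt: str, model: str = "qwen/qwen3-vl-30b",
                  max_tokens: int = 64) -> dict:
    # Mirrors the curl body from the step above.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

def extract_reply(response: dict) -> str:
    # OpenAI-compatible responses nest the text under choices[0].message.content.
    return response["choices"][0]["message"]["content"]

def chat(prompt: str) -> str:
    # POST to the server started in step 4.
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return extract_reply(json.load(resp))
```

With the server running, `chat("Say OK")` should return the model's reply as a plain string.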

## 7. Install claude-code-router

```bash
npm install -g @musistudio/claude-code-router
```

Create config:

```bash
mkdir -p ~/.claude-code-router

cat > ~/.claude-code-router/config.json <<'JSON'
{
  "LOG": true,
  "APIKEY": "lmstudio",
  "HOST": "127.0.0.1",
  "PORT": 3456,
  "Providers": [
    {
      "name": "lmstudio",
      "api_base_url": "http://localhost:1234/v1/chat/completions",
      "api_key": "lmstudio",
      "models": [
        "qwen/qwen3-vl-30b"
      ],
      "transformer": {
        "use": ["openai"]
      }
    }
  ],
  "Router": {
    "default": "lmstudio,qwen/qwen3-vl-30b"
  }
}
JSON
```

## 8. Start Bridge

```bash
ccr start
```
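To confirm the bridge is translating requests, you can hit its Anthropic-style endpoint directly. The sketch below assumes the router exposes `/v1/messages` on the configured port (3456) and returns Anthropic Messages-format responses; `build_messages_payload`, `extract_text`, and `probe_bridge` are hypothetical helper names for this setup.

```python
import json
import urllib.request

BRIDGE_URL = "http://127.0.0.1:3456/v1/messages"

def build_messages_payload(prompt: str, model: str = "qwen/qwen3-vl-30b") -> dict:
    # Anthropic Messages API body; max_tokens is required by that API.
    return {
        "model": model,
        "max_tokens": 64,
        "messages": [{"role": "user", "content": prompt}],
    }

def extract_text(response: dict) -> str:
    # Anthropic-format responses carry text in content[0]["text"].
    return response["content"][0]["text"]

def probe_bridge(prompt: str = "Say OK") -> str:
    req = urllib.request.Request(
        BRIDGE_URL,
        data=json.dumps(build_messages_payload(prompt)).encode(),
        headers={
            "Content-Type": "application/json",
            "x-api-key": "lmstudio",          # matches APIKEY in config.json
            "anthropic-version": "2023-06-01",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return extract_text(json.load(resp))
```

If `probe_bridge()` returns text, the router is forwarding requests to LM Studio correctly.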

## 9. Configure Environment

```bash
unset ANTHROPIC_API_KEY
export ANTHROPIC_AUTH_TOKEN="lmstudio"
export ANTHROPIC_BASE_URL="http://127.0.0.1:3456"
```

## 10. Run Claude Code

```bash
claude --model "qwen/qwen3-vl-30b" --print "Say HI"
```
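Since Qwen3-VL-30B is a vision-language model, the LM Studio endpoint from step 6 can also accept images. The sketch below uses the OpenAI-style `image_url` content part with a base64 data URI; `build_vision_payload` and `describe_image` are hypothetical helpers, and PNG input is assumed.

```python
import base64
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"

def build_vision_payload(prompt: str, image_path: str,
                         model: str = "qwen/qwen3-vl-30b") -> dict:
    # Inline the image as a base64 data URI, the OpenAI-style format
    # for images in chat completion messages.
    with open(image_path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode()
    return {
        "model": model,
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text", "text": prompt},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/png;base64,{b64}"}},
            ],
        }],
        "max_tokens": 256,
    }

def describe_image(image_path: str) -> str:
    # Send the mixed text+image request to the LM Studio server.
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_vision_payload("Describe this image.", image_path)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

With the model loaded, `describe_image("screenshot.png")` should return a text description of the image.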