# Claude Code + LM Studio + Qwen3-VL-30B
Run Claude Code locally using LM Studio as an OpenAI-compatible backend with Qwen3-VL-30B, bridged via claude-code-router.
## 1. Install LM Studio

```bash
brew install --cask lm-studio
```

Install the CLI:

```bash
brew install lmstudio
```

Verify:

```bash
lms version
```

## 2. Download Model
```bash
lms get qwen/qwen3-vl-30b
```

## 3. Verify Downloaded Models

```bash
lms ls
```

## 4. Start LM Studio Server
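The next command binds the server to port 1234. If you script this setup, it helps to wait until the API actually answers before continuing; a minimal sketch, assuming the server exposes the usual OpenAI-style `GET /v1/models` route:

```python
import time
import urllib.error
import urllib.request


def server_ready(base_url: str, timeout: float = 30.0) -> bool:
    """Poll GET {base_url}/v1/models until it responds or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(f"{base_url}/v1/models", timeout=2) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            time.sleep(0.5)  # server not up yet; retry
    return False


if __name__ == "__main__":
    print("ready" if server_ready("http://localhost:1234") else "not reachable")
```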
```bash
lms server start --port 1234
```

## 5. Load Model
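`--context-length 32768` trades memory for window size: KV-cache memory grows linearly with context length. A back-of-envelope estimator — the architecture numbers below are illustrative placeholders, not Qwen3-VL-30B's actual configuration:

```python
def kv_cache_bytes(n_layers: int, n_kv_heads: int, head_dim: int,
                   seq_len: int, bytes_per_elem: int = 2) -> int:
    """Bytes for the K and V caches across all layers at full context.
    Factor of 2 covers K and V; bytes_per_elem=2 assumes fp16/bf16 cache."""
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * bytes_per_elem


# Placeholder architecture (NOT the real Qwen3-VL-30B numbers):
gib = kv_cache_bytes(n_layers=48, n_kv_heads=4, head_dim=128,
                     seq_len=32768) / 2**30
print(f"{gib:.1f} GiB")  # → 3.0 GiB with these placeholder numbers
```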
```bash
lms load qwen/qwen3-vl-30b --context-length 32768
```

Verify:

```bash
lms ps
```

## 6. Test OpenAI-Compatible Endpoint
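The curl command below can also be driven from Python with only the standard library; the payload mirrors the curl body exactly (run it while the server from step 4 is up):

```python
import json
import urllib.request


def chat_payload(prompt: str, model: str = "qwen/qwen3-vl-30b",
                 max_tokens: int = 64) -> dict:
    """Build an OpenAI-style chat-completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }


if __name__ == "__main__":
    req = urllib.request.Request(
        "http://localhost:1234/v1/chat/completions",
        data=json.dumps(chat_payload("Say OK")).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    print(body["choices"][0]["message"]["content"])
```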
```bash
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen/qwen3-vl-30b",
    "messages": [
      {
        "role": "user",
        "content": "Say OK"
      }
    ],
    "max_tokens": 64
  }'
```

Expected:
```json
{
  "choices": [
    {
      "message": {
        "content": "OK"
      }
    }
  ]
}
```

## 7. Install claude-code-router
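The router's job is to accept Anthropic-style Messages requests from Claude Code and forward them to the OpenAI-style endpoint above. A toy sketch of that translation — illustrative only, not the router's actual code:

```python
def anthropic_to_openai(body: dict, model: str) -> dict:
    """Map the common fields of an Anthropic Messages request onto an
    OpenAI chat-completions request (toy version: text content only)."""
    messages = []
    if body.get("system"):
        messages.append({"role": "system", "content": body["system"]})
    for m in body.get("messages", []):
        content = m["content"]
        if isinstance(content, list):  # Anthropic allows lists of content blocks
            content = "".join(block.get("text", "") for block in content)
        messages.append({"role": m["role"], "content": content})
    return {
        "model": model,
        "messages": messages,
        "max_tokens": body.get("max_tokens", 1024),
    }
```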
```bash
npm install -g @musistudio/claude-code-router
```

Create config:
```bash
mkdir -p ~/.claude-code-router
cat > ~/.claude-code-router/config.json <<'JSON'
{
  "LOG": true,
  "APIKEY": "lmstudio",
  "HOST": "127.0.0.1",
  "PORT": 3456,
  "Providers": [
    {
      "name": "lmstudio",
      "api_base_url": "http://localhost:1234/v1/chat/completions",
      "api_key": "lmstudio",
      "models": [
        "qwen/qwen3-vl-30b"
      ],
      "transformer": {
        "use": ["openai"]
      }
    }
  ],
  "Router": {
    "default": "lmstudio,qwen/qwen3-vl-30b"
  }
}
JSON
```

## 8. Start Bridge
```bash
ccr start
```

## 9. Configure Environment
```bash
unset ANTHROPIC_API_KEY
export ANTHROPIC_AUTH_TOKEN="lmstudio"
export ANTHROPIC_BASE_URL="http://127.0.0.1:3456"
```

## 10. Run Claude Code
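Before launching, a trivial sanity check that the environment from step 9 is in place (a hypothetical helper; the variable names are the ones set above):

```python
import os

REQUIRED = ("ANTHROPIC_AUTH_TOKEN", "ANTHROPIC_BASE_URL")


def missing_vars(env) -> list:
    """Names from REQUIRED that are unset or empty in the given mapping."""
    return [name for name in REQUIRED if not env.get(name)]


if __name__ == "__main__":
    missing = missing_vars(os.environ)
    if missing:
        raise SystemExit(f"set these first: {', '.join(missing)}")
    print("environment looks good")
```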
```bash
claude --model "qwen/qwen3-vl-30b" --print "Say HI"
```
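To verify the whole chain without Claude Code in the loop, you can hit the bridge directly with an Anthropic-style Messages request (standard library only; assumes steps 4–9 are done and that the bridge serves the standard Anthropic `/v1/messages` route on port 3456):

```python
import json
import urllib.request


def messages_request(prompt: str,
                     model: str = "qwen/qwen3-vl-30b") -> urllib.request.Request:
    """Build an Anthropic Messages API request aimed at the local bridge."""
    body = {
        "model": model,
        "max_tokens": 64,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        "http://127.0.0.1:3456/v1/messages",
        data=json.dumps(body).encode(),
        headers={
            "Content-Type": "application/json",
            "x-api-key": "lmstudio",       # matches APIKEY in config.json
            "anthropic-version": "2023-06-01",
        },
    )


if __name__ == "__main__":
    with urllib.request.urlopen(messages_request("Say HI")) as resp:
        print(json.load(resp))
```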