Setting Up Claude Code to Work with OpenRouter Using LiteLLM

When we first install the Claude Code CLI, it’s hardwired to talk to Anthropic’s API. But what if we want to use open-source models like Qwen or Kimi through OpenRouter instead?
That’s where LiteLLM comes in. It acts as a local proxy that can translate Claude Code’s Anthropic-style requests into whatever OpenRouter (or any other provider) expects.
In this post, I’ll walk through how I got Claude Code to work with OpenRouter’s qwen/qwen3-coder:free model using LiteLLM, along with all the missteps I made and what finally worked.
What We’re Setting Up
Claude Code CLI installed locally
LiteLLM proxy server running on localhost
Requests from Claude CLI routed to OpenRouter
Fake Claude model names (like claude-3.5-sonnet) mapped to real OpenRouter models behind the scenes
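In other words, the request path looks like this:
Claude Code CLI → http://localhost:4000 (LiteLLM proxy) → https://openrouter.ai/api/v1 (qwen/qwen3-coder:free)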
Step 1: Install Dependencies
We’ll use LiteLLM as a local proxy.
Install it via pip, ideally inside a virtual environment (note that the original snippet created the venv but never activated it, so the activation step is added here):
python3 -m venv LiteLLM
source LiteLLM/bin/activate
pip3 install "litellm[proxy]"
Step 2: Set Environment Variables
Set your secrets and endpoint overrides in your shell config (~/.zshrc or ~/.bashrc):
export LITELLM_MASTER_KEY=sk-1234 # Can be anything; it's the key clients must present to the proxy
export OPENROUTER_API_KEY=sk-your-openrouter-key
export ANTHROPIC_BASE_URL=http://localhost:4000 # Points Claude Code at the local proxy instead of Anthropic
export ANTHROPIC_AUTH_TOKEN=$LITELLM_MASTER_KEY # Claude Code authenticates to the proxy with the master key
Then reload your shell config:
source ~/.zshrc
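A quick sanity check that the variables actually landed in your current shell:
env | grep -E 'LITELLM|OPENROUTER|ANTHROPIC'  # all four variables should appear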
Step 3: Create Your LiteLLM Config
Save the following YAML into config.yaml. This is what tells LiteLLM to act like it’s Anthropic, but actually forward requests to OpenRouter’s Qwen model.
model_list:
  - model_name: "claude-3.5-sonnet"
    litellm_params:
      model: "openrouter/qwen/qwen3-coder:free"
      model_provider: openrouter
      api_key: os.environ/OPENROUTER_API_KEY
      api_base: https://openrouter.ai/api/v1

litellm_settings:
  master_key: os.environ/LITELLM_MASTER_KEY
  database_type: none
Note that LiteLLM tends to be annoyingly pushy about making you use a database. Hence, it’s important to add the database_type: none field.
Also, the trick here is in model_name. Claude Code thinks it’s calling claude-3.5-sonnet, but our proxy maps that name to Qwen.
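Since the mapping is arbitrary, nothing stops you from exposing several aliases at once. Here’s a hypothetical sketch (the second model ID is illustrative; verify it against OpenRouter’s models endpoint first):
model_list:
  - model_name: "claude-3.5-sonnet"
    litellm_params:
      model: "openrouter/qwen/qwen3-coder:free"
      api_key: os.environ/OPENROUTER_API_KEY
      api_base: https://openrouter.ai/api/v1
  - model_name: "claude-3.5-haiku"
    litellm_params:
      model: "openrouter/moonshotai/kimi-k2:free"
      api_key: os.environ/OPENROUTER_API_KEY
      api_base: https://openrouter.ai/api/v1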
Step 4: Start the LiteLLM Proxy
Run the proxy server:
litellm --config config.yaml
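By default the proxy listens on port 4000, which matches the ANTHROPIC_BASE_URL we exported earlier. If that port is taken, you can pass it explicitly:
litellm --config config.yaml --port 4000  # change 4000 here and in ANTHROPIC_BASE_URL together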
To make sure it’s working:
curl http://localhost:4000/health \
-H "Authorization: Bearer $LITELLM_MASTER_KEY"
You should see a healthy status with your model name listed.
You can also open http://localhost:4000/health in a browser to double-check that the proxy is up.
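Another useful check is asking the proxy which models it exposes. LiteLLM serves an OpenAI-style model list, so the claude-3.5-sonnet alias should show up here:
curl http://localhost:4000/v1/models \
  -H "Authorization: Bearer $LITELLM_MASTER_KEY"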
Step 5: Use Claude CLI
Now that everything is in place, launch Claude with:
claude --model claude-3.5-sonnet
If everything went well, Claude will start, but it’ll actually be using OpenRouter’s Qwen model behind the scenes.
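If you’d rather test without the CLI in the loop, you can hit the proxy directly. LiteLLM exposes an Anthropic-compatible /v1/messages route (the same one Claude Code itself calls), so a minimal sketch of a raw request looks like this:
curl http://localhost:4000/v1/messages \
  -H "Authorization: Bearer $LITELLM_MASTER_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-3.5-sonnet",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
If JSON with Qwen-generated text comes back, the whole chain (alias → LiteLLM → OpenRouter) is working.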
Mistakes I Made
1. The model name didn’t match
At one point, LiteLLM showed the model as /claude-3.5-sonnet instead of claude-3.5-sonnet. That happened because I accidentally added a leading slash. Claude CLI failed with a “model not found” error.
Fix: Make sure the model_name in your config.yaml exactly matches the model you pass on the command line with claude --model claude-3.5-sonnet:
model_name: "claude-3.5-sonnet"
2. The wrong model ID for OpenRouter
I tried using "moonshot/kimi-k2:free", which OpenRouter doesn’t support, and it threw a “model not found” error.
Fix: Query https://openrouter.ai/api/v1/models with your API key to check which model IDs are available (and which are free).
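For example, here’s one way to pull down just the free model IDs (assuming you have jq installed; the endpoint should also respond without the auth header):
curl -s https://openrouter.ai/api/v1/models \
  -H "Authorization: Bearer $OPENROUTER_API_KEY" \
  | jq -r '.data[].id | select(endswith(":free"))'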
3. Database errors
At one point I saw:
ModuleNotFoundError: No module named 'prisma'
This happened when I tried running features that require LiteLLM’s database integration without having set up a database.
Fix: Add this to your config to disable DB usage:
database_type: none
4. “No API key provided” even though I had one
This was just a case of forgetting to run:
source ~/.zshrc
after setting environment variables. Don’t skip that step.
Wrapping Up
Now I can run Claude CLI locally, while routing it through any OpenRouter-supported model, all while keeping the Anthropic API format that Claude Code expects.
This setup is super flexible. You could point Claude to Mistral, Moonshot, or even your own LLM behind a custom endpoint as long as LiteLLM can proxy to it.
There go 4 hours of my life I’ll never get back :)