If you don't have a file at `.claude/settings.local.json`, create one with:

```json
{
  "env": {
  },
  "permissions": {
    "allow": [],
    "deny": [],
    "ask": []
  }
}
```
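If you prefer the command line, a minimal sketch for creating that file (assuming you run it from your project root) could be:

```shell
# Create the .claude directory if needed, then write the minimal
# settings file shown above (run from the project root).
mkdir -p .claude
cat > .claude/settings.local.json <<'EOF'
{
  "env": {
  },
  "permissions": {
    "allow": [],
    "deny": [],
    "ask": []
  }
}
EOF
```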
Because OpenRouter only exposes an OpenAI-compatible endpoint, this proxy is what makes it compatible with the Anthropic API style (`/v1/messages`).
I'm testing this right now for the case where I exceed my daily quota.
Once you have the container running with

```shell
docker run -d -p 5000:3000 --env-file .env ghcr.io/kiyo-e/claude-code-proxy:latest
```

opening `localhost:5000` in a browser will show that the proxy is started and what is configured.

- Start Claude Code and it will use the proxy if you have switched to it.
- If you switch back, you are on your Claude Code subscription again (restart the CLI or close/reopen the extension).
Use `./switch-model.sh` to toggle between the Claude Code subscription and GLM via OpenRouter.
The script sets `ANTHROPIC_BASE_URL=http://localhost:5000` in the settings file Claude Code will use, or removes it.
The port can be adjusted as you like.
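The toggle could be sketched with jq roughly like this. This is only a sketch under the assumption that the script edits the `env` block of `.claude/settings.local.json`; the actual `switch-model.sh` may work differently:

```shell
# Sketch: add or remove ANTHROPIC_BASE_URL in the env block of
# .claude/settings.local.json. Requires jq on PATH.
SETTINGS=.claude/settings.local.json

# Make sure a settings file exists so the block is self-contained.
mkdir -p .claude
[ -f "$SETTINGS" ] || echo '{"env":{},"permissions":{"allow":[],"deny":[],"ask":[]}}' > "$SETTINGS"

if jq -e '.env.ANTHROPIC_BASE_URL' "$SETTINGS" >/dev/null; then
  # Proxy currently active: remove the override, back to the subscription.
  jq 'del(.env.ANTHROPIC_BASE_URL)' "$SETTINGS" > "$SETTINGS.tmp"
else
  # Proxy not active: point Claude Code at the local proxy.
  jq '.env.ANTHROPIC_BASE_URL = "http://localhost:5000"' "$SETTINGS" > "$SETTINGS.tmp"
fi
mv "$SETTINGS.tmp" "$SETTINGS"
```

Each run flips the setting; restart the CLI (or close/reopen the extension) afterwards so Claude Code picks up the change.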
Note

OpenRouter is not free if you change the models below, e.g. to 4.6.
MIT
- Works on Linux (Ubuntu 25.04 (Plucky))
Windows users

- Need to download jq and add it to PATH (or modify the script to point to the exe location)
- Either use WSL or run the script with

```shell
bash switch-model.sh
```
Note

The last version that probably supports this is 1.0.127, until it is fixed. The recent change to 2.x is a problem; probably the settings.local.json or the env handling changed.
Also

`/status` is missing in 2.0.5, so it is hard to test whether model selection via our base URL is working. I couldn't get it back to work yet. Pinning the VS Code version also did not work for me.
I need to check that again.
I've seen the version topic discussed in Claude Code issue #8522.
I also have to check the toggling again, and whether it is enough to close and reopen the Claude Code extension.