Z.ai now has an official solution for using the package. See below.
First we need to create a custom provider for the Z.ai Anthropic API. Follow the instructions for adding a custom provider to opencode:

- Name it `zai-anthropic`
- Enter your API key

Then edit opencode.json and add this:
"provider": {
"zai-anthropic": {
"npm": "@ai-sdk/anthropic",
"options": {
"baseURL": "https://api.z.ai/api/anthropic/v1"
},
"models": {
"glm-4.5": {
"name": "glm-4.5"
}
}
}Make this model default if you wish:
"model": "zai-anthropic/glm-4.5",

@olafgeibig It was important for me to find the limit, and LiteLLM lets you count usage. Yesterday I used 32,000,000 tokens and made 540 calls to the model on my Lite plan, and even with that volume I did not hit the limit. I also tried the Air model; its tool calling worked very badly with Copilot.
Most likely they have an internal counting system; I did not find any information on the official website. Based on this information, the limit works out to roughly 1,200 model calls per 5 hours.
Compared to the alternatives, this is the best offer out there.