Xcode 26's intelligence features can be useful; I mainly use them to generate unit tests or to get feedback on my code. I don't let them change my code, since in my opinion all models still have severe limitations, but they make a useful tool. Given that light use, I didn't want to pay £/$20 a month for ChatGPT or Claude, and those models also have low usage limits in Xcode. Z.AI's GLM-4.6 Coding Plan is currently quite capable and, at $3 a month, very cheap.
Xcode 26's Intelligence Mode doesn't work directly with Z.AI's GLM-4.6 API. If you try, you'll get an error: *"Provider is not valid, Models could not be fetched with the provided account details."*
This guide shows you how to set up a local proxy to use your $3/month GLM subscription with Xcode.
## Prerequisites

- Xcode 26 with Intelligence Mode
- Docker Desktop installed
- GLM Coding Plan subscription from z.ai/subscribe
- Your Z.AI API Key from z.ai
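Before going further, you can sanity-check the command-line prerequisites. A minimal Python sketch (the `tool_available` helper is my own, not part of Docker or Xcode) that just looks for the binaries on your PATH:

```python
# Pre-flight check for the prerequisites above (hypothetical helper).
# Only verifies the executables are on PATH, not their versions.
import shutil

def tool_available(name: str) -> bool:
    """Return True if an executable called `name` is on PATH."""
    return shutil.which(name) is not None

if __name__ == "__main__":
    for tool in ("docker", "xcodebuild"):
        status = "found" if tool_available(tool) else "missing"
        print(f"{tool}: {status}")
```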
## Step 1: Create a project folder

```bash
mkdir ~/glm-proxy
cd ~/glm-proxy
```

## Step 2: Create `docker-compose.yml`

```yaml
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    container_name: glm-proxy
    ports:
      - "4000:4000"
    volumes:
      - ./litellm_config.yaml:/app/config.yaml
    command: --config /app/config.yaml
    restart: unless-stopped
```

## Step 3: Create `litellm_config.yaml`

```yaml
model_list:
  - model_name: glm-4.6
    litellm_params:
      model: openai/glm-4.6
      api_base: https://api.z.ai/api/coding/paas/v4
      api_key: your-zai-api-key-here
```

Replace `your-zai-api-key-here` with your actual API key!

## Step 4: Start the proxy and verify

```bash
docker compose up -d
curl http://localhost:4000/v1/models
```

You should see JSON with the GLM-4.6 model listed.
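If you'd rather script that `/v1/models` check, here is a minimal Python sketch; the helper name and the empty-list fallback are my own choices, only the endpoint comes from the OpenAI-compatible API LiteLLM exposes:

```python
# Query the proxy's OpenAI-compatible /v1/models endpoint and return
# the advertised model IDs ([] if the proxy is unreachable).
import json
import urllib.request
from urllib.error import URLError

def list_proxy_models(base_url: str = "http://localhost:4000") -> list[str]:
    try:
        with urllib.request.urlopen(f"{base_url}/v1/models", timeout=5) as resp:
            payload = json.load(resp)
    except (URLError, OSError):
        return []  # container not running, wrong port, etc.
    return [m["id"] for m in payload.get("data", [])]

if __name__ == "__main__":
    models = list_proxy_models()
    print("glm-4.6 available" if "glm-4.6" in models
          else "glm-4.6 not found; is the container running?")
```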
## Step 5: Configure Xcode

- Open Xcode → Settings (⌘,)
- Go to Intelligence tab
- Click "Add a Model Provider"
- Select "Locally Hosted"
- Enter Port: `4000`
- Click Save
GLM-4.6 should now appear in your model list!
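Xcode talks to the proxy over the standard OpenAI chat-completions API, so you can reproduce the kind of request it sends yourself. A sketch, assuming the setup above (the `build_chat_request` helper is mine; only the URL path and JSON shape come from the OpenAI-compatible API):

```python
# Build the same style of OpenAI chat request Xcode sends to the proxy.
# Actually sending it is left to the caller (urllib, curl, ...).
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    body = json.dumps({
        "model": model,  # must match model_name in litellm_config.yaml
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("http://localhost:4000", "glm-4.6", "Say hello in one word.")
print(req.full_url)  # http://localhost:4000/v1/chat/completions
```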
## Troubleshooting

If the container keeps restarting, check the logs:

```bash
docker logs glm-proxy
```

If you see `IsADirectoryError: Is a directory: '/app/config.yaml'`, Docker created a directory at the mount point instead of mounting the file. This happens when `litellm_config.yaml` didn't exist the first time the container started, because Docker creates a directory at a missing bind-mount source path. To fix it:

```bash
docker compose down
rm -rf litellm_config.yaml
# Recreate the litellm_config.yaml file (step 3)
docker compose up -d
```
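A tiny pre-flight check you could run before `docker compose up` to catch this pitfall early; the `mount_source_ok` helper is hypothetical, not part of Docker or LiteLLM:

```python
# Guard against the IsADirectoryError pitfall: if litellm_config.yaml
# was missing at first start, Docker will have created a directory with
# that name, and the container then mounts a directory, not the file.
from pathlib import Path

def mount_source_ok(path: str = "litellm_config.yaml") -> bool:
    """True only if the mount source exists and is a regular file."""
    return Path(path).is_file()

if __name__ == "__main__":
    if not mount_source_ok():
        print("litellm_config.yaml is missing or a directory; recreate it (step 3)")
```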
## Stopping the proxy

```bash
cd ~/glm-proxy
docker compose down
```

**Cost:** $3/month for the GLM Coding Plan vs $20+/month for ChatGPT or Claude (over $200 a year less).
**Credits:** Uses the [LiteLLM](https://github.com/BerriAI/litellm) proxy.