@MRKMKR
Created October 24, 2025 13:26
Using GLM-4.6 with Xcode 26 via LiteLLM Proxy

Overview

Xcode 26's intelligence features can be useful; I tend to use them to create unit tests or get feedback on my code. I don't let them change my code directly, as all models still have significant limitations in my opinion, but they are a helpful tool. Given this light use, I didn't want to pay £/$20 a month for ChatGPT or Claude, and those models have low usage limits in Xcode. Z.AI's GLM-4.6 coding plan is currently quite capable and very cheap at $3 a month.

The Problem

Xcode 26's Intelligence Mode doesn't work directly with Z.AI's GLM-4.6 API. You'll get an error: "Provider is not valid, Models could not be fetched with the provided account details"

This guide shows you how to set up a local proxy to use your $3/month GLM subscription with Xcode.

What You Need

  • Xcode 26 with Intelligence Mode
  • Docker Desktop installed
  • GLM Coding Plan subscription from z.ai/subscribe
  • Your Z.AI API Key from z.ai

Setup Steps

1. Create a project folder

mkdir ~/glm-proxy
cd ~/glm-proxy

2. Create docker-compose.yml

services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    container_name: glm-proxy
    ports:
      - "4000:4000"
    volumes:
      - ./litellm_config.yaml:/app/config.yaml
    command: --config /app/config.yaml
    restart: unless-stopped

3. Create litellm_config.yaml

model_list:
  - model_name: glm-4.6
    litellm_params:
      model: openai/glm-4.6
      api_base: https://api.z.ai/api/coding/paas/v4
      api_key: your-zai-api-key-here

Replace your-zai-api-key-here with your actual API key!

4. Start the proxy

docker compose up -d

5. Verify it's working

curl http://localhost:4000/v1/models

You should see JSON with the GLM-4.6 model listed.
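To also confirm the proxy can reach Z.AI end to end, you can send a chat completion through it. A quick sketch, assuming the container from step 4 is running on port 4000 (the prompt text is just an example):

```shell
# Send a minimal chat completion through the proxy; prints a fallback
# message instead if nothing is listening on port 4000
RESPONSE=$(curl -s --max-time 10 http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "glm-4.6", "messages": [{"role": "user", "content": "Say hello"}]}' \
  || echo "proxy not reachable on port 4000")
echo "$RESPONSE"
```

A successful response is an OpenAI-style JSON object with a choices array; an auth error here usually means the API key in litellm_config.yaml is wrong.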

Configure Xcode

  1. Open Xcode → Settings (⌘,)
  2. Go to Intelligence tab
  3. Click "Add a Model Provider"
  4. Select "Locally Hosted"
  5. Enter Port: 4000
  6. Click Save

GLM-4.6 should now appear in your model list!

Common Issue: Directory Error

If the container keeps restarting, check the logs:

docker logs glm-proxy

If you see IsADirectoryError: Is a directory: '/app/config.yaml':

docker compose down
rm -rf litellm_config.yaml
# Recreate the litellm_config.yaml file (step 3)
docker compose up -d

This happens when Docker creates a directory instead of mounting the file.
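Before deleting anything, you can check which state the mount source is in. A small sketch, run inside ~/glm-proxy:

```shell
# Distinguish the three possible states of the bind-mount source
if [ -d litellm_config.yaml ]; then
  STATE="directory"   # Docker created this - remove it and recreate the file
elif [ -f litellm_config.yaml ]; then
  STATE="file"        # the bind mount will work as intended
else
  STATE="missing"     # create the file first (step 3)
fi
echo "litellm_config.yaml is: $STATE"
```

Only the "directory" case needs the rm/recreate fix above; "file" means the mount is already correct.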

Stopping the Proxy

cd ~/glm-proxy
docker compose down

Cost: $3/month for GLM Coding Plan vs $20+/month for ChatGPT/Claude

Credits: Uses LiteLLM proxy
