CLI Provider for Pi

Overview

This design document describes integrating external CLI tools (like Claude CLI, Codex CLI) as a new Provider type in pi. The goal is to create an abstraction layer that takes streaming JSON output from external CLIs and converts it into pi's native AssistantMessageEvent format.

Background

Assistant App's CLI Integration

The assistant app (packages/agent-server) already integrates with pi CLI via piCliChat.ts. Key observations:

  1. Spawning: Runs pi --mode json -p <message> to get JSONL streaming output
  2. Event parsing: Parses each JSON line and extracts:
    • message_update events containing assistantMessageEvent with text/thinking deltas
    • tool_execution_start, tool_execution_update, tool_execution_end events
    • session header for session ID/cwd
  3. Callbacks: Converts events to assistant app's internal format via callbacks (onTextDelta, onToolCallStart, etc.)

Pi's Provider Architecture

Pi's @mariozechner/pi-ai package defines providers via:

  1. Api type: Union of supported APIs (e.g., "anthropic-messages", "openai-completions")
  2. Model<TApi> interface: Configuration for a specific model including baseUrl, api type, etc.
  3. stream() function: Dispatches to provider-specific stream functions
  4. AssistantMessageEventStream: Async iterable that emits AssistantMessageEvents

Design

New API Type: external-cli

Add a new API type for external CLI providers:

// In types.ts
export type Api =
  | "openai-completions"
  | "openai-responses"
  // ... existing APIs ...
  | "external-cli";  // NEW

External CLI Options

// In providers/external-cli.ts
export interface ExternalCliOptions extends StreamOptions {
  /** Path to the CLI executable (e.g., "claude", "codex", "/usr/local/bin/my-cli") */
  executable?: string;
  /** Extra CLI arguments to append */
  extraArgs?: string[];
  /** Working directory for the CLI process */
  workdir?: string;
  /** Environment variables to set/override for the CLI process */
  env?: Record<string, string>;
  /** Timeout in milliseconds (default: none) */
  timeout?: number;
  /** Session ID for CLIs that support session persistence */
  cliSessionId?: string;
}

Model Configuration

// Example model definition for Claude CLI
const claudeCliModel: Model<"external-cli"> = {
  id: "claude-cli",
  name: "Claude CLI",
  api: "external-cli",
  provider: "anthropic-cli",
  baseUrl: "",  // Not used for CLI
  reasoning: true,
  input: ["text", "image"],
  cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0 },  // Cost handled by CLI
  contextWindow: 200000,
  maxTokens: 8192,
  // CLI-specific configuration stored in model
  cliConfig: {
    executable: "claude",
    outputFormat: "jsonl",  // or "sse" for server-sent events
  }
};
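
The cliConfig field used above is not part of pi's existing Model interface, so the provider needs to define it. A minimal shape, sketched from the fields this document actually uses (executable, outputFormat, modelArg); how it is attached to Model is an open design choice:

// In providers/external-cli.ts (sketch)
export interface ExternalCliConfig {
  /** Executable name or absolute path, e.g. "claude" or "/usr/local/bin/codex" */
  executable: string;
  /** Wire format of the CLI's streaming output */
  outputFormat?: "jsonl" | "sse";
  /** Value for the CLI's --model flag when it differs from the CLI default */
  modelArg?: string;
}

// Models with api === "external-cli" carry the extra configuration
export type ExternalCliModel = Model<"external-cli"> & { cliConfig?: ExternalCliConfig };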

Event Stream Protocol

The CLI provider expects the external CLI to emit JSONL events, in one of two protocol flavors:

Protocol A: Pi-style Events (preferred)

CLIs that output pi-compatible events:

{"type":"session","id":"abc123","cwd":"/path/to/project"}
{"type":"message_start","message":{"role":"assistant","content":[],...}}
{"type":"message_update","message":{...},"assistantMessageEvent":{"type":"text_delta","delta":"Hello","contentIndex":0}}
{"type":"tool_execution_start","toolCallId":"call_1","toolName":"Read","args":{"path":"file.txt"}}
{"type":"tool_execution_update","toolCallId":"call_1","toolName":"Read","partialResult":{"content":[{"type":"text","text":"..."}]}}
{"type":"tool_execution_end","toolCallId":"call_1","toolName":"Read","result":{...},"isError":false}
{"type":"message_end","message":{...}}

Protocol B: Simple Events (for simpler CLIs)

Minimal protocol for CLIs without full event support:

{"type":"text","delta":"Hello"}
{"type":"thinking","delta":"Let me think..."}
{"type":"tool_call","id":"call_1","name":"bash","arguments":{"command":"ls"}}
{"type":"tool_result","id":"call_1","content":"file1.txt\nfile2.txt"}
{"type":"done"}

Stream Implementation

// In providers/external-cli.ts
export function streamExternalCli(
  model: Model<"external-cli">,
  context: Context,
  options?: ExternalCliOptions,
): AssistantMessageEventStream {
  const stream = new AssistantMessageEventStream();

  (async () => {
    const output: AssistantMessage = {
      role: "assistant",
      content: [],
      api: "external-cli",
      provider: model.provider,
      model: model.id,
      usage: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0, totalTokens: 0, cost: {...} },
      stopReason: "stop",
      timestamp: Date.now(),
    };

    try {
      const child = spawnCli(model, context, options);
      stream.push({ type: "start", partial: output });

      for await (const event of parseCliOutput(child.stdout)) {
        const converted = convertToAssistantEvent(event, output);
        if (converted) {
          stream.push(converted);
        }
      }

      const exitCode = await waitForExit(child);
      if (exitCode !== 0) {
        throw new Error(`CLI exited with code ${exitCode}`);
      }

      stream.push({ type: "done", reason: output.stopReason, message: output });
      stream.end();
    } catch (error) {
      output.stopReason = options?.signal?.aborted ? "aborted" : "error";
      output.errorMessage = error instanceof Error ? error.message : String(error);
      stream.push({ type: "error", reason: output.stopReason, error: output });
      stream.end();
    }
  })();

  return stream;
}
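
parseCliOutput, referenced above, turns the child's stdout into discrete events. A line-buffered sketch using Node's readline; the decision to silently skip non-JSON lines is an assumption:

// In utils/cli-output-parser.ts (sketch)
import { createInterface } from "node:readline";
import type { Readable } from "node:stream";

export async function* parseCliOutput(stdout: Readable): AsyncGenerator<CliEvent> {
  // readline splits on newlines, which matches the JSONL framing
  const rl = createInterface({ input: stdout, crlfDelay: Infinity });
  for await (const line of rl) {
    const trimmed = line.trim();
    if (!trimmed) continue; // skip blank lines
    try {
      yield JSON.parse(trimmed) as CliEvent;
    } catch {
      // Non-JSON noise (banners, progress output) is ignored rather than failing the stream
    }
  }
}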

CLI Spawning

function spawnCli(
  model: Model<"external-cli">,
  context: Context,
  options?: ExternalCliOptions,
): ChildProcess {
  const executable = options?.executable || model.cliConfig?.executable || model.id;
  
  // Build arguments
  const args: string[] = [];
  
  // Most CLIs use --mode json for JSONL output
  args.push("--mode", "json");
  
  // Pass model if different from CLI default
  if (model.cliConfig?.modelArg) {
    args.push("--model", model.cliConfig.modelArg);
  }
  
  // Thinking/reasoning level
  if (options?.reasoning) {
    args.push("--thinking", options.reasoning);
  }
  
  // Extra args from options
  if (options?.extraArgs) {
    args.push(...options.extraArgs);
  }
  
  // Non-interactive mode with prompt
  // The prompt is serialized context (last user message)
  const lastUserMessage = context.messages.findLast(m => m.role === "user");
  const prompt = typeof lastUserMessage?.content === "string"
    ? lastUserMessage.content
    : lastUserMessage?.content.map(c => c.type === "text" ? c.text : "").join("\n") ?? "";
  
  args.push("-p", prompt);

  // Spawn with environment
  const env = { ...process.env, ...options?.env };
  const child = spawn(executable, args, {
    cwd: options?.workdir,
    env,
    stdio: ["pipe", "pipe", "pipe"],
    detached: process.platform !== "win32",
  });
  
  return child;
}
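
waitForExit and abort handling are not shown above. A sketch of both; the process-group kill is why spawnCli sets detached on POSIX, and the SIGTERM-then-SIGKILL escalation matches the approach noted under Implementation Notes:

// In providers/external-cli.ts (sketch)
import type { ChildProcess } from "node:child_process";

function waitForExit(child: ChildProcess): Promise<number> {
  return new Promise((resolve, reject) => {
    child.once("error", reject); // spawn failures such as ENOENT
    child.once("exit", (code, signal) => resolve(code ?? (signal ? 1 : 0)));
  });
}

function killCli(child: ChildProcess, graceMs = 5000): void {
  if (child.pid === undefined) return;
  // A negative pid targets the whole process group on POSIX (enabled by detached: true)
  const target = process.platform === "win32" ? child.pid : -child.pid;
  try { process.kill(target, "SIGTERM"); } catch { /* already exited */ }
  const timer = setTimeout(() => {
    try { process.kill(target, "SIGKILL"); } catch { /* already exited */ }
  }, graceMs);
  child.once("exit", () => clearTimeout(timer));
}

streamExternalCli would wire options?.signal to killCli (signal.addEventListener("abort", () => killCli(child))), which is what makes the "aborted" stop reason reachable.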

Event Conversion

function convertToAssistantEvent(
  cliEvent: CliEvent,
  output: AssistantMessage,
): AssistantMessageEvent | null {
  switch (cliEvent.type) {
    case "message_update": {
      // Pi-style event
      const inner = cliEvent.assistantMessageEvent;
      if (inner?.type === "text_delta") {
        updateTextContent(output, inner.contentIndex, inner.delta);
        return { type: "text_delta", contentIndex: inner.contentIndex, delta: inner.delta, partial: output };
      }
      if (inner?.type === "thinking_delta") {
        updateThinkingContent(output, inner.contentIndex, inner.delta);
        return { type: "thinking_delta", contentIndex: inner.contentIndex, delta: inner.delta, partial: output };
      }
      break;
    }
    
    case "text": {
      // Simple protocol
      const idx = ensureTextContent(output);
      (output.content[idx] as TextContent).text += cliEvent.delta;
      return { type: "text_delta", contentIndex: idx, delta: cliEvent.delta, partial: output };
    }
    
    // ... handle other event types
  }
  
  return null;
}
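
The content helpers referenced above are small. A sketch, assuming pi's TextContent and ThinkingContent blocks expose text and thinking string fields (the exact field names are assumptions):

// In providers/external-cli.ts (sketch)
function ensureTextContent(output: AssistantMessage): number {
  const last = output.content[output.content.length - 1];
  if (last?.type === "text") return output.content.length - 1;
  output.content.push({ type: "text", text: "" });
  return output.content.length - 1;
}

function updateTextContent(output: AssistantMessage, index: number, delta: string): void {
  // Grow the content array if the CLI jumps to a new content index
  while (output.content.length <= index) output.content.push({ type: "text", text: "" });
  (output.content[index] as TextContent).text += delta;
}

function updateThinkingContent(output: AssistantMessage, index: number, delta: string): void {
  while (output.content.length <= index) output.content.push({ type: "thinking", thinking: "" });
  (output.content[index] as ThinkingContent).thinking += delta;
}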

Files to Update

packages/ai/

  • src/types.ts - Add "external-cli" to Api type, add ExternalCliOptions to ApiOptionsMap
  • src/providers/external-cli.ts - New file: CLI provider implementation
  • src/utils/cli-output-parser.ts - New file: JSONL parsing utilities (parseCliOutput is sketched under Stream Implementation above)
  • src/stream.ts - Add a case for "external-cli" in the stream() function (see the sketch after these file lists)
  • src/index.ts - Export the new provider

packages/coding-agent/

  • src/core/model-registry.ts - Support loading CLI-based models from settings
  • src/config.ts - Add CLI provider configuration options

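The stream.ts change is a single new case in the dispatch. The surrounding switch is illustrative of the pattern described under Pi's Provider Architecture, not pi's actual source:

// In src/stream.ts (sketch)
export function stream(
  model: Model<Api>,
  context: Context,
  options?: StreamOptions,
): AssistantMessageEventStream {
  switch (model.api) {
    // ... existing cases ("anthropic-messages", "openai-completions", ...) ...
    case "external-cli":
      return streamExternalCli(model as Model<"external-cli">, context, options as ExternalCliOptions);
    default:
      throw new Error(`Unsupported api: ${model.api}`);
  }
}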

Configuration Example

In pi's settings file or model definitions:

{
  "models": [
    {
      "id": "claude-cli/sonnet",
      "name": "Claude CLI (Sonnet)",
      "api": "external-cli",
      "provider": "claude-cli",
      "cliConfig": {
        "executable": "claude",
        "modelArg": "claude-sonnet-4-20250514"
      }
    },
    {
      "id": "codex-cli",
      "name": "OpenAI Codex CLI",
      "api": "external-cli",
      "provider": "codex-cli",
      "cliConfig": {
        "executable": "codex",
        "outputFormat": "jsonl"
      }
    }
  ]
}

Session Ownership Model

CLI is Authoritative (Recommended Approach)

The external CLI owns the conversation state. Pi acts as a display/UI wrapper that shadows the session for its own purposes.

┌─────────────────────────────────────────────────────────┐
│  Pi (Display/UI Layer)                                  │
│  ┌─────────────────────────────────────────────────────┐│
│  │ pi session file (shadow copy)                       ││
│  │ - mirrors CLI events for display                    ││
│  │ - stores pi-specific metadata (labels, branches)    ││
│  └─────────────────────────────────────────────────────┘│
│                          │                              │
│                          ▼                              │
│  ┌─────────────────────────────────────────────────────┐│
│  │ CLI Provider                                        ││
│  │ - spawns CLI with --session <id>                    ││
│  │ - streams events → pi session file                  ││
│  │ - on resume: CLI loads its own session              ││
│  └─────────────────────────────────────────────────────┘│
└─────────────────────────────────────────────────────────┘
                           │
                           ▼
┌─────────────────────────────────────────────────────────┐
│  External CLI (claude, codex, etc.)                     │
│  - owns conversation state                              │
│  - owns tools (Read, Bash, Edit, Write)                 │
│  - manages its own session files                        │
└─────────────────────────────────────────────────────────┘

Key Principles

  1. CLI owns tools: The CLI has its own tools and invokes them directly. Pi does not expose tools to the CLI - it just displays tool execution events.

  2. CLI owns context: The CLI handles context management, compaction, and LLM communication. Pi doesn't reconstruct context for the LLM.

  3. Pi shadows for display: Pi writes CLI events to its own session format (a minimal writer is sketched after this list) for:

    • Displaying conversation history in the UI
    • Supporting pi-specific features (labels, search, branches)
    • Fast session list/preview without spawning CLI
  4. Resume via CLI: On session resume, pi spawns CLI with --session X --continue. The CLI loads its own history and handles context.
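
Principle 3 boils down to appending every CLI event to a pi session file as it arrives. A minimal sketch, assuming a JSONL shadow file whose first line is a header recording the CLI session ID (pi's real session format may differ):

// Shadow-session writer (sketch)
import { appendFileSync, existsSync, writeFileSync } from "node:fs";

export function appendToShadow(shadowPath: string, cliSessionId: string, event: CliEvent): void {
  if (!existsSync(shadowPath)) {
    // Header line records which CLI session this shadow mirrors (see "Session ID mapping" under Implementation Notes)
    writeFileSync(shadowPath, JSON.stringify({ type: "header", cliSessionId }) + "\n");
  }
  appendFileSync(shadowPath, JSON.stringify(event) + "\n");
}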

Session Flow

Start new session:

  1. Pi generates session ID (e.g., abc123)
  2. Spawns: claude --session abc123 --mode json -p "prompt"
  3. CLI creates its session, streams JSONL events
  4. Pi writes events to shadow session file

Resume session:

  1. Pi loads shadow session file for immediate display
  2. Spawns: claude --session abc123 --continue --mode json
  3. CLI loads its session, ready for new prompts
  4. New events append to pi's shadow

Continue conversation:

  1. User types new prompt in pi
  2. Spawns: claude --session abc123 --mode json -p "new prompt" (argument construction for all three flows is sketched below)
  3. CLI continues from its session state
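
The three flows differ only in which flags are passed. A combined sketch; the flags are the ones used in the flows above, and whether a resume invocation also carries a -p prompt is left open here:

// Argument construction for the session flows (sketch)
function buildSessionArgs(sessionId: string, opts: { resume?: boolean; prompt?: string }): string[] {
  const args = ["--session", sessionId, "--mode", "json"];
  if (opts.resume) args.push("--continue"); // resume: the CLI reloads its own history
  if (opts.prompt !== undefined) args.push("-p", opts.prompt); // start or continue with a new prompt
  return args;
}

// Start:    buildSessionArgs("abc123", { prompt: "prompt" })
// Resume:   buildSessionArgs("abc123", { resume: true })
// Continue: buildSessionArgs("abc123", { prompt: "new prompt" })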

Handling Direct CLI Use (Edge Case)

If the user runs the CLI directly, outside of pi, pi's shadow copy becomes stale.

Detection (future): On resume, pi could request CLI's session state:

claude --session abc123 --export-history --format jsonl

Reconciliation strategies:

  1. Append-only merge: If CLI has entries pi doesn't, append them
  2. Mark divergence: If histories differ, show warning, offer to resync
  3. Lazy sync: Show shadow immediately, reconcile when CLI outputs session header

MVP approach: No automatic merge. The CLI is authoritative - if the user used the CLI directly, they can continue from the CLI's state. Pi's shadow may be stale but is still useful for search/history.

Phase 2: Session Reconciliation (Deferred)

Future work could add:

  • --export-history support to detect/merge divergent sessions
  • Bidirectional sync when user switches between pi and direct CLI use
  • Conflict resolution UI for divergent histories

Implementation Notes

  1. Image support: Pi stores images as base64 in memory/session files. For CLI providers:

    • Current paste flow: write temp file → insert path → on submit, read back as base64
    • CLI provider flow: write temp file → insert path → on submit, pass @filepath to CLI directly
    • Detection: Check model.api === "external-cli" at submit time in processFileArguments or spawnCli (see the sketch after this list)
    • The Model interface's api field is accessible via session.model?.api
  2. Error handling: Parse stderr for known patterns (rate limits, auth errors), surface raw error otherwise

  3. Streaming interruption: SIGTERM with fallback to SIGKILL after timeout (match existing piCliChat.ts approach)

  4. Session ID mapping: Use CLI session IDs directly - pi stores a reference to the CLI session ID in its shadow file header

  5. Multi-CLI sessions: Not supported in MVP - a pi session is bound to one CLI provider for its lifetime. Switching from Claude CLI to Codex CLI mid-session would lose context (the new CLI has no history). Model switching within the same CLI (e.g., Sonnet → Opus) works if the CLI supports it.
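
For implementation note 1, the submit-time branch reduces to a check on the session model's api field. A sketch; the function name and return shape are hypothetical, not existing pi code:

// Submit-time handling of a pasted image path (sketch)
import { readFileSync } from "node:fs";

function imageArgumentForSubmit(imagePath: string, modelApi: Api | undefined): { kind: "cli-path" | "base64"; value: string } {
  if (modelApi === "external-cli") {
    // CLI providers receive the @filepath reference; the CLI reads the file itself
    return { kind: "cli-path", value: `@${imagePath}` };
  }
  // Other providers keep the current flow: read the temp file back and inline it as base64
  return { kind: "base64", value: readFileSync(imagePath).toString("base64") };
}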
