DeepSeek Chat with OpenCode via LLM Gateway

Model Overview

  • Model ID: deepseek-chat
  • Provider: DeepSeek
  • Strengths: Cost-effective, strong coding capability, open-weights model
  • Context: 64K tokens

Configuration

Add to ~/.config/opencode/config.json:

{
  "provider": "openai-compatible",
  "apiKey": "your-llmgateway-api-key",
  "baseUrl": "https://api.llmgateway.io/v1",
  "model": "deepseek-chat"
}

Or via environment variables:

export OPENCODE_PROVIDER="openai-compatible"
export OPENCODE_API_KEY="your-llmgateway-api-key"
export OPENCODE_BASE_URL="https://api.llmgateway.io/v1"
export OPENCODE_MODEL="deepseek-chat"

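Before launching OpenCode, you can sanity-check the key and model with a direct request to the gateway. This is a minimal sketch that assumes the gateway exposes the standard OpenAI-style chat completions endpoint under the base URL above (implied by the openai-compatible provider setting); the exact path is not confirmed by this guide:

# Assumes OPENCODE_API_KEY was exported as shown above
curl https://api.llmgateway.io/v1/chat/completions \
  -H "Authorization: Bearer $OPENCODE_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-chat",
    "messages": [{"role": "user", "content": "Say hello in one word."}]
  }'

A successful JSON response confirms the key and model name are valid before wiring them into OpenCode.
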
Usage

opencode --model deepseek-chat "add error handling to this function"

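If the environment variables above are exported, the --model flag should be redundant, since OPENCODE_MODEL already selects deepseek-chat. This is an assumption based on the environment configuration shown earlier, not a documented guarantee:

# Assumes OPENCODE_PROVIDER, OPENCODE_API_KEY, OPENCODE_BASE_URL, and OPENCODE_MODEL are exported
opencode "explain what this regular expression matches"
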
Best Use Cases

  • Day-to-day coding tasks on a budget
  • Code generation and completion
  • Bug fixes and simple refactoring
  • Learning and experimentation
  • High-volume coding assistance

Pricing Tier

Budget - excellent value for standard coding tasks.
