- Model ID: minimax-m2
- Provider: MiniMax
- Strengths: Long context, efficient inference, strong code generation
- Context: 1M tokens
Add to ~/.config/opencode/config.json:
```json
{
  "provider": "openai-compatible",
  "apiKey": "your-llmgateway-api-key",
  "baseUrl": "https://api.llmgateway.io/v1",
  "model": "minimax-m2"
}
```
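Before wiring this into opencode, you can confirm the key and base URL work by calling the gateway directly (the same check applies to the environment-variable setup below). A minimal sketch; the `/chat/completions` path and request shape follow the standard OpenAI-compatible API and are assumed to apply here:

```bash
# Send a one-off chat completion request through LLM Gateway.
curl https://api.llmgateway.io/v1/chat/completions \
  -H "Authorization: Bearer your-llmgateway-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "minimax-m2",
    "messages": [{"role": "user", "content": "Reply with OK if you can read this."}]
  }'
```

A successful JSON response means the same credentials will work from opencode.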
Or via environment:

```bash
export OPENCODE_PROVIDER="openai-compatible"
export OPENCODE_API_KEY="your-llmgateway-api-key"
export OPENCODE_BASE_URL="https://api.llmgateway.io/v1"
export OPENCODE_MODEL="minimax-m2"
```

Example invocation:

```bash
opencode --model minimax-m2 "analyze all files in this repository"
```

Well suited for:
- Large codebase analysis (1M context); see the sketch after this list
- Repository-wide refactoring
- Understanding legacy systems
- Multi-file code reviews
- Documentation generation for large projects
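As a sketch of the first use case above, a repository-wide pass can be issued as a single prompt; the prompt wording is purely illustrative:

```bash
# Single-shot, repository-wide analysis; the 1M-token context is what
# makes covering the whole tree in one request practical.
opencode --model minimax-m2 \
  "map the module structure of this repository and flag files with inconsistent error handling"
```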
Mid-tier - excellent for large context tasks.