```json
{
  "amp.model.sonnet": true
}
```

This specifically switches Smart mode from Gemini 3 Pro Preview to Claude Sonnet 4.5.
```json
{
  "amp.internal.model": "anthropic:claude-sonnet-4-5-20250929"
}
```

This lets you pick any model using the `provider:model-name` format.
Note: `amp.internal.model` takes precedence if both settings are configured.
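For example, with both settings configured, a sketch like the following (an illustrative combination of the two keys above, using the Opus 4.1 ID from the Anthropic table below) would run Claude Opus 4.1, since `amp.internal.model` wins:

```json
{
  "amp.model.sonnet": true,
  "amp.internal.model": "anthropic:claude-opus-4-1-20250805"
}
```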
Anthropic models:

| Setting Value | Display Name | Context | Max Output |
|---|---|---|---|
| anthropic:claude-sonnet-4-5-20250929 | Claude Sonnet 4.5 | 200K | 64K |
| anthropic:claude-sonnet-4-20250514 | Claude Sonnet 4 | 200K | 32K |
| anthropic:claude-opus-4-1-20250805 | Claude Opus 4.1 | 200K | 32K |
| anthropic:claude-opus-4-20250514 | Claude Opus 4 | 200K | 32K |
| anthropic:claude-haiku-4-5-20251001 | Claude Haiku 4.5 | 200K | 64K |
| anthropic:claude-3-5-sonnet-20241022 | Claude 3.5 Sonnet | 200K | 8K |
| anthropic:claude-3-5-haiku-20241022 | Claude 3.5 Haiku | 200K | 8K |
Google Vertex AI (Gemini) models:

| Setting Value | Display Name | Context | Max Output |
|---|---|---|---|
| vertexai:gemini-3-pro-preview | Gemini 3 Pro Preview | 1M | 65K |
| vertexai:gemini-2.5-pro | Gemini 2.5 Pro | 1M | 65K |
| vertexai:gemini-2.5-flash | Gemini 2.5 Flash | 1M | 65K |
| vertexai:gemini-2.5-flash-lite | Gemini 2.5 Flash Lite | 1M | 65K |
| vertexai:gemini-2.5-flash-preview-09-2025 | Gemini 2.5 Flash (Preview) | 1M | 65K |
| vertexai:gemini-2.5-flash-lite-preview-09-2025 | Gemini 2.5 Flash Lite (Preview) | 1M | 65K |
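For instance, to pick one of the 1M-context Gemini models explicitly (an illustrative selection from the table above), you could set:

```json
{
  "amp.internal.model": "vertexai:gemini-2.5-flash"
}
```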
OpenAI models:

Note: Amp uses an internal registry (`Wm4`) for OpenAI model routing. Models not in this registry will silently fall back to `gpt-5.1` due to prefix matching.
Available models:
| Setting Value | Display Name | Context | Max Output |
|---|---|---|---|
| openai:gpt-5.1 | GPT-5.1 | 400K | 128K |
| openai:gpt-5-codex | GPT-5 Codex | 400K | 128K |
| openai:gpt-5-mini | GPT-5 Mini | 400K | 128K |
| openai:gpt-5-nano | GPT-5 Nano | 400K | 128K |
| openai:gpt-4.1 | GPT-4.1 | 1M | 32K |
| openai:gpt-4.1-mini | GPT-4.1 Mini | - | - |
| openai:gpt-4.1-nano | GPT-4.1 Nano | - | - |
| openai:gpt-4o | GPT-4o | 128K | 16K |
| openai:gpt-4o-mini | GPT-4o Mini | - | - |
| openai:o3 | o3 | 200K | - |
| openai:o3-mini | o3-mini | 200K | - |
| openai:o3-deep-research | o3 Deep Research | - | - |
| openai:o4-mini-deep-research | o4 Mini Deep Research | - | - |
Not available (will fall back to gpt-5.1):
| Setting Value | Fallback |
|---|---|
| openai:gpt-5.1-codex | gpt-5.1 |
| openai:openai/gpt-oss-120b | gpt-5.1 |
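Because the fallback is silent, double-check that the value you set appears in the registry table above. For example, this setting (an illustrative pick) resolves to GPT-5 Codex, whereas `openai:gpt-5.1-codex` would quietly run as `gpt-5.1`:

```json
{
  "amp.internal.model": "openai:gpt-5-codex"
}
```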
xAI models:

| Setting Value | Display Name | Context | Max Output |
|---|---|---|---|
| xai:grok-code | Grok Code | 256K | 32K |
| xai:grok-code-fast-1 | Grok Code Fast 1 | 256K | 32K |
Moonshot AI models:

| Setting Value | Display Name | Context | Max Output |
|---|---|---|---|
| moonshotai:kimi-k2-instruct-0905 | Kimi K2 Instruct | - | - |
OpenRouter models:

Note: OpenRouter supports passthrough to any model available on their platform. The model ID after `openrouter:` is passed directly to OpenRouter's API.
Pre-defined model:
| Setting Value | Display Name | Context | Max Output |
|---|---|---|---|
| openrouter:sonoma-sky-alpha | Sonoma Sky Alpha | 256K | 32K |
Passthrough to any OpenRouter model:
You can use any model available on OpenRouter by specifying its full model ID:
```json
{
  "amp.internal.model": "openrouter:anthropic/claude-3-opus-20240229"
}
```

Examples:

- openrouter:anthropic/claude-3-opus-20240229
- openrouter:meta-llama/llama-3.1-405b-instruct
- openrouter:google/gemini-pro-1.5
- openrouter:mistralai/mixtral-8x22b-instruct
Requires: the `amp.openrouter.apiKey` setting or the `OPENROUTER_API_KEY` environment variable.
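Putting those together, a minimal sketch of an OpenRouter configuration might look like this (the key value is a placeholder, and the setting is assumed to take the key as a plain string; you can instead export `OPENROUTER_API_KEY` in your environment):

```json
{
  "amp.internal.model": "openrouter:meta-llama/llama-3.1-405b-instruct",
  "amp.openrouter.apiKey": "<your-openrouter-api-key>"
}
```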
Cerebras models:

| Setting Value | Display Name | Context | Max Output |
|---|---|---|---|
| cerebras:zai-glm-4.6 | Z.ai GLM 4.6 | 131K | 40K |

Default models by mode:
| Mode | Default Model |
|---|---|
| Smart | vertexai:gemini-3-pro-preview |
| Free | anthropic:claude-haiku-4-5-20251001 |
| Dojo | anthropic:claude-sonnet-4-5-20250929 |
| Dojo (Free) | anthropic:claude-haiku-4-5-20251001 |