Models
OpenClaw supports built-in providers and custom providers, which can be combined for optimal cost and performance. This guide also covers connecting to third-party API relay services (using AIGoCode as an example).
Built-in Providers
- anthropic (Claude)
- openai (GPT)
- openrouter
- minimax
- zai
- moonshot
- ollama
Multi-Model Aliases
Register multiple models with aliases in your config:
{
  agents: {
    defaults: {
      models: {
        "anthropic/claude-opus-4-6": { alias: "opus" },
        "anthropic/claude-sonnet-4-5": { alias: "sonnet" },
        "openai/gpt-5.2": { alias: "gpt" },
      },
      model: {
        primary: "anthropic/claude-sonnet-4-5",
        fallbacks: ["minimax/MiniMax-M2.1"],
      },
    },
  },
}
Switch models in chat:
/model opus
/model gpt
Third-Party API Relay (Recommended for China)
The official Anthropic API has restrictions in China. Third-party relay services offer domestic access with pay-per-use pricing. AIGoCode supports Claude Opus 4.6, GPT, Codex, Gemini, and more.
Quickest Method: Let OpenClaw Configure It
In your Telegram/Feishu/Discord chat, send:
Help me configure a third-party Claude relay:
- Base URL: https://api.aigocode.com
- API Key: sk-your-key
- Model ID: claude-opus-4-6
Use the models.providers custom provider approach, set api type to anthropic-messages, then switch to this as the default model.
OpenClaw will update the config file, restart the gateway, and verify.
Manual Configuration
Open your config file:
nano ~/.openclaw/openclaw.json
Add a custom provider:
{
  "models": {
    "providers": {
      "aigocode": {
        "baseUrl": "https://api.aigocode.com",
        "apiKey": "sk-your-key",
        "api": "anthropic-messages",
        "models": [
          {
            "id": "claude-opus-4-6",
            "name": "Claude Opus 4.6 (AIGoCode)",
            "reasoning": true,
            "input": ["text", "image"],
            "contextWindow": 200000,
            "maxTokens": 16384
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "aigocode/claude-opus-4-6"
      }
    }
  }
}
Then restart:
openclaw gateway restart
Critical Configuration Notes
| Model Type | api Field |
|---|---|
| Claude series | anthropic-messages |
| GPT / OpenAI-compatible | openai-responses |
| Codex series | openai-responses (NOT completions) |
Do NOT include /v1 in baseUrl — OpenClaw appends it automatically. Adding it creates /v1/v1 → 404.
Use the custom-provider format for the model name — `aigocode/claude-opus-4-6`, NOT `anthropic/claude-opus-4-6`. The latter routes through the built-in Anthropic provider, whose credentials aren't configured, and fails with `No API key found for provider "anthropic"`.
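The two pitfalls above can be caught with a quick pre-flight check before restarting the gateway. `check_provider` is a hypothetical helper for illustration, not part of the OpenClaw CLI.

```python
# Sketch: pre-flight checks for a custom provider entry, covering the
# two pitfalls above (trailing /v1 and a wrongly namespaced model name).
def check_provider(name: str, provider: dict, primary: str) -> list[str]:
    problems = []
    # Pitfall 1: OpenClaw appends /v1 itself; a trailing /v1 yields /v1/v1 -> 404.
    if provider["baseUrl"].rstrip("/").endswith("/v1"):
        problems.append("baseUrl must not include /v1")
    # Pitfall 2: the primary model must be namespaced by the custom
    # provider's name, not by the built-in "anthropic" provider.
    declared = {m["id"] for m in provider.get("models", [])}
    prefix, _, model_id = primary.partition("/")
    if prefix != name:
        problems.append(f'primary should start with "{name}/"')
    elif model_id not in declared:
        problems.append(f'model "{model_id}" not declared in models array')
    return problems

provider = {
    "baseUrl": "https://api.aigocode.com/v1",  # wrong: trailing /v1
    "models": [{"id": "claude-opus-4-6"}],
}
# Flags both the /v1 suffix and the anthropic/ prefix:
print(check_provider("aigocode", provider, "anthropic/claude-opus-4-6"))
```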
Common Errors
| Error | Cause | Fix |
|---|---|---|
| `No API key found for provider "anthropic"` | Model name uses `anthropic/...` format | Use `customProvider/modelId` format |
| 404 | `baseUrl` includes `/v1` | Remove `/v1` from `baseUrl` |
| 403 | Key restricted to specific clients | Check with relay provider |
| `Unknown model: xxx` | Wrong model ID or not declared in `models` array | Verify ID spelling and `models` array |
Token Usage
openclaw status
openclaw dashboard
Or ask your bot: "How many tokens have you used?"
China-Based Provider Config Examples
Ready-to-use minimal configurations for popular Chinese LLM providers. Copy and paste into your providers block.
Moonshot / Kimi
"moonshot": {
  "baseUrl": "https://api.moonshot.cn",
  "apiKey": "sk-your-key",
  "api": "openai-responses",
  "models": [
    { "id": "moonshot-v1-8k", "name": "Kimi 8K" },
    { "id": "moonshot-v1-32k", "name": "Kimi 32K" },
    { "id": "moonshot-v1-128k", "name": "Kimi 128K" }
  ]
}
Zhipu GLM
"zhipu": {
  "baseUrl": "https://open.bigmodel.cn/api/paas",
  "apiKey": "your-glm-key",
  "api": "openai-responses",
  "models": [
    { "id": "glm-4", "name": "GLM-4" },
    { "id": "glm-4-flash", "name": "GLM-4 Flash (free tier)" }
  ]
}
MiniMax
"minimax": {
  "baseUrl": "https://api.minimax.chat",
  "apiKey": "your-minimax-key",
  "api": "openai-responses",
  "models": [
    { "id": "MiniMax-M2.1", "name": "MiniMax M2.1" },
    { "id": "abab6.5s-chat", "name": "MiniMax ABAB 6.5S" }
  ]
}
Alibaba Qwen (Tongyi Qianwen)
"qwen": {
  "baseUrl": "https://dashscope.aliyuncs.com/compatible-mode",
  "apiKey": "sk-your-dashscope-key",
  "api": "openai-responses",
  "models": [
    { "id": "qwen-max", "name": "Qwen Max" },
    { "id": "qwen-plus", "name": "Qwen Plus" },
    { "id": "qwen-turbo", "name": "Qwen Turbo (fast)" }
  ]
}
DeepSeek
"deepseek": {
  "baseUrl": "https://api.deepseek.com",
  "apiKey": "sk-your-deepseek-key",
  "api": "openai-responses",
  "models": [
    { "id": "deepseek-chat", "name": "DeepSeek Chat" },
    { "id": "deepseek-reasoner", "name": "DeepSeek R1 (reasoning)" }
  ]
}
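Rather than editing the file by hand, any of the snippets above can be merged into the config programmatically. The sketch below assumes the `~/.openclaw/openclaw.json` path and the `models.providers` layout from the manual-configuration example; `add_provider` is an illustrative helper, not an official API.

```python
# Sketch: merge a provider snippet into an existing openclaw.json,
# creating the models.providers block if it does not exist yet.
import json
from pathlib import Path

def add_provider(config_path: Path, name: str, provider: dict) -> None:
    cfg = json.loads(config_path.read_text(encoding="utf-8"))
    cfg.setdefault("models", {}).setdefault("providers", {})[name] = provider
    config_path.write_text(
        json.dumps(cfg, indent=2, ensure_ascii=False), encoding="utf-8"
    )

deepseek = {
    "baseUrl": "https://api.deepseek.com",
    "apiKey": "sk-your-deepseek-key",
    "api": "openai-responses",
    "models": [
        {"id": "deepseek-chat", "name": "DeepSeek Chat"},
        {"id": "deepseek-reasoner", "name": "DeepSeek R1 (reasoning)"},
    ],
}
# add_provider(Path.home() / ".openclaw" / "openclaw.json", "deepseek", deepseek)
```

Remember to restart the gateway afterwards so the new provider is picked up.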
Tip: After configuring, switch with `openclaw models set deepseek/deepseek-chat`, or `/model ds` in chat (requires an alias configured in `models.aliases`).
Switching Models
openclaw models set aigocode/claude-sonnet-4-5
openclaw gateway restart