Models

OpenClaw supports built-in providers and custom providers, which can be combined for optimal cost and performance. This guide also covers connecting to third-party API relay services (using AIGoCode as an example).

Built-in Providers

  • anthropic (Claude)
  • openai (GPT)
  • openrouter
  • minimax
  • zai
  • moonshot
  • ollama
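
Built-in providers need no custom provider block; you can point the default model at one directly. A minimal sketch, assuming the provider's credentials are already configured (the MiniMax model ID is reused from the alias example below):

```json
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "minimax/MiniMax-M2.1"
      }
    }
  }
}
```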

Multi-Model Aliases

Register multiple models with aliases in your config:

{
  agents: {
    defaults: {
      models: {
        "anthropic/claude-opus-4-6": { alias: "opus" },
        "anthropic/claude-sonnet-4-5": { alias: "sonnet" },
        "openai/gpt-5.2": { alias: "gpt" },
      },
      model: {
        primary: "anthropic/claude-sonnet-4-5",
        fallbacks: ["minimax/MiniMax-M2.1"],
      },
    },
  },
}

Switch models in chat:

/model opus
/model gpt

Third-Party Relay Services (AIGoCode Example)

The official Anthropic API is restricted in China. Third-party relay services offer domestic access with pay-per-use pricing. AIGoCode supports Claude Opus 4.6, GPT, Codex, Gemini, and more.

Quickest Method: Let OpenClaw Configure It

In your Telegram/Feishu/Discord chat, send:

Help me configure a third-party Claude relay:
- Base URL: https://api.aigocode.com
- API Key: sk-your-key
- Model ID: claude-opus-4-6

Use the models.providers custom provider approach, set api type to anthropic-messages, then switch to this as the default model.

OpenClaw will update the config file, restart the gateway, and verify.

Manual Configuration

Open your config file:

nano ~/.openclaw/openclaw.json

Add a custom provider:

{
  "models": {
    "providers": {
      "aigocode": {
        "baseUrl": "https://api.aigocode.com",
        "apiKey": "sk-your-key",
        "api": "anthropic-messages",
        "models": [
          {
            "id": "claude-opus-4-6",
            "name": "Claude Opus 4.6 (AIGoCode)",
            "reasoning": true,
            "input": ["text", "image"],
            "contextWindow": 200000,
            "maxTokens": 16384
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "aigocode/claude-opus-4-6"
      }
    }
  }
}
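
The fallbacks syntax from the alias example above also works here, so an outage at the relay fails over to another configured model. A sketch (replace the fallback with any model you actually have configured):

```json
"agents": {
  "defaults": {
    "model": {
      "primary": "aigocode/claude-opus-4-6",
      "fallbacks": ["minimax/MiniMax-M2.1"]
    }
  }
}
```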

Then restart:

openclaw gateway restart

Critical Configuration Notes

Model Type              | api Field
------------------------|------------------------------------
Claude series           | anthropic-messages
GPT / OpenAI-compatible | openai-responses
Codex series            | openai-responses (NOT completions)

Do NOT include /v1 in baseUrl — OpenClaw appends it automatically. Adding it creates /v1/v1 → 404.
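
Concretely, the difference in the provider block (JSON5-style comments for illustration):

```json5
// Wrong: requests go to https://api.aigocode.com/v1/v1/... and return 404
"baseUrl": "https://api.aigocode.com/v1",

// Right: OpenClaw appends /v1 itself
"baseUrl": "https://api.aigocode.com",
```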

Use the custom-provider format for the model name: aigocode/claude-opus-4-6, NOT anthropic/claude-opus-4-6. The latter resolves to the built-in Anthropic provider, whose credentials aren't configured → No API key found for provider "anthropic".

Common Errors

Error                                     | Cause                                           | Fix
------------------------------------------|-------------------------------------------------|----------------------------------------
No API key found for provider "anthropic" | Model name uses anthropic/... format            | Use the customProvider/modelId format
404                                       | baseUrl includes /v1                            | Remove /v1 from baseUrl
403                                       | Key restricted to specific clients              | Check with the relay provider
Unknown model: xxx                        | Wrong model ID, or not declared in models array | Verify the ID spelling and the models array

Token Usage

Check usage from the CLI:

openclaw status
openclaw dashboard

Or ask your bot: "How many tokens have you used?"

China-Based Provider Config Examples

Ready-to-use minimal configurations for popular Chinese LLM providers. Copy and paste into your models.providers block.

Moonshot / Kimi

"moonshot": {
  "baseUrl": "https://api.moonshot.cn",
  "apiKey": "sk-your-key",
  "api": "openai-responses",
  "models": [
    { "id": "moonshot-v1-8k", "name": "Kimi 8K" },
    { "id": "moonshot-v1-32k", "name": "Kimi 32K" },
    { "id": "moonshot-v1-128k", "name": "Kimi 128K" }
  ]
}

Zhipu GLM

"zhipu": {
  "baseUrl": "https://open.bigmodel.cn/api/paas",
  "apiKey": "your-glm-key",
  "api": "openai-responses",
  "models": [
    { "id": "glm-4", "name": "GLM-4" },
    { "id": "glm-4-flash", "name": "GLM-4 Flash (free tier)" }
  ]
}

MiniMax

"minimax": {
  "baseUrl": "https://api.minimax.chat",
  "apiKey": "your-minimax-key",
  "api": "openai-responses",
  "models": [
    { "id": "MiniMax-M2.1", "name": "MiniMax M2.1" },
    { "id": "abab6.5s-chat", "name": "MiniMax ABAB 6.5S" }
  ]
}

Alibaba Qwen (Tongyi Qianwen)

"qwen": {
  "baseUrl": "https://dashscope.aliyuncs.com/compatible-mode",
  "apiKey": "sk-your-dashscope-key",
  "api": "openai-responses",
  "models": [
    { "id": "qwen-max", "name": "Qwen Max" },
    { "id": "qwen-plus", "name": "Qwen Plus" },
    { "id": "qwen-turbo", "name": "Qwen Turbo (fast)" }
  ]
}

DeepSeek

"deepseek": {
  "baseUrl": "https://api.deepseek.com",
  "apiKey": "sk-your-deepseek-key",
  "api": "openai-responses",
  "models": [
    { "id": "deepseek-chat", "name": "DeepSeek Chat" },
    { "id": "deepseek-reasoner", "name": "DeepSeek R1 (reasoning)" }
  ]
}

Tip: After configuring, switch with openclaw models set deepseek/deepseek-chat, or /model ds in chat (requires an alias configured as shown in Multi-Model Aliases above).


Switching Models

openclaw models set aigocode/claude-sonnet-4-5
openclaw gateway restart
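
Chat aliases work for custom-provider models too, combining the Multi-Model Aliases mechanism with the custom provider. A sketch (the alias name here is arbitrary):

```json
{
  "agents": {
    "defaults": {
      "models": {
        "aigocode/claude-sonnet-4-5": { "alias": "sonnet" }
      }
    }
  }
}
```

After restarting the gateway, /model sonnet selects it in chat.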