Complete guide to integrating Z.AI Coding Plan with OpenClaw AI assistant
OpenClaw is a personal AI assistant that runs on your own devices and connects to various messaging platforms. It can be configured to use Z.AI’s GLM models through the Z.AI Coding Plan.
The GLM Coding Plan supports OpenClaw, but serves it with secondary scheduling and best-effort delivery. Coding Agent tasks take preemption priority, and under high load, OpenClaw tasks may be subject to fair-use measures such as dynamic queuing and rate limiting.

Installing and Configuring OpenClaw

1

Get API Key

2

Install OpenClaw

For a detailed installation guide, please refer to the official documentation

3

Set up OpenClaw

After running the installation commands above, the configuration process will start automatically. If it doesn’t start, you can run the following command to begin configuration:
openclaw onboard --install-daemon
If you have already initialized OpenClaw before, you can instead run openclaw config and select the model configuration. During onboarding, choose:
  • I understand this is powerful and inherently risky. Continue? | Choose ● Yes
  • Onboarding mode | Choose ● Quick Start
  • Model/auth provider | Choose ● Z.AI
4

Configure Z.AI Provider

After selecting Z.AI as the Model/auth provider, choose Coding-Plan-Global.
You will then be prompted to enter your API Key. Paste your Z.AI API Key and press Enter.
Note: The models currently supported in the Coding Plan are GLM-5, GLM-4.7, GLM-4.5-Air, GLM-4.6, GLM-4.5, GLM-4.5V, and GLM-4.6V. Please do not select other models, to avoid unexpected charges.
5

Complete Setup

Continue with the remaining OpenClaw feature configuration.
  • Select channel | Choose and configure what you need.
  • Configure skills | Choose and install what you need.
  • Finish setup
6

Interact with bot

After setup, the CLI will ask: "How do you want to hatch your bot?"
  • Choose ● Hatch in TUI (recommended)
Now you can start chatting with your bot in the Terminal UI. OpenClaw provides more channels for interacting with your bot, such as the Web UI, Discord, and Slack. You can set up these channels by referring to the official documentation: Channels Setup
  • For Web UI, you can access it by opening the Web UI (with token) link shown in the terminal.
7

After install

Verify everything is working:
openclaw doctor         # check for config issues
openclaw status         # gateway status
openclaw dashboard      # open the browser UI

For a detailed configuration guide, please refer to the official documentation

OpenClaw may involve security risks if misconfigured or deployed without proper access controls. Please refer to the official security documentation.

Switching to GLM-5-Turbo Model

If you are currently using OpenClaw and cannot switch to glm-5-turbo through the provider model selection flow, complete the zai provider setup above and then follow the configuration below to switch to the new GLM-5-Turbo model.
Add the glm-5-turbo model to the models.providers.zai.models array in the ~/.openclaw/openclaw.json file, after the last existing model entry (note that JSON array entries must be separated by commas):
{
  "id": "glm-5-turbo",
  "name": "GLM-5-Turbo",
  "reasoning": true,
  "input": [
    "text"
  ],
  "cost": {
    "input": 0,
    "output": 0,
    "cacheRead": 0,
    "cacheWrite": 0
  },
  "contextWindow": 204800,
  "maxTokens": 131072
}
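If you prefer to script this edit rather than modify the JSON by hand, it can be sketched with jq (assuming jq is installed). The example below runs against a small sample file so it is safe to try; point CFG at ~/.openclaw/openclaw.json for the real edit, and back the file up first.

```shell
# Sketch: append the glm-5-turbo entry to the zai provider's model array.
# Demonstrated on a temporary sample file, not your real config.
CFG=$(mktemp)
printf '%s' '{"models":{"providers":{"zai":{"models":[{"id":"glm-5"}]}}}}' > "$CFG"
jq '.models.providers.zai.models += [{
  "id": "glm-5-turbo",
  "name": "GLM-5-Turbo",
  "reasoning": true,
  "input": ["text"],
  "cost": {"input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0},
  "contextWindow": 204800,
  "maxTokens": 131072
}]' "$CFG" > "$CFG.new"
grep -c '"id"' "$CFG.new"   # prints 2: one "id" line per model entry
```

Using `+=` on the array keeps the existing entries intact and avoids the missing-comma mistake that hand-editing invites.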
Modify the default model: find agents.defaults.model.primary
"primary": "zai/glm-5",
Change to
"primary": "zai/glm-5-turbo",
Find the agents.defaults.models section and add:
"zai/glm-5-turbo": {}
Complete modified file snippet reference below:
  1. models.providers.zai.models section:
"models": [
  {
    "id": "glm-5",
    "name": "GLM-5",
    "reasoning": true,
    "input": ["text"],
    "cost": {"input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0},
    "contextWindow": 204800,
    "maxTokens": 131072
  },
  {
    "id": "glm-4.7",
    "name": "GLM-4.7",
    "reasoning": true,
    "input": ["text"],
    "cost": {"input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0},
    "contextWindow": 204800,
    "maxTokens": 131072
  },
  {
    "id": "glm-5-turbo",
    "name": "GLM-5-Turbo",
    "reasoning": true,
    "input": ["text"],
    "cost": {"input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0},
    "contextWindow": 204800,
    "maxTokens": 131072
  }
]
  2. agents.defaults.model section:
"model": {
  "primary": "zai/glm-5-turbo",
  "fallbacks": ["zai/glm-4.7"]
}
  3. agents.defaults.models section:
"models": {
  "zai/glm-5": {"alias": "GLM"},
  "zai/glm-4.7": {},
  "zai/glm-5-turbo": {}
}
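Putting the three snippets together, the sketch below shows how these sections nest inside ~/.openclaw/openclaw.json. Surrounding keys, other providers, and the existing glm-5 and glm-4.7 provider entries are omitted for brevity; verify the exact nesting against your own file.

```json
{
  "models": {
    "providers": {
      "zai": {
        "models": [
          {
            "id": "glm-5-turbo",
            "name": "GLM-5-Turbo",
            "reasoning": true,
            "input": ["text"],
            "cost": {"input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0},
            "contextWindow": 204800,
            "maxTokens": 131072
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "zai/glm-5-turbo",
        "fallbacks": ["zai/glm-4.7"]
      },
      "models": {
        "zai/glm-5": {"alias": "GLM"},
        "zai/glm-4.7": {},
        "zai/glm-5-turbo": {}
      }
    }
  }
}
```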
After completing the modifications, restart the gateway:
openclaw gateway restart
You can then use the glm-5-turbo model directly. Run openclaw tui in the terminal to enter the conversation; you should see that glm-5-turbo is the active model.

Advanced Configuration

Model Failover

Configure model failover to ensure reliability in ~/.openclaw/openclaw.json:
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "zai/glm-5",
        "fallbacks": ["zai/glm-4.7", "zai/glm-4.6", "zai/glm-4.5-air"]
      }
    }
  }
}  

Skills With ClawHub

A skill is just a folder with a SKILL.md file. If you want to add new capabilities to your OpenClaw agent, ClawHub is the easiest way to find and install skills.
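As an illustration, a minimal skill folder might contain a single SKILL.md like the sketch below. The front-matter fields and file layout here are assumptions for illustration; check the skill format in the official documentation.

```markdown
---
name: hello-skill
description: Hypothetical example skill that greets the user on request.
---

# Hello Skill

When the user asks for a greeting, reply with a short friendly hello
and mention that this response came from the hello-skill skill.
```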

Install ClawHub

npm i -g clawhub

Manage Skills

Search for skills
clawhub search "postgres backups"
Download new skills
clawhub install my-skill-pack
Update installed skills
clawhub update --all

Plugins

A plugin is just a small code module that extends OpenClaw with extra features (commands, tools, and Gateway RPC).
See what’s already loaded:
openclaw plugins list
Install an official plugin (example: Voice Call):
openclaw plugins install @openclaw/voice-call
Restart the Gateway
openclaw gateway restart

Troubleshooting

Common Issues

  1. API Key Authentication
    • Ensure your Z.AI API key is valid and is associated with an active GLM Coding Plan subscription
    • Check that the API key is properly set in the environment
  2. Model Availability
    • Verify that the GLM model is available in your region
    • Check the model name format
  3. Connection Issues
    • Ensure the OpenClaw gateway is running
    • Check network connectivity to Z.AI endpoints
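The checks above can be combined into a small script using only the commands shown in the "After install" step. This is a sketch; it assumes openclaw doctor and openclaw status exit non-zero on failure, which you should verify against your installed version.

```shell
# Hypothetical health-check sequence; degrades gracefully when the
# openclaw binary is not on PATH.
if command -v openclaw >/dev/null 2>&1; then
  openclaw doctor || echo "doctor reported config issues"
  openclaw status || echo "gateway may not be running"
  RESULT="checks completed"
else
  RESULT="openclaw not found on PATH"
fi
echo "$RESULT"
```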

Resources