← Back to AI Hub

🚀 OpenClaw Agent Setup

System Requirements

OpenClaw runs on Linux (arm64, amd64, armv7) and macOS. Minimum requirement: Node.js 22 or newer.

Installation

Linux (ARM64 / Raspberry Pi 5, Orange Pi and AMD64 / x86 desktop, server)

The steps are identical on both architectures:

# Update system
sudo apt update && sudo apt upgrade -y

# Install Node.js 22+
curl -fsSL https://deb.nodesource.com/setup_22.x | sudo -E bash -
sudo apt install -y nodejs

# Install OpenClaw
npm install -g openclaw

# Initialize agent
openclaw init --non-interactive
openclaw pairing approve telegram CODE_HERE

# Start
openclaw agent run
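openclaw agent run keeps the agent in the foreground. To have it survive reboots and crashes on a Linux box, a systemd unit is the usual approach. A minimal sketch, assuming the unit name, the User=, and the binary path (check yours with `which openclaw`):

```ini
# /etc/systemd/system/openclaw.service (hypothetical name and paths; adjust for your system)
[Unit]
Description=OpenClaw agent
After=network-online.target
Wants=network-online.target

[Service]
User=pi
ExecStart=/usr/bin/openclaw agent run
Restart=on-failure
RestartSec=5

[Install]
WantedBy=multi-user.target
```

Enable it with `sudo systemctl enable --now openclaw`, then follow logs with `journalctl -u openclaw -f`.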

macOS

# Install Homebrew if needed
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

# Install Node.js
brew install node@22

# Install OpenClaw
npm install -g openclaw

# Initialize agent
openclaw init --non-interactive
openclaw pairing approve telegram CODE_HERE

# Start
openclaw agent run
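If the agent fails to start, a quick sanity check is to confirm the toolchain first. A sketch that verifies the Node.js 22+ requirement from the steps above:

```shell
# Extract the major version from `node -v` (e.g. v22.11.0 -> 22)
major=$(node -v | sed 's/^v//' | cut -d. -f1)

if [ "$major" -ge 22 ]; then
  echo "Node $major OK"
else
  echo "Need Node 22+ (found $major)"
fi
```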

Configuration

Edit ~/.openclaw/openclaw.json:

{
  "agent": {
    "name": "MyAgent",
    "model": "anthropic/claude-sonnet-4-5"
  },
  "channels": {
    "telegram": { "enabled": true }
  },
  "plugins": {
    "entries": {
      "telegram": { "enabled": true }
    }
  }
}
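A hand-edited config file is easy to break with a stray comma. Before restarting the agent, you can check that the file still parses as valid JSON with Python's stdlib formatter (no extra tools assumed):

```shell
# Exits non-zero and points at the offending position if the JSON is malformed
python3 -m json.tool ~/.openclaw/openclaw.json > /dev/null && echo "config OK"
```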

Model Selection & Local Inference

OpenClaw can use both cloud APIs (Anthropic Claude, OpenAI) and local models (Ollama, vLLM).

Cloud Models (Default)

Out of the box, OpenClaw sends requests to a cloud provider (Anthropic Claude or OpenAI); you only need the corresponding provider API key.

Local Models

To reduce inference costs (up to 85%), run models locally with Ollama:

# Install Ollama
curl https://ollama.ai/install.sh | sh

# Download a model
ollama pull qwen2.5:7b

# OpenClaw will auto-detect on localhost:11434
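To confirm that Ollama is actually listening where OpenClaw expects it, you can query its tags endpoint, which lists locally available models (`localhost:11434` is the default from the auto-detect note above):

```shell
# A non-empty result means the pulled model is ready for OpenClaw to use
curl -s http://localhost:11434/api/tags | grep -o '"name":"[^"]*"'
```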

Hardware & Model Recommendations

See the Local LLM Inference Guide, which covers 20+ hardware tiers (Raspberry Pi to H100) with matching model recommendations.

Configure Local Inference in OpenClaw

Edit openclaw.json:

{
  "agent": {
    "name": "MyAgent",
    "model": "ollama/qwen2.5:7b"
  },
  "providers": {
    "ollama": {
      "endpoint": "http://localhost:11434",
      "fallback": "anthropic/claude-sonnet-4-5"
    }
  }
}

Setting agent.model to an ollama/ model switches inference to the local endpoint; fallback names the cloud model used if Ollama is down.


Troubleshooting

← Back to AI Hub