OpenClaw: Run Your Own AI Assistant on Any Server

LemonData · February 26, 2026
#openclaw#self-hosted#ai-assistant#telegram#discord

Cloud AI assistants are convenient until they're not. Rate limits during peak hours. Data leaving your network. Monthly subscriptions that add up. No way to customize behavior beyond what the provider allows.

OpenClaw is a self-hosted AI assistant that runs on your own hardware. It connects to Telegram, Discord, or any chat platform, uses any AI model through a unified API, and keeps all conversation data on your machine.

What OpenClaw Does

At its core, OpenClaw is a gateway between chat platforms and AI models. You send a message on Telegram, OpenClaw routes it to your chosen AI model, and sends the response back.

But it goes further than a simple relay:

  • Multi-model support: Switch between GPT-4.1, Claude, DeepSeek, and local models mid-conversation
  • Persistent memory: Conversations persist across restarts with configurable context windows
  • MCP server support: Connect to external tools (databases, APIs, file systems) through the Model Context Protocol
  • Plugin system: Add custom commands, scheduled tasks, and integrations
  • Multi-user: Each user gets their own conversation history and model preferences
  • Image understanding: Send photos and get AI analysis (using vision-capable models)
  • Voice messages: Speech-to-text processing for voice inputs
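
The relay at the heart of the gateway is conceptually simple. As a rough sketch (not OpenClaw's actual source; the function and field names here are illustrative), an incoming chat message plus the stored history becomes an OpenAI-compatible request body:

```javascript
// Illustrative sketch of the gateway's relay step, not OpenClaw's real code.
// Builds an OpenAI-compatible /v1/chat/completions request body from the
// stored conversation history plus the new user message.
function buildCompletionRequest(history, userText, model) {
  return {
    model,
    messages: [...history, { role: "user", content: userText }],
  };
}

// A gateway would then POST this body to the configured baseUrl, roughly:
//   fetch(`${baseUrl}/chat/completions`, {
//     method: "POST",
//     headers: { Authorization: `Bearer ${key}`, "Content-Type": "application/json" },
//     body: JSON.stringify(buildCompletionRequest(history, text, model)),
//   });

const req = buildCompletionRequest(
  [{ role: "assistant", content: "Hi!" }],
  "What can you do?",
  "claude-sonnet-4-6"
);
console.log(req.messages.length); // 2
```

Because the request shape is the standard chat-completions format, swapping models (or providers) only changes the `model` string and the base URL, which is what makes mid-conversation model switching cheap.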

Architecture

Telegram/Discord ←→ OpenClaw Gateway ←→ AI API (LemonData/OpenAI/Local)
                         │
                    ┌────┴────┐
                    │ Plugins │
                    │ MCP     │
                    │ Memory  │
                    └─────────┘

OpenClaw runs as a single Node.js process. No database required for basic usage (conversations stored as JSON files). For production deployments, it supports persistent volumes on Kubernetes.

Quick Start (5 Minutes)

Option 1: Docker (Recommended)

# Create config directory
mkdir -p ~/.openclaw

# Create minimal config
cat > ~/.openclaw/openclaw.json << 'EOF'
{
  "api": {
    "key": "sk-lemon-xxx",
    "baseUrl": "https://api.lemondata.cc/v1"
  },
  "telegram": {
    "token": "YOUR_TELEGRAM_BOT_TOKEN"
  },
  "agents": {
    "defaults": {
      "model": "claude-sonnet-4-6"
    }
  }
}
EOF

# Run
docker run -d \
  --name openclaw \
  -v ~/.openclaw:/root/.openclaw \
  ghcr.io/hedging8563/lemondata-openclaw:latest

Option 2: Direct Install

# Clone and install
git clone https://github.com/hedging8563/openclaw.git
cd openclaw
npm install

# Configure (edit ~/.openclaw/openclaw.json)
# Run
node src/index.js

Option 3: LemonData Hosted

If you don't want to manage infrastructure, LemonData offers hosted OpenClaw instances. Each instance runs in an isolated Kubernetes pod with persistent storage.

Sign up at lemondata.cc, navigate to the Claw section in your dashboard, and launch an instance. You get a dedicated subdomain (claw-yourname.lemondata.cc) with web terminal access.

Configuration

The config file (~/.openclaw/openclaw.json) controls everything:

{
  "api": {
    "key": "sk-lemon-xxx",
    "baseUrl": "https://api.lemondata.cc/v1"
  },
  "telegram": {
    "token": "BOT_TOKEN_FROM_BOTFATHER"
  },
  "discord": {
    "token": "DISCORD_BOT_TOKEN"
  },
  "agents": {
    "defaults": {
      "model": "claude-sonnet-4-6",
      "compaction": { "mode": "default" }
    }
  }
}
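
Since a typo in this file is the most common startup failure, a small validation pass helps. This is a sketch, not OpenClaw's actual behavior; the required-field list and error messages are illustrative:

```javascript
// Sketch of a config sanity check; the rules shown (API credentials required,
// at least one chat platform configured) are assumptions for illustration.
function validateConfig(cfg) {
  const errors = [];
  if (!cfg.api || !cfg.api.key) errors.push("api.key is required");
  if (!cfg.api || !cfg.api.baseUrl) errors.push("api.baseUrl is required");
  if (!cfg.telegram && !cfg.discord) {
    errors.push("at least one chat platform (telegram or discord) must be configured");
  }
  return errors;
}

const cfg = {
  api: { key: "sk-lemon-xxx", baseUrl: "https://api.lemondata.cc/v1" },
  telegram: { token: "BOT_TOKEN" },
};
console.log(validateConfig(cfg)); // []
```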

Model Selection

Switch models per-conversation or set defaults:

/model claude-sonnet-4-6    # Switch to Claude
/model gpt-4.1-mini         # Switch to GPT-4.1 Mini (cheaper)
/model deepseek-chat        # Switch to DeepSeek (budget)
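
Parsing this kind of slash command is a one-liner; a minimal sketch (the function name and null-on-failure convention are mine, not OpenClaw's):

```javascript
// Sketch of parsing a "/model <name>" chat command. Returns the model name,
// or null if the text is not a well-formed /model command.
function parseModelCommand(text) {
  const parts = text.trim().split(/\s+/);
  if (parts[0] !== "/model" || parts.length < 2) return null;
  return parts[1];
}

console.log(parseModelCommand("/model gpt-4.1-mini")); // "gpt-4.1-mini"
console.log(parseModelCommand("hello"));               // null
```

In a multi-user setup, the parsed model name would be stored per user, which is how each user keeps their own model preference.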

MCP Servers

Connect external tools through MCP (Model Context Protocol):

{
  "mcp": {
    "servers": {
      "filesystem": {
        "command": "npx",
        "args": ["-y", "@anthropic/mcp-filesystem", "/path/to/allowed/dir"]
      },
      "postgres": {
        "command": "npx",
        "args": ["-y", "@anthropic/mcp-postgres", "postgresql://..."]
      }
    }
  }
}

With MCP servers configured, your AI assistant can read files, query databases, and interact with external services directly from the chat interface.
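
Under the hood, each entry in `mcp.servers` describes a child process to launch. As a sketch (the helper below only builds spawn specs from the config; a real gateway would pass them to `child_process.spawn`):

```javascript
// Sketch: translate the "mcp.servers" config block into spawn specifications.
// Nothing is executed here; this only shapes the data a launcher would use.
function mcpSpawnSpecs(mcpConfig) {
  return Object.entries(mcpConfig.servers).map(([name, s]) => ({
    name,
    command: s.command,
    args: s.args,
  }));
}

const specs = mcpSpawnSpecs({
  servers: {
    filesystem: { command: "npx", args: ["-y", "@anthropic/mcp-filesystem", "/data"] },
  },
});
console.log(specs[0].command); // "npx"
```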

Use Cases

Personal Knowledge Assistant

Connect OpenClaw to your notes directory via MCP filesystem server. Ask questions about your own documents, get summaries, find connections between notes.

Team DevOps Bot

Deploy in your team's Slack or Discord. Connect to your Kubernetes cluster, monitoring dashboards, and CI/CD pipelines. Team members can check deployment status, view logs, and trigger rollbacks through natural language.

Customer Support Automation

Connect to your product database and knowledge base. OpenClaw handles first-line support queries, escalating to humans when confidence is low.
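
The "escalate when confidence is low" gate can be sketched as a simple threshold check, assuming the model (or a separate classifier) produces a confidence score in [0, 1]; the threshold and field names are arbitrary choices for illustration:

```javascript
// Sketch of a confidence-threshold escalation gate for first-line support.
// Assumes some upstream step yields a confidence score in [0, 1].
function routeSupportReply(reply, confidence, threshold = 0.7) {
  return confidence >= threshold
    ? { action: "answer", reply }
    : { action: "escalate", reason: `confidence ${confidence} below ${threshold}` };
}

console.log(routeSupportReply("Reset it via Settings > Account.", 0.92).action); // "answer"
console.log(routeSupportReply("Not sure.", 0.4).action);                         // "escalate"
```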

Code Review Assistant

Connect to your Git repository. Send diffs for review, get security analysis, style suggestions, and bug detection without leaving your chat app.

Cost Comparison

Setup                         Monthly Cost         Models           Data Privacy
ChatGPT Plus                  $20/user             GPT-4o, limited  Data on OpenAI servers
Claude Pro                    $20/user             Claude only      Data on Anthropic servers
OpenClaw (self-hosted)        API usage only       Any model        Data on your server
OpenClaw (LemonData hosted)   $20/instance + API   Any model        Isolated K8s pod

For a team of 5, ChatGPT Plus costs $100/month with limited model access. OpenClaw with shared API credits might cost $30-50/month total, with access to every model and full data control.

Hardware Requirements

  • Minimum: Any machine with Node.js 18+ and 512MB RAM
  • Recommended: 1 CPU core, 1GB RAM, 10GB storage
  • For local models (Ollama): Add GPU/Apple Silicon requirements per model

OpenClaw itself is lightweight. The AI inference happens on the API provider's servers (or your local Ollama instance).


Try OpenClaw: Self-host with any AI API, or launch a hosted instance at lemondata.cc. $1 free API credit on signup.
