
MicroClaw

Rust-first agentic AI runtime that turns any chat surface into a powerful execution environment. One core, full tool use, durable memory, scheduler, and MCP federation across Telegram, Discord, Slack, and 10+ more platforms.

Overview

MicroClaw is an open-source agentic AI assistant runtime, built in Rust, that lives directly in your chats. Inspired by NanoClaw, it pairs a channel-agnostic core with platform adapters, so one shared agent runs across every messaging surface without forked code. It excels at turning conversations into reliable execution environments, with full tool chaining, persistent memory, resumable sessions, background scheduling, and external skill federation via MCP (Model Context Protocol).

Built as a single binary with SQLite persistence, it is designed for teams and power users who need stable, observable automation that survives model or channel changes.
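The channel-agnostic split described above (normalize on the way in, adapt limits on the way out) can be sketched as a Rust trait. This is an illustrative sketch only; NormalizedMessage, ChannelAdapter, and the 4096-character Telegram chunking are assumptions for the example, not MicroClaw's actual API.

```rust
/// A platform-neutral message produced by the Ingest step (illustrative).
#[derive(Debug, Clone, PartialEq)]
struct NormalizedMessage {
    channel: String,
    chat_id: String,
    text: String,
}

/// Each messaging surface implements this trait; the agent core only
/// ever sees the trait, never a concrete platform.
trait ChannelAdapter {
    fn name(&self) -> &'static str;
    /// Ingest: turn a raw platform payload into the shared shape.
    fn ingest(&self, raw: &str) -> NormalizedMessage;
    /// Deliver: split a reply to respect per-platform length limits.
    fn deliver(&self, reply: &str) -> Vec<String>;
}

struct Telegram;

impl ChannelAdapter for Telegram {
    fn name(&self) -> &'static str { "telegram" }

    fn ingest(&self, raw: &str) -> NormalizedMessage {
        NormalizedMessage {
            channel: "telegram".into(),
            chat_id: "demo".into(),
            text: raw.into(),
        }
    }

    fn deliver(&self, reply: &str) -> Vec<String> {
        // Telegram caps a message at 4096 characters; chunk accordingly.
        reply.as_bytes()
            .chunks(4096)
            .map(|c| String::from_utf8_lossy(c).into_owned())
            .collect()
    }
}

fn main() {
    let tg = Telegram;
    let msg = tg.ingest("hello");
    assert_eq!(msg.text, "hello");
    // A 5000-char reply is delivered as two Telegram-sized parts.
    let parts = tg.deliver(&"x".repeat(5000));
    assert_eq!(parts.len(), 2);
    println!("{} delivered {} part(s)", tg.name(), parts.len());
}
```

Adding a new platform then means implementing one trait, while tool use, memory, and scheduling stay untouched in the core.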

Key Features

  • Channel-Agnostic Core: Ingest normalizes events; Deliver handles per-platform limits. Currently supports Telegram, Discord, Slack, Feishu/Lark, Matrix, WhatsApp, iMessage, Email, Nostr, Signal, DingTalk, QQ, IRC, and Web.
  • Tool-Using Agent Loop: Multi-step reasoning with bash, file read/write/edit, glob, grep, web search/fetch (DuckDuckGo), sub-agents, todo planning, and send_message.
  • Durable Memory: AGENTS.md files (global/bot/chat scope) + optional SQLite structured memory with semantic KNN search (sqlite-vec feature) and reflector extraction.
  • Scheduler & Background Tasks: Cron and one-shot tasks run inside the same runtime.
  • MCP + Skills Federation: Attach external tool servers (Playwright browser automation, Peekaboo macOS automation, filesystem, etc.) and Anthropic-compatible skills without rewriting the core.
  • LLM Support: Native Anthropic + any OpenAI-compatible endpoint (OpenAI, OpenRouter, Ollama, Grok/xAI, DeepSeek, Google, Azure, Bedrock, and dozens more).
  • Observability: Local web UI at http://127.0.0.1:10961 for cross-channel history, usage stats, and memory inspection. HTTP API triggers with API keys.
  • Operational Extras: Context compaction, mid-conversation messaging, group mention catch-up, continuous typing indicators, permission-aware tools.
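The scheduler feature above runs cron and one-shot tasks inside the same process as the agent. A toy in-process version of a one-shot queue can be sketched as a min-heap keyed by due time; the Task and Scheduler types here are invented for illustration and are not MicroClaw's internals.

```rust
use std::cmp::Reverse;
use std::collections::BinaryHeap;

/// A one-shot background task (illustrative, not MicroClaw's API).
struct Task {
    due_tick: u64, // logical time at which the task should fire
    name: String,
}

/// Min-heap of pending tasks, ordered by due time.
struct Scheduler {
    queue: BinaryHeap<Reverse<(u64, String)>>,
}

impl Scheduler {
    fn new() -> Self {
        Self { queue: BinaryHeap::new() }
    }

    fn schedule(&mut self, t: Task) {
        self.queue.push(Reverse((t.due_tick, t.name)));
    }

    /// Pop every task due at or before `now`, earliest first.
    fn run_due(&mut self, now: u64) -> Vec<String> {
        let mut fired = Vec::new();
        while self.queue.peek().map_or(false, |r| (r.0).0 <= now) {
            let Reverse((_, name)) = self.queue.pop().unwrap();
            fired.push(name);
        }
        fired
    }
}

fn main() {
    let mut s = Scheduler::new();
    s.schedule(Task { due_tick: 5, name: "daily-digest".into() });
    s.schedule(Task { due_tick: 1, name: "reminder".into() });
    // At tick 3 only the earlier task is due.
    let fired = s.run_due(3);
    assert_eq!(fired, vec!["reminder".to_string()]);
    println!("fired: {fired:?}");
}
```

Running tasks in-process like this is what lets a scheduled job reuse the same tools, memory, and delivery path as an interactive chat turn.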

Installation

Recommended (macOS / Linux)

curl -fsSL https://microclaw.ai/install.sh | bash

Homebrew (macOS)

brew tap microclaw/tap
brew install microclaw

Windows

iwr https://microclaw.ai/install.ps1 -UseBasicParsing | iex

From Source

git clone https://github.com/microclaw/microclaw.git
cd microclaw
cargo build --release
# Optional semantic memory: cargo build --release --features sqlite-vec

After installing, run microclaw doctor, then microclaw setup (the interactive wizard), and finally microclaw start.

Quickstart Commands

  • microclaw doctor – diagnostics
  • microclaw setup – configure bot tokens & LLM keys
  • microclaw start – launch the runtime
  • microclaw upgrade – update to latest
  • Local UI: http://127.0.0.1:10961

Use Cases

  • Personal Infra Agent: Shell/file tooling + memory + scheduler across all your chats.
  • Team Operations Bot: Permission-aware tools and shared history for internal workflows.
  • Product Prototyping Runtime: Ship new channels or tools on the stable Rust core instead of fragmented bots.

Architecture (5-Step Runtime Loop)

  1. Ingest – channel adapters normalize messages/events.
  2. Assemble Context – inject session state, AGENTS.md, SQLite memory, and active skills.
  3. Reason + Tool Calls – stream LLM response and execute tools in a controlled loop until end_turn.
  4. Persist + Reflect – save conversations; reflector updates durable facts.
  5. Deliver – split responses per channel limits and emit consistently.
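The five steps above can be condensed into one sketch: a turn loops between reasoning and tool execution until the model signals end_turn, then persists and delivers. Everything here (the Step enum, the stubbed reason function, the single-part delivery) is an assumption made for the example, not MicroClaw's real control flow.

```rust
// Minimal sketch of the 5-step runtime loop. All names are illustrative.
#[derive(Debug, PartialEq)]
enum Step {
    ToolCall(String), // model asked to run a tool; carries the result
    EndTurn(String),  // model finished; carries the final reply
}

/// Stand-in for the streaming LLM call: runs one tool, then ends.
fn reason(context: &str, turn: usize) -> Step {
    if turn == 0 {
        Step::ToolCall(format!("grep over: {context}"))
    } else {
        Step::EndTurn("done".into())
    }
}

fn run_turn(raw_event: &str) -> Vec<String> {
    // 1. Ingest: a channel adapter normalizes the platform event.
    let message = raw_event.trim().to_lowercase();
    // 2. Assemble Context: session state, AGENTS.md, and memory
    //    would be injected here (stubbed as a tagged string).
    let mut context = format!("session|memory|{message}");
    // 3. Reason + Tool Calls: loop until the model signals end_turn,
    //    feeding each tool result back into the context.
    for turn in 0.. {
        match reason(&context, turn) {
            Step::ToolCall(result) => {
                context.push_str(&format!("|tool:{result}"));
            }
            Step::EndTurn(reply) => {
                // 4. Persist + Reflect: save the conversation (stubbed).
                // 5. Deliver: split per channel limits (one part here).
                return vec![reply];
            }
        }
    }
    unreachable!()
}

fn main() {
    let out = run_turn("  Hello MicroClaw  ");
    assert_eq!(out, vec!["done".to_string()]);
    println!("{out:?}");
}
```

The key property the real loop shares with this sketch is that tool results re-enter the context before the next reasoning step, so multi-step chains need no external orchestrator.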

License & Links

MicroClaw is actively maintained with frequent releases and focuses on stability, hackability, and real-world automation.

Tags

rust, ai-agent, chat-automation, telegram, discord, slack, mcp, tools, self-hosted, llm, memory, scheduler