From Chat App to AI Powerhouse: Telegram + OpenClaw
Source: Dev.to
Overview
In this guide we’ll walk through how to install OpenClaw on macOS and connect it to Telegram step‑by‑step.
OpenClaw is a self‑hosted AI assistant that runs locally while integrating with messaging platforms such as Telegram, WhatsApp, Discord, Slack, and more. Unlike typical cloud chatbots, OpenClaw gives you full control over:
- The model you use
- Workspaces and sessions
- Integrations and automation
By the end of this tutorial you will have:
- OpenClaw installed on macOS
- An AI model configured (using OpenAI in this example)
- A Telegram bot connected and approved
- The gateway service running in the background
- The web dashboard and TUI working
- A fully‑functioning Telegram AI assistant
1. Install OpenClaw
OpenClaw provides a modern installer that automatically detects your OS and installs required dependencies (including Node.js if needed).
curl -fsSL https://openclaw.ai/install.sh | bash
What the installer does
- Detects your operating system
- Verifies Homebrew (macOS)
- Checks Node.js installation
- Installs OpenClaw globally via npm
- Prepares your environment
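If you'd rather confirm the prerequisites yourself before piping the installer into `bash`, a quick check like the following works in any POSIX shell (the tool names match what the installer looks for; the `check_tool` helper is our own, not part of OpenClaw):

```shell
# Report whether a command-line tool is available on PATH.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: found ($(command -v "$1"))"
  else
    echo "$1: missing"
  fi
}

check_tool brew   # Homebrew (macOS package manager)
check_tool node   # Node.js, required by OpenClaw
check_tool npm    # npm, used for the global install
```

If anything reports `missing`, the installer will attempt to set it up for you.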
Sample installation output
OpenClaw Installer
Detected: macOS
Install method: npm
Requested version: latest
[1/3] Preparing environment
✓ Homebrew already installed
✓ Node.js v24.6.0 found
[2/3] Installing OpenClaw
✓ Git already installed
INFO Installing OpenClaw v2026.2.15
✓ OpenClaw npm package installed
✓ OpenClaw installed
[3/3] Finalizing setup
OpenClaw installed successfully (2026.2.15)!
Installation complete. Your productivity is about to get weird.
INFO Starting setup
OpenClaw 2026.2.15
This confirms:
- All dependencies are ready
- The latest OpenClaw version is installed
- Setup has successfully initialized
After installation, OpenClaw automatically launches its onboarding process, showing a Security Warning screen. OpenClaw is not a simple chatbot; it is an autonomous AI assistant capable of:
- Reading local files
- Executing system actions
- Running tools & automations
- Connecting to external services
Warning: A bad prompt can trick it into unsafe actions. This is expected for agent‑style systems.
2. Choose a Model / Auth Provider
OpenClaw will prompt you to select a model/backend. It supports many providers, including:
- Anthropic
- OpenAI
- OpenRouter
- LiteLLM
- Amazon Bedrock
- Vercel AI Gateway
- Moonshot AI
- MiniMax
- OpenCode Zen
- GLM Models
- Z.AI
- Synthetic
- Qianfan
You can use:
- Direct API providers
- Gateway aggregators
- Enterprise cloud backends
- Local or proxy‑based routing
For this tutorial we’ll use OpenAI:
- Select OpenAI from the list.
- Enter your OpenAI API key when prompted.
Enter OpenAI API key:
After entering a valid key, OpenClaw verifies the connection and configures a default model.
Saved OPENAI_API_KEY to ~/.openclaw/.env
Model configured
Default model set to openai/gpt-5.1-codex
What this means
- Your API key is securely stored in ~/.openclaw/.env.
- The connection to OpenAI succeeded.
- A default model is selected (you can change it later).
Choose a specific model
OpenClaw lets you keep the default, enter a custom model, or pick from the list. For OpenAI you might see:
- openai/gpt-5.1-codex
- openai/codex-mini-latest
- openai/gpt-4
- openai/gpt-4-turbo
- openai/gpt-4.1
- openai/gpt-4.1-mini
- openai/gpt-4o
- …and more
We’ll use openai/gpt-4o for this guide.
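Before handing the key to OpenClaw, you can verify it independently against OpenAI's public /v1/models endpoint (the endpoint and the Bearer auth header are standard OpenAI API; the `key_status` helper is just illustrative):

```shell
# Translate an HTTP status from the OpenAI API into a verdict.
key_status() {
  case "$1" in
    200) echo "key valid" ;;
    401) echo "key invalid or revoked" ;;
    *)   echo "unexpected status: $1" ;;
  esac
}

# Only probe the API when a key is actually set in the environment.
if [ -n "${OPENAI_API_KEY:-}" ]; then
  code=$(curl -s -o /dev/null -w '%{http_code}' --max-time 10 \
    -H "Authorization: Bearer $OPENAI_API_KEY" \
    https://api.openai.com/v1/models)
  key_status "$code"
fi
```

A `200` here means the same key will work when OpenClaw tries to connect.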
3. Select a Chat Channel
OpenClaw now asks which channel you want to enable.
Select channel (QuickStart)
● Telegram (Bot API)
○ WhatsApp (QR link)
○ Discord (Bot API)
○ Slack (Socket Mode)
...
Choose Telegram (Bot API).
4. Create a Telegram Bot & Get the Token
- Open Telegram and start a chat with @BotFather.
- Run /newbot (or /mybots if you already have bots).
- Follow the prompts to give your bot a name and a username (the username must end in bot).
/newbot
BotFather will return a token that looks like:
123456789:AAExampleGeneratedToken
Add the token to OpenClaw
When OpenClaw prompts:
Enter Telegram bot token:
paste the token.
Tip: For automated deployments you can set the token as an environment variable or add it to the .env file:
export TELEGRAM_BOT_TOKEN=123456789:AAExampleGeneratedToken
# or add to ~/.openclaw/.env
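A BotFather token has a predictable shape (numeric bot ID, a colon, then a secret), and Telegram's Bot API exposes a getMe method that echoes the bot's identity for a valid token. A quick pre-flight check can catch copy-paste mistakes (the `validate_token` helper is our own; the getMe URL is the standard Bot API endpoint):

```shell
# Cheap local sanity check on the token's shape before any network call.
validate_token() {
  if printf '%s' "$1" | grep -Eq '^[0-9]+:[A-Za-z0-9_-]+$'; then
    echo "ok"
  else
    echo "bad"
  fi
}

validate_token "123456789:AAExampleGeneratedToken"   # prints "ok"

# With a real token set, the Bot API confirms it end to end:
if [ -n "${TELEGRAM_BOT_TOKEN:-}" ]; then
  curl -s --max-time 10 "https://api.telegram.org/bot${TELEGRAM_BOT_TOKEN}/getMe"
  # A valid token returns JSON beginning with {"ok":true,...}
fi
```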
5. Finalize Telegram Configuration
OpenClaw validates the token and updates its configuration.
Selected channels
Telegram — simplest way to get started — register a bot with @BotFather and get going.
Updated ~/.openclaw/openclaw.json
Workspace OK: ~/.openclaw/workspace
Sessions OK: ~/.openclaw/agents/main/sessions
What this confirms
- Telegram channel is successfully configured.
- openclaw.json has been updated.
- Workspace directory and session storage are ready.
At this point your assistant is fully wired to Telegram.
6. Verify Everything Is Running
- Start the OpenClaw gateway (if not already running):
openclaw start
- Open the web dashboard (by default at
http://127.0.0.1:18789) or launch the TUI:
openclaw tui
- In Telegram, search for your bot (the username you set) and send a message like:
/help
You should receive a response from the OpenClaw AI assistant.
🎉 You’re Done!
You now have:
- OpenClaw installed on macOS
- OpenAI configured as the LLM backend
- A Telegram bot linked and ready to chat
Feel free to explore other channels, add more agents, or switch to a different model provider. Happy hacking!
OpenClaw Setup Overview
Configuration Files & Directories
~/.openclaw/openclaw.json → Main configuration
~/.openclaw/workspace → Working directory
~/.openclaw/agents/main/sessions → Conversation memory & state
This is where your agent:
- Stores sessions
- Maintains chat history
- Tracks tool execution
- Manages runtime state
Everything is now persistent and structured.
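You can sanity-check that setup actually created these locations (paths as listed above; `check_path` is a throwaway helper):

```shell
# Print OK/MISS for each path OpenClaw is expected to have created.
check_path() {
  if [ -e "$1" ]; then
    echo "OK   $1"
  else
    echo "MISS $1"
  fi
}

for p in "$HOME/.openclaw/openclaw.json" \
         "$HOME/.openclaw/workspace" \
         "$HOME/.openclaw/agents/main/sessions"; do
  check_path "$p"
done
```

All three should report OK after onboarding completes.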
Skills Report
| Status | Count |
|---|---|
| Eligible | 7 |
| Missing requirements | 42 |
| Unsupported on this OS | 0 |
| Blocked by allowlist | 0 |
- Eligible → Skills ready to run
- Missing requirements → Optional skills needing extra dependencies
- Unsupported on this OS → OS‑incompatible tools
- Blocked by allowlist → Restricted skills
This doesn’t mean something is broken — it simply reflects optional capabilities not yet installed. You can install additional skills later as needed.
Gateway Service Installation
After completing the configuration, OpenClaw installs and starts the Gateway service. You’ll see output similar to:
Installing Gateway service....
Gateway service installed.
Telegram: ok (@yourbotname)
Agents: main (default)
Heartbeat interval: 30m
Session store: ~/.openclaw/agents/main/sessions/sessions.json
OpenClaw has now:
- Installed the Gateway service (runs in the background)
- Created a macOS LaunchAgent
- Linked your Telegram bot successfully
- Initialized the default agent
- Set up persistent session storage
The important line is:
Telegram: ok
That means your bot is live.
What the Gateway Does
- Manages chat channels
- Handles agent sessions
- Maintains WebSocket connections
- Routes messages between Telegram and your model
- Stores session history
- Runs continuously in the background
On macOS it installs as a LaunchAgent:
~/Library/LaunchAgents/ai.openclaw.gateway.plist
Logs are stored at:
~/.openclaw/logs/gateway.log
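On macOS you can confirm the LaunchAgent is loaded and do a quick scan of the log from the same terminal (the `ai.openclaw.gateway` label is inferred from the plist filename above; the error-counting helper and log format are illustrative):

```shell
# Is the Gateway LaunchAgent loaded? (reads launchctl output from stdin)
is_loaded() {
  grep -q 'ai\.openclaw\.gateway' && echo "loaded" || echo "not loaded"
}
launchctl list 2>/dev/null | is_loaded

# Count ERROR lines in a log file -- a quick health check.
count_errors() {
  grep -c 'ERROR' "$1" || true
}
count_errors "$HOME/.openclaw/logs/gateway.log" 2>/dev/null

# For live output:
#   tail -f ~/.openclaw/logs/gateway.log
```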
Launching the TUI
When you see:
How do you want to hatch your bot?
Hatch in TUI (recommended)
Just press Enter. This runs:
openclaw tui --ws://127.0.0.1:18789 --agent main --session main
You’ll then see:
connected | idle
agent main | session main
openai/gpt-4o
That means:
- ✅ Gateway running
- ✅ Model connected
- ✅ Session active
- ✅ Bot alive
You can now type:
Hello
and it will respond.
Web Dashboard
Open the OpenClaw WebUI (Dashboard) in your browser:
http://127.0.0.1:18789
If a token was generated during setup, use it:
http://127.0.0.1:18789/#token=YOUR_TOKEN
(Replace YOUR_TOKEN with the exact token shown in your terminal.)
You should see:
- Status: Connected
- Health: OK
- Chat interface ready
Navigate to Chat (left sidebar), type a message in the input box, and press Enter or click Send.
Example test message
What is 12 × 14?
If it replies, your model, gateway, and agent are fully working.
Approving Telegram Access
Run this command to approve Telegram pairing, replacing your_code with the pairing code OpenClaw displays when your Telegram account first messages the bot:
openclaw pairing approve telegram your_code
Then open Telegram and send a message (e.g., “hello”) to your bot.
If it replies, your Telegram bot is successfully connected and working.
Recap of the Walkthrough
- Installed OpenClaw on macOS
- Configured the model provider and API key
- Selected a default model
- Connected Telegram using BotFather
- Installed and started the Gateway service
- Launched the TUI and Web Dashboard
- Approved Telegram pairing
- Successfully tested the bot
Your assistant is now live and running locally. Messages sent to your Telegram bot are routed through the OpenClaw Gateway, processed by your selected model, and returned in real time.
Next Steps
- Add more skills
- Connect additional channels (WhatsApp, Discord, Slack)
- Enable automation workflows
- Integrate tools like Notion, GitHub, or file access
- Run deeper security audits
OpenClaw turns your local machine into a personal AI control center. Now it’s your move — start building, automating, and experimenting.
Exclusive Offer: Free Azure Credits
We’re excited to offer an exclusive opportunity for our community. You are eligible to receive free Azure credits worth $1,000 to explore, test, and deploy any TechLatest product or solution.
- Use case examples: building AI agents, experimenting with automation tools, scaling infrastructure, etc.
- Limited‑time, first‑come, first‑served – don’t miss out!
Claim your credits:
https://www.techlatest.net/free_azure_credits/
Explore our products and solutions here:
https://techlatest.net/support/