How I Built a Multi-Platform AI Bot with Langflow's Drag-and-Drop Workflows
Source: Dev.to
Overview
Drive chatbots across QQ, WeChat, Telegram, Discord, and more using visual workflows – no coding required.
LangBot is an open‑source instant‑messaging bot platform that connects AI workflow engines like Langflow, n8n, Dify, FastGPT, and Coze to platforms including WeChat, QQ, Feishu, DingTalk, Telegram, Discord, Slack, and LINE. This tutorial demonstrates how to use Langflow’s visual workflows as LangBot’s conversation engine.
Why This Approach Works
- True Multi‑Platform – One workflow powers 8+ messaging platforms simultaneously.
- Visual Orchestration – Drag‑and‑drop conversation design with conditional branches, multi‑turn dialogs, and external API calls.
- Flexible AI Models – Supports OpenAI, Claude, Gemini, DeepSeek, and local models.
- Fully Open Source – Both LangBot and Langflow are open‑source projects for free deployment and customization.
Prerequisites
- Python 3.10+
- Docker (recommended for quick deployment)
- OpenAI API key or API keys for other LLM services
Step 1: Deploy LangBot
Launch with uvx in one command:
uvx langbot
The first run auto‑initializes LangBot and opens the dashboard in your browser.

After registration, log in to access the dashboard:

Step 2: Deploy Langflow
Deploy quickly with Docker:
docker run -d --name langflow -p 7860:7860 langflowai/langflow:latest
Visit http://localhost:7860 to access Langflow.

Step 3: Create a Langflow Workflow
In Langflow, select the Basic Prompting template to get started quickly:

The template includes four basic components:
- Chat Input – Receives user messages.
- Prompt – Sets system instructions.
- Language Model – Calls the LLM to generate responses.
- Chat Output – Returns results.
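Conceptually, the template chains those four components in sequence. The same flow can be sketched in plain Python, with the Language Model step stubbed out (the real call goes through whichever provider you configure; function names here are illustrative, not Langflow internals):

```python
def chat_input(raw: str) -> str:
    """Chat Input: receive the user's message."""
    return raw.strip()

def build_prompt(user_message: str) -> list[dict]:
    """Prompt: prepend the system instructions (hypothetical example)."""
    system = "You are a helpful assistant."
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user_message},
    ]

def language_model(messages: list[dict]) -> str:
    """Language Model: stub standing in for the actual LLM call."""
    return f"(model reply to: {messages[-1]['content']})"

def chat_output(reply: str) -> str:
    """Chat Output: return the result to the caller."""
    return reply

reply = chat_output(language_model(build_prompt(chat_input("  Hello!  "))))
print(reply)
```

Each box you drag onto the Langflow canvas plays the role of one of these functions; the edges between boxes are the function composition.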

Configure Language Model
Click the Language Model component and set:
- Model Provider – Select OpenAI (or other compatible providers such as SiliconFlow, New API).
- Model Name – e.g., gpt-4o-mini or deepseek-chat.
- OpenAI API Key – Enter your key.

Tip: You can use OpenAI‑compatible services by modifying the Base URL.
Save the workflow after configuration.
Step 4: Get Langflow API Information
Generate API Key
In Langflow’s upper‑right menu go to Settings → API Keys:

Click Create New Key, then generate and copy the key (format: sk-xxxxxxxxxxxxxxxxxxxxxxxx).


Get Flow ID
Open the flow editor; the URL contains the flow ID:
http://localhost:7860/flow/{flow-id}
Record the {flow-id} value.
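With the API key and flow ID in hand, you can also call the flow directly over Langflow's REST API, which is what LangBot does under the hood. A minimal stdlib sketch, assuming the common POST /api/v1/run/{flow-id} endpoint with an x-api-key header (verify the exact path and payload against your Langflow version's API docs):

```python
import json
import urllib.request

def build_run_request(base: str, flow_id: str, api_key: str,
                      message: str) -> urllib.request.Request:
    """Build (but don't send) a request that runs a Langflow flow once."""
    url = f"{base}/api/v1/run/{flow_id}"
    payload = {"input_value": message, "input_type": "chat", "output_type": "chat"}
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json", "x-api-key": api_key},
        method="POST",
    )

# Placeholder values; substitute your own key and flow ID.
req = build_run_request("http://localhost:7860", "your-flow-id", "sk-...", "Hello")
print(req.full_url)
```

Sending the request with urllib.request.urlopen(req) returns a JSON body containing the flow's output; LangBot performs this round trip for every incoming chat message.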
Step 5: Configure Langflow in LangBot
In the LangBot dashboard, navigate to Pipelines and edit the ChatPipeline. In the AI tab:

- Set Runner to Langflow API.

- Fill in the Langflow configuration fields:
  - Base URL – http://localhost:7860/api/v1/flow
  - API Key – the key generated in Step 4.
  - Flow ID – the {flow-id} recorded earlier.

Save the pipeline. LangBot will now route incoming messages from all configured platforms through the Langflow workflow, enabling a unified, visual AI chatbot across multiple services.
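The resulting architecture is many platform adapters feeding one runner. A minimal sketch of that fan-in pattern, with the Langflow call stubbed and all names illustrative rather than LangBot's actual internals:

```python
from typing import Callable

def langflow_runner(message: str) -> str:
    """Stub standing in for the Langflow API round trip."""
    return f"workflow reply: {message}"

def handle(platform: str, raw_message: str, runner: Callable[[str], str]) -> str:
    """Normalize a platform message, run it through the shared runner, reply."""
    text = raw_message.strip()          # platform-specific decoding would go here
    reply = runner(text)
    return f"[{platform}] {reply}"      # platform-specific sending would go here

# Every configured platform shares the same runner, so editing the
# Langflow workflow updates the bot's behavior everywhere at once.
for platform in ("QQ", "Telegram", "Discord"):
    print(handle(platform, "Hello!", langflow_runner))
```

This is why the approach scales: adding a ninth platform means adding one adapter, not a ninth copy of the conversation logic.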