How I Built a Multi-Platform AI Bot with Langflow's Drag-and-Drop Workflows

Published: December 5, 2025 at 08:51 AM EST
2 min read
Source: Dev.to

Overview

Drive chatbots across QQ, WeChat, Telegram, Discord, and more using visual workflows – no coding required.

LangBot is an open‑source instant‑messaging bot platform that connects AI workflow engines like Langflow, n8n, Dify, FastGPT, and Coze to platforms including WeChat, QQ, Feishu, DingTalk, Telegram, Discord, Slack, and LINE. This tutorial demonstrates how to use Langflow’s visual workflows as LangBot’s conversation engine.

Why This Approach Works

  • True Multi‑Platform – One workflow powers 8+ messaging platforms simultaneously.
  • Visual Orchestration – Drag‑and‑drop conversation design with conditional branches, multi‑turn dialogs, and external API calls.
  • Flexible AI Models – Supports OpenAI, Claude, Gemini, DeepSeek, and local models.
  • Fully Open Source – Both LangBot and Langflow are open‑source projects for free deployment and customization.

Prerequisites

  • Python 3.10+
  • Docker (recommended for quick deployment)
  • OpenAI API key or API keys for other LLM services

Step 1: Deploy LangBot

Launch with uvx in one command:

uvx langbot

The first run initializes LangBot automatically and opens the web UI in your browser.

LangBot Initial Page

After registration, log in to access the dashboard:

LangBot Dashboard

Step 2: Deploy Langflow

Deploy quickly with Docker:

docker run -d --name langflow -p 7860:7860 langflowai/langflow:latest

Visit http://localhost:7860 to access Langflow.
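Before moving on, you can confirm the container is up from the command line. This is an optional sanity check; the health endpoint path is assumed from Langflow's API and may differ between versions:

```shell
# A running Langflow instance answers on the mapped port 7860.
# (Health endpoint path assumed; adjust if your Langflow version differs.)
curl -s http://localhost:7860/health || echo "Langflow is not reachable yet"
```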

Langflow Welcome Page

Step 3: Create a Langflow Workflow

In Langflow, select the Basic Prompting template to get started quickly:

Langflow Template Selection

The template includes four basic components:

  • Chat Input – Receives user messages.
  • Prompt – Sets system instructions.
  • Language Model – Calls the LLM to generate responses.
  • Chat Output – Returns results.

Langflow Workflow Editor

Configure Language Model

Click the Language Model component and set:

  • Model Provider – Select OpenAI (or another compatible provider such as SiliconFlow or New API).
  • Model Name – e.g., gpt-4o-mini or deepseek-chat.
  • OpenAI API Key – Enter your key.

Langflow OpenAI API Key Configured

Tip: You can use OpenAI‑compatible services by modifying the Base URL.

Save the workflow after configuration.

Step 4: Get Langflow API Information

Generate API Key

In Langflow’s upper‑right menu go to Settings → API Keys:

Langflow API Keys Page

Click Create New Key, then generate and copy the key (format: sk-xxxxxxxxxxxxxxxxxxxxxxxx).

Langflow Create API Key Dialog

Langflow API Key Generated

Get Flow ID

Open the flow editor; the URL contains the flow ID:

http://localhost:7860/flow/{flow-id}

Record the {flow-id} value.
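With the API key and flow ID in hand, you can sanity-check the flow directly before wiring up LangBot. The sketch below uses Langflow's run endpoint with the `x-api-key` header; the key and flow ID are placeholders to replace with your own values:

```shell
# Placeholders — substitute the key from the previous step and your flow ID.
LANGFLOW_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxx"
FLOW_ID="your-flow-id"

# POST a test message to the flow's run endpoint; the chat input/output
# types correspond to the Chat Input / Chat Output components above.
curl -s -X POST "http://localhost:7860/api/v1/run/${FLOW_ID}" \
  -H "x-api-key: ${LANGFLOW_API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{"input_value": "Hello!", "output_type": "chat", "input_type": "chat"}' \
  || echo "request failed — is Langflow running?"
```

A successful response is a JSON document containing the model's reply; an authentication error usually means the `x-api-key` value is wrong.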

Step 5: Configure Langflow in LangBot

In the LangBot dashboard, navigate to Pipelines and edit the ChatPipeline. In the AI tab:

LangBot Pipeline AI Tab

  • Set Runner to Langflow API.

LangBot Runner Dropdown

  • Fill in the Langflow configuration fields:

    • Base URL – http://localhost:7860/api/v1/flow
    • API Key – the key generated in Step 4.
    • Flow ID – the {flow-id} recorded earlier.

LangBot Langflow Config Form

Save the pipeline. LangBot will now route incoming messages from all configured platforms through the Langflow workflow, enabling a unified, visual AI chatbot across multiple services.
