I Almost Used LangGraph for Social Media Automation (Here's Why I Built an MCP Server Instead)
Introduction
I just posted my first AI‑generated tweet to X. It was generated by Groq’s free AI, processed by a custom MCP server I built, and posted through Ayrshare—all running for $0/month.
Total build time? Less than a day.
I almost used LangGraph. I studied LangChain’s 2,000‑line social‑media‑agent for days. It’s production‑ready, battle‑tested, and impressive. But I chose a different path—and saved $45/month in the process.
Why I didn’t use LangGraph
LangGraph is impressive. The LangChain social‑media‑agent repository (1.7k+ stars) offers:
- Human‑in‑the‑loop workflows with approval gates
- State checkpointing for complex processes
- Multi‑step agent orchestration
- Battle‑tested reference implementation
For social‑media automation, LangGraph provides everything: content generation, review cycles, posting workflows, and analytics—with built‑in state management.
The mismatch
My use case was much simpler:
- User provides content
- AI optimizes it for the platform
- Save as draft
- Post when ready
I didn’t need a graph; I needed a single function call. When I was halfway through setting up LangGraph Cloud, I realized I was solving the wrong problem.
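To make that concrete, here is a minimal sketch of the entire "workflow" as one async function; the callback names are hypothetical stand-ins for the services described later in this post:

```typescript
type Platform = 'twitter' | 'linkedin';

// The whole pipeline: optimize, then save as a draft. No graph, no checkpoints.
// `optimize` and `saveDraft` are hypothetical stand-ins for the LLM call and storage layer.
async function draftOptimizedPost(
  content: string,
  platform: Platform,
  optimize: (text: string, platform: Platform) => Promise<string>,
  saveDraft: (text: string, platform: Platform) => Promise<string>, // returns a draft id
): Promise<string> {
  const optimized = await optimize(content, platform); // AI tailors the copy to the platform
  return saveDraft(optimized, platform);               // posted later, when the user is ready
}
```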
LangGraph requirements
- Persistent runtime (LangGraph Cloud $29/month minimum, or a VPS)
- PostgreSQL/MongoDB for checkpointing
- Always‑on infrastructure
- Framework‑specific concepts (graphs, nodes, edges)
LangGraph can’t run on Cloudflare Workers, which are stateless, edge‑deployed functions with zero cold starts. And once you commit to the framework, switching away means rewriting everything.
Why I chose the Model Context Protocol (MCP)
MCP is a protocol, not a framework. Any MCP client (Claude Desktop, VS Code with Continue, custom UIs) can use my server without modification.
Architecture assumptions
- Stateless request/response
- SQLite (Cloudflare D1) for persistence
- No cold starts
- Global distribution
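On Workers, those assumptions reduce to a stateless fetch handler with bindings injected per request. A minimal sketch, assuming the Env shape below and the D1Database type from @cloudflare/workers-types:

```typescript
// Cloudflare Worker entry point: no in-memory state survives between requests;
// persistence and secrets arrive through per-request bindings on `env`.
export interface Env {
  DB: D1Database;        // D1 binding (SQLite at the edge); name assumed
  GROQ_API_KEY: string;  // secret binding; name assumed
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const { results } = await env.DB
      .prepare('SELECT id, platform, status FROM drafts ORDER BY created_at DESC')
      .all();
    return Response.json(results); // e.g. a drafts listing endpoint
  },
};
```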
Implementation snapshot
- ~800 lines of TypeScript
- 8 tools, 3 resources, 3 prompts
- Full CRUD for drafts
- AI content generation
- Multi‑platform posting
As MCP adoption grows (Anthropic, Zed, Continue, Cody), the server’s utility multiplies because it isn’t tied to a specific framework’s lifecycle.
Cost comparison
| Component | MCP approach | LangGraph approach |
|---|---|---|
| Server / runtime | Cloudflare Workers $5/month (includes D1, KV, Vectorize, R2) | LangGraph Cloud $29/month (or VPS $10‑20) |
| LLM API | Groq FREE tier | OpenAI $20‑30/month |
| Database | SQLite (D1) $0‑10/month | PostgreSQL/MongoDB $0‑10/month |
| Social‑media API | Ayrshare FREE (10 posts/month) | — |
| Total | $5/month | $50‑80/month |
For a freelancer testing ideas, the difference between $5 and $50 matters.
Architecture diagram
```
Claude Desktop (Client)
        ↓ JSON‑RPC over stdio
MCP Server (TypeScript)
├── Tools (LLM‑controlled actions)
│   ├── draft_post
│   ├── schedule_post
│   ├── post_immediately
│   ├── generate_thread
│   └── analyze_engagement
├── Resources (App‑controlled data)
│   ├── drafts://list
│   ├── scheduled://posts
│   └── stats://summary
├── Prompts (User‑invoked templates)
│   ├── write_tweet
│   ├── linkedin_post
│   └── thread_generator
├── Storage (SQLite → D1)
└── Integrations
    ├── Ayrshare (multi‑platform posting)
    └── Groq (AI content generation)
```
Each layer is independent and replaceable.
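For the client side of that diagram, Claude Desktop launches the server as a subprocess and speaks JSON‑RPC to it over stdio. A typical claude_desktop_config.json entry looks like this (the path and key values are placeholders for your own setup):

```json
{
  "mcpServers": {
    "social-media": {
      "command": "node",
      "args": ["/path/to/social-media-server/dist/index.js"],
      "env": {
        "GROQ_API_KEY": "<your-groq-key>",
        "AYRSHARE_API_KEY": "<your-ayrshare-key>"
      }
    }
  }
}
```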
Key components
Storage (SQLite → Cloudflare D1)
```typescript
import Database from 'better-sqlite3';

// Shapes inferred from the INSERT below.
interface DraftData { content: string; platform: string; tone: string }
interface Draft extends DraftData { id: string; status: string; created_at: string }

export class StorageService {
  private db: Database.Database;

  constructor(dbPath: string) {
    this.db = new Database(dbPath);
    this.initializeSchema(); // creates the drafts table on first run
  }

  // Schema inferred from the columns used in createDraft.
  private initializeSchema(): void {
    this.db.exec(`CREATE TABLE IF NOT EXISTS drafts (
      id TEXT PRIMARY KEY, content TEXT, platform TEXT,
      tone TEXT, status TEXT, created_at TEXT)`);
  }

  createDraft(data: DraftData): Draft {
    const id = crypto.randomUUID();
    const now = new Date().toISOString();
    this.db.prepare(`
      INSERT INTO drafts (id, content, platform, tone, status, created_at)
      VALUES (?, ?, ?, ?, 'draft', ?)
    `).run(id, data.content, data.platform, data.tone, now);
    return this.getDraft(id);
  }

  getDraft(id: string): Draft {
    return this.db.prepare('SELECT * FROM drafts WHERE id = ?').get(id) as Draft;
  }
}
```
Migration to Cloudflare D1
```typescript
// Local development
const db = new Database('./social_media.db');

// Edge runtime
const db = env.DB; // Cloudflare D1 binding
```
Same SQL, different runtime.
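The SQL does carry over unchanged; the calling convention shifts slightly, though, because D1 statements are async and take parameters through .bind(). A sketch of the same insert against D1, reusing the names from the storage service above:

```typescript
// Same INSERT as StorageService.createDraft, but through the async D1 API.
async function createDraftD1(db: D1Database, data: DraftData): Promise<string> {
  const id = crypto.randomUUID();
  const now = new Date().toISOString();
  await db
    .prepare(`
      INSERT INTO drafts (id, content, platform, tone, status, created_at)
      VALUES (?, ?, ?, ?, 'draft', ?)
    `)
    .bind(id, data.content, data.platform, data.tone, now)
    .run();
  return id;
}
```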
Social‑media API (Ayrshare)
```typescript
export class SocialMediaAPI {
  private apiKey: string;
  private baseUrl = 'https://app.ayrshare.com/api';

  constructor(apiKey: string) {
    this.apiKey = apiKey;
  }

  // One POST to Ayrshare fans the content out to every listed platform.
  async postImmediately(content: string, platforms: string[]) {
    const response = await fetch(`${this.baseUrl}/post`, {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${this.apiKey}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        post: content,
        platforms,
      }),
    });
    return response.json();
  }
}
```
One API, ten platforms.
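Usage from a tool handler is a single call; the platform keys below are illustrative, so check Ayrshare's docs for the exact identifiers your account uses:

```typescript
const social = new SocialMediaAPI(process.env.AYRSHARE_API_KEY!);

// Fan one post out to two networks in a single request.
const result = await social.postImmediately(
  'Shipped a new MCP server today 🚀',
  ['twitter', 'linkedin'],
);
console.log(result); // Ayrshare echoes back per-platform post results
```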
Content generator (Groq free tier)
```typescript
import Groq from 'groq-sdk';

// Request/response shapes inferred from usage below.
interface GenerateRequest {
  input: string;
  platform: 'twitter' | 'linkedin';
  tone: string;
  maxLength: number;
}

interface GeneratedContent {
  text: string;
  platform: string;
}

export class ContentGenerator {
  private client: Groq;

  constructor(apiKey: string) {
    this.client = new Groq({ apiKey });
  }

  // Return type name assumed; the generic was stripped from the original listing.
  async generate(request: GenerateRequest): Promise<GeneratedContent> {
    const systemPrompt = this.buildSystemPrompt(request);
    const response = await this.client.chat.completions.create({
      model: 'llama-3.3-70b-versatile', // FREE
      messages: [
        { role: 'system', content: systemPrompt },
        { role: 'user', content: request.input },
      ],
      max_tokens: 1000,
      temperature: 0.7,
    });
    const text = response.choices[0]?.message?.content || '';
    return this.parseResponse(text, request);
  }

  private buildSystemPrompt(request: GenerateRequest): string {
    const platformPrompts = {
      twitter: `Create engaging tweets that:
- Stay under ${request.maxLength} characters (STRICT)
- Use ${request.tone} tone
- Hook readers in the first line
- End with engagement (question or CTA)`,
      linkedin: `Create professional posts that:
- Are detailed (1,300-1,500 characters)
- Use ${request.tone} tone
- Start with a compelling hook
- Include 3-5 key insights with takeaways`,
    };
    return platformPrompts[request.platform];
  }

  // Not shown in the original; minimal stand-in so the class type-checks.
  private parseResponse(text: string, request: GenerateRequest): GeneratedContent {
    return { text: text.trim(), platform: request.platform };
  }
}
```
Quality is excellent, cost is $0. Rate limit: 30 req/min (more than enough).
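Wiring it up takes two lines; the GenerateRequest fields follow the interface sketched above:

```typescript
const generator = new ContentGenerator(process.env.GROQ_API_KEY!);

const tweet = await generator.generate({
  input: 'I built an MCP server that posts to X via Ayrshare, with $0 in LLM costs',
  platform: 'twitter',
  tone: 'casual',
  maxLength: 280, // enforced by the system prompt above
});
console.log(tweet.text);
```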
MCP server skeleton
```typescript
import { Server } from '@modelcontextprotocol/sdk/server/index.js';
import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
import { ListToolsRequestSchema } from '@modelcontextprotocol/sdk/types.js';

const server = new Server(
  { name: 'social-media-server', version: '1.0.0' },
  { capabilities: { tools: {}, resources: {}, prompts: {} } }
);

// Define tools
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: 'draft_post',
      // …
    },
    // other tools …
  ],
}));

// Connect to the client over stdio
const transport = new StdioServerTransport();
await server.connect(transport);
```
The server exposes tools, resources, and prompts via a simple JSON‑RPC interface that any MCP‑compatible client can consume.
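The other half of the tool plumbing is the call handler, which dispatches a tool invocation to the services above. A minimal sketch for draft_post, with the storage wiring assumed:

```typescript
import { CallToolRequestSchema } from '@modelcontextprotocol/sdk/types.js';

// Route tool calls from the client to the right service.
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  if (request.params.name === 'draft_post') {
    const args = request.params.arguments as {
      content: string; platform: string; tone: string;
    };
    const draft = storage.createDraft(args); // StorageService from earlier
    return { content: [{ type: 'text', text: `Saved draft ${draft.id}` }] };
  }
  throw new Error(`Unknown tool: ${request.params.name}`);
});
```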
Takeaway
By opting for a lightweight Model Context Protocol server deployed on Cloudflare Workers, I:
- Eliminated the need for persistent runtimes and complex state management
- Reduced monthly costs from ~$50 to $5
- Delivered a fully functional social‑media automation pipeline in under a day
If your AI workflow is straightforward—generate, review, and act—a protocol‑first approach can be far more economical and agile than adopting a heavyweight framework like LangGraph.