Zero-Friction Publishing: A Human-in-the-Loop Agentic CMS powered by Notion MCP

Published: March 10, 2026 at 04:48 AM EDT
6 min read
Source: Dev.to

This is a submission for the Notion MCP Challenge

What I Built

Me: I intend to submit to the Notion MCP challenge. Will the workflow below be a good application?

Claude: This is a strong submission concept! Let me break down how it stacks up against the judging criteria…

Me: Can you draft my submission using the template below and store it to the Notion database using the filename notion-mcp-challenge.md?

Claude: First, let me search for the publishing database in your Notion workspace to store this in…

(Claude uses Notion MCP to connect to the database)

Done! The initial draft has been saved directly to your Notion CMS.

This actual conversation is how this entire project—and the very article you are reading—started.

While AI can generate content instantly, maintaining high quality, personal voice, and strict formatting requires a “Human‑in‑the‑loop” approach. I built a Zero‑Friction Agentic Publishing Pipeline that turns Notion into a collaborative headless CMS between a human creator and an AI orchestrator.

Instead of juggling multiple tabs, manually converting formats, or dealing with Git commands, this workflow lets me focus entirely on directing and editing. I simply converse with my AI agent (Claude), and it handles the heavy lifting of database management, file transformation, and version control via Notion MCP.

Architecture

Workflow Overview

The Process

| Phase | Description |
| --- | --- |
| Director Phase (1) | I initiate the idea via natural language. |
| Drafting Phase (2) | The AI orchestrator generates the initial draft and saves it directly to a structured Notion database (mapping properties like title, filename, and metadata) using Notion MCP. |
| Refinement Phase (3) | I jump into Notion—the best UI for writing—and refine the draft, adding my personal touch. |
| Publishing Pipeline (4‑6) | I tell the AI to finalize it. It fetches the updated content from Notion via MCP, transforms it into Markdown with YAML front‑matter, and creates a Pull Request on GitHub. |
| Approval (7‑9) | I review the PR diff and hit Merge, triggering GitHub Actions to deploy the article live to dev.to. |

Video Demo

This short presentation (generated via NotebookLM based on my initial drafts) explains the philosophy behind this setup. As highlighted in the video, this is Conversation‑driven development. You’ll see how I can orchestrate a complex Notion‑to‑GitHub pipeline without writing a single line of traditional code—relying entirely on the workflow overview and natural‑language prompts.


Show Us the Code

The absolute beauty of this Agentic Workflow is that it requires zero traditional middleware code. By leveraging standard MCP servers, the “code” shifts from writing brittle API wrappers to defining CI/CD pipelines and database schemas.

1. Repository & CI/CD Pipeline

The GitHub repository is the central hub where the AI and I collaborate.

  • AI Opens PRs: Claude, via the GitHub MCP server, dynamically opens Pull Requests and pushes updates based on the Notion drafts.
  • Human Approval: I review the Markdown and YAML front‑matter in the PR diff. This is the crucial “Human‑in‑the‑loop” checkpoint.
  • Automated Deployment: Once I hit Merge, a GitHub Action automatically triggers and publishes the article to dev.to.
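The deployment step described above can be sketched as a GitHub Actions workflow. This is an illustrative sketch, not the exact workflow from the repository: the file path, glob patterns, action name, and secret name are all assumptions.

```yaml
# .github/workflows/publish.yml (hypothetical sketch)
name: Publish to dev.to
on:
  push:
    branches: [main]
    paths:
      - "posts/**/*.md"   # only run when article files change

jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # A dev.to publishing action reads each Markdown file's YAML
      # front-matter and creates or updates the article via the dev.to API.
      - name: Publish changed articles
        uses: sinedied/publish-devto@v2
        with:
          devto_key: ${{ secrets.DEVTO_TOKEN }}
          files: "posts/**/*.md"
```

Because the workflow only fires on pushes to `main`, merging the AI-opened PR is the single human action that triggers publication.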

Explore the actual repository powering this workflow:

👉 (link omitted in original)

2. Notion Schema (The “Data Model” for Front‑Matter)

When Claude opens a Pull Request, it extracts the draft from Notion and transforms it into a standard Markdown file. During this process, it generates the YAML front‑matter using the exact properties stored in the Notion database. This strict schema is the secret sauce that allows the AI to act predictably.

Notion Schema

| Property | Description |
| --- | --- |
| title | The main headline of your post. |
| published | Boolean to control visibility. |
| description | Used for SEO and dev.to’s summary. |
| tags | Automates categorization. |
| organization_username | Publishes under a specific dev.to organization (used by the Publish to Dev.to Organization Action). |
| canonical_url | Maintains SEO integrity for cross‑posted content. |
| cover_image | URL for the article header image. |
| filename | Exact ID for the .md file in the GitHub repo. |
| github_branch | Branch that the AI should target for the PR. |
| Content | The page body – the shared canvas for AI generation and human editing. |
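A Markdown file generated from this schema might look like the sketch below. All values are hypothetical placeholders; note that `filename` and `github_branch` control where the file lands in the repository rather than appearing in the front‑matter itself.

```markdown
---
title: "Zero-Friction Publishing with Notion MCP"
published: true
description: "A human-in-the-loop agentic CMS built on Notion MCP."
tags: notion, mcp, ai, automation
organization_username: my-org
canonical_url: https://example.com/zero-friction-publishing
cover_image: https://example.com/images/cover.png
---

The page body from Notion, converted to Markdown, goes here.
```

Because every key maps one-to-one to a Notion property, the AI never has to guess at metadata: it reads the database row and emits the corresponding front‑matter.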

3. Overcoming the “Stringification Hell” (Forcing Self‑Verification)

During testing, I hit a practical limit of current LLMs: “Stringification Hell.” When Claude tried to pass the final, complex Markdown (with YAML, Mermaid diagrams, and newlines) to the GitHub MCP, it struggled to escape the massive JSON payload perfectly. To avoid syntax errors, I forced the model to:

  1. Export the Markdown to a temporary Notion page – letting the model treat it as plain text.
  2. Read the page back via MCP – guaranteeing proper JSON encoding.
  3. Submit the retrieved content to GitHub – now safely escaped.

This extra round‑trip added a little latency but eliminated malformed PR bodies, proving that a small “self‑verification” step can dramatically improve reliability.
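Why the read‑back works can be illustrated in plain Python (a sketch of the escaping problem, not the actual MCP calls): when the model writes the JSON payload by hand it must escape every newline and quote itself, whereas a string fetched back through MCP can be serialized mechanically.

```python
import json

FENCE = "`" * 3  # avoid a literal triple backtick inside this snippet

# A payload like the ones Claude had to hand to the GitHub MCP:
# Markdown with YAML front-matter, a Mermaid code fence, quotes, newlines.
markdown = (
    "---\n"
    'title: "Zero-Friction Publishing"\n'
    "---\n\n"
    f"{FENCE}mermaid\ngraph LR; A --> B\n{FENCE}\n"
)

# Naive approach: the model emits the JSON body as one big string and must
# escape every control character itself. Raw newlines inside a JSON string
# are illegal, so a single slip produces a malformed PR body.
naive = '{"content": "' + markdown + '"}'

# After the Notion round-trip the text is just an ordinary string value,
# and a standard serializer performs the escaping correctly every time.
safe = json.dumps({"content": markdown})

def is_valid_json(s: str) -> bool:
    try:
        json.loads(s)
        return True
    except json.JSONDecodeError:
        return False
```

The hand-built payload fails to parse, while the serialized one round-trips the Markdown byte-for-byte, which is exactly the guarantee the export-and-read-back step buys.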

TL;DR

  • Zero‑code Agentic Publishing Pipeline – conversation‑driven, human‑in‑the‑loop.
  • Notion MCP stores drafts, metadata, and version history.
  • GitHub MCP + Actions turn Notion pages into Markdown + YAML, open PRs, and publish to dev.to.
  • Schema‑driven front‑matter guarantees deterministic AI behavior.
  • Self‑verification (read‑back) solves the “Stringification Hell” problem.

I hope this submission demonstrates a novel, reproducible workflow that pushes the boundaries of what can be achieved with Notion MCP and modern LLM orchestration. Good luck to everyone participating!

How I Used Notion MCP

Notion MCP is the absolute core of this Agentic Workflow. It acts as the critical bridge that transforms Notion from a passive knowledge base into an active, collaborative workspace for AI and humans.

MCP Write: I utilized the MCP to allow the LLM to dynamically create pages within my “Publishing CMS” database. It maps its generated ideas perfectly to the title, filename, and all metadata properties needed for production.

MCP Read: After human intervention (my edits), the LLM uses MCP to read the exact Notion page, ensuring that the final output perfectly preserves human nuance before transforming it into strict Markdown/YAML for developers.

What it unlocks:

This integration completely eliminates context switching and cognitive load. By leveraging Notion MCP, I don’t have to copy‑paste between a chat window, a text editor, and a terminal. The AI orchestrates the mundane system operations (formatting, API calls, Git commands), while I retain 100% creative control inside Notion’s beautiful editing environment. This is the ultimate human‑in‑the‑loop scaling system for any modern creator or developer!
