Deploy Nano Banana MCP Service in 5 Minutes: Integrate AI Image Generation into Your Workflow

Published: December 5, 2025 at 01:44 AM EST
3 min read
Source: Dev.to

What is MCP and Why Should You Care?

Model Context Protocol (MCP) is an open protocol that enables AI assistants to securely access external tools and data sources. It provides a standardized way for AI tools to talk to your services.
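
Under the hood, MCP traffic is JSON-RPC 2.0. As a rough illustration (the tool name and arguments below are made up, not tied to any particular server), a client asking a server to run a tool sends something like:

```python
import json

# A minimal JSON-RPC 2.0 "tools/call" request, as an MCP client would send it.
# The tool name and arguments here are illustrative only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "generate_image",
        "arguments": {"prompt": "A red panda in watercolor"},
    },
}

print(json.dumps(request, indent=2))
```

The server answers with a matching JSON-RPC response carrying the tool's result; the concrete tool names genai-mcp exposes are shown in the integration sections below.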

genai-mcp is an open‑source MCP server that wraps Google Gemini and compatible backends into a standardized HTTP endpoint. It provides:

  • Image generation from text prompts
  • Image editing with natural language instructions
  • S3/OSS integration for automatic image storage
  • Streamable HTTP transport compatible with major MCP clients
  • Multiple backend support (Google Gemini official API, third‑party Gemini‑compatible backends like Nano Banana)
  • Multiple AI model support, including Tongyi Wanxiang 2.5 (faster and more cost‑effective)

Prerequisites

  • Go 1.21+ (if building from source)
  • A Nano Banana API key (or Google Gemini API key, or Tongyi Wanxiang API key)
  • Optional: S3/OSS bucket for image storage

Step 1: Quick Deployment

Option A: Download a Prebuilt Binary

Download the binary for your platform from the Releases page:

# macOS (Apple Silicon)
wget https://github.com/adamydwang/genai-mcp/releases/download/release%2F0.2/genai-mcp.darwin.arm64
chmod +x genai-mcp.darwin.arm64
mv genai-mcp.darwin.arm64 genai-mcp

# Linux
wget https://github.com/adamydwang/genai-mcp/releases/download/release%2F0.2/genai-mcp.linux.amd64
chmod +x genai-mcp.linux.amd64
mv genai-mcp.linux.amd64 genai-mcp

Create a configuration file:

# Clone the repo to get env.example
git clone https://github.com/adamydwang/genai-mcp.git
cd genai-mcp
cp env.example .env

Option B: Build from Source

git clone https://github.com/adamydwang/genai-mcp.git
cd genai-mcp
go build .

Step 2: Configure Your Backend

genai-mcp supports multiple backends. Choose one based on your needs.

Option 1: Nano Banana (Third‑party Gemini‑compatible Backend)

Edit your .env file:

# Use Gemini provider (works with Nano Banana and other compatible backends)
GENAI_PROVIDER=gemini

# Point to your Nano Banana endpoint
GENAI_BASE_URL=https://your-nano-banana-endpoint.com

# Your Nano Banana API key
GENAI_API_KEY=your_nano_banana_api_key_here

# Model name (adjust based on your Nano Banana setup)
GENAI_GEN_MODEL_NAME=gemini-3-pro-image-preview
GENAI_EDIT_MODEL_NAME=gemini-3-pro-image-preview

# Request timeout (seconds)
GENAI_TIMEOUT_SECONDS=120

# Image output format: 'base64' or 'url'
GENAI_IMAGE_FORMAT=url

Option 2: Google Gemini Official API

GENAI_PROVIDER=gemini
GENAI_BASE_URL=https://generativelanguage.googleapis.com
GENAI_API_KEY=your_google_gemini_api_key_here
GENAI_GEN_MODEL_NAME=gemini-3-pro-image-preview
GENAI_EDIT_MODEL_NAME=gemini-3-pro-image-preview
GENAI_TIMEOUT_SECONDS=120
GENAI_IMAGE_FORMAT=url

Option 3: Tongyi Wanxiang 2.5 (Faster & More Cost‑Effective)

# Use Wan provider
GENAI_PROVIDER=wan

# Tongyi Wanxiang endpoint
GENAI_BASE_URL=https://dashscope.aliyuncs.com

# Your DashScope API key
GENAI_API_KEY=your_dashscope_api_key_here

# Tongyi Wanxiang model names
GENAI_GEN_MODEL_NAME=wan2.5-t2i-preview
GENAI_EDIT_MODEL_NAME=wan2.5-i2i-preview

GENAI_TIMEOUT_SECONDS=120
GENAI_IMAGE_FORMAT=url

Server & OSS Configuration

Add these to your .env:

# Server configuration
SERVER_ADDRESS=0.0.0.0
SERVER_PORT=8080

# OSS/S3 Configuration (required if GENAI_IMAGE_FORMAT=url)
OSS_ENDPOINT=oss-cn-beijing.aliyuncs.com   # or your S3 endpoint
OSS_REGION=us-east-1
OSS_ACCESS_KEY=your_access_key
OSS_SECRET_KEY=your_secret_key
OSS_BUCKET=your_bucket_name
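
When GENAI_IMAGE_FORMAT=url, the OSS settings above are effectively mandatory, since generated images must be uploaded somewhere before a URL can be returned. A quick sanity check on a .env file might look like this sketch (key names taken from the examples above):

```python
# Sketch: verify that a .env file defines the OSS keys required
# when GENAI_IMAGE_FORMAT=url. Key names follow the examples above.
REQUIRED_OSS_KEYS = {"OSS_ENDPOINT", "OSS_REGION", "OSS_ACCESS_KEY",
                     "OSS_SECRET_KEY", "OSS_BUCKET"}

def parse_env(text: str) -> dict:
    """Parse simple KEY=value lines, ignoring blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop inline comments
        if "=" in line:
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env

def missing_oss_keys(env: dict) -> set:
    """Return the OSS keys that are unset but required for url output."""
    if env.get("GENAI_IMAGE_FORMAT") != "url":
        return set()  # OSS is not required for base64 output
    return REQUIRED_OSS_KEYS - {k for k, v in env.items() if v}

sample = """
GENAI_IMAGE_FORMAT=url
OSS_ENDPOINT=oss-cn-beijing.aliyuncs.com
OSS_REGION=us-east-1
OSS_ACCESS_KEY=ak
OSS_SECRET_KEY=sk
"""
print(missing_oss_keys(parse_env(sample)))  # OSS_BUCKET is missing
```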

Step 3: Start the Server

./genai-mcp

You should see output similar to:

INFO Starting GenAI MCP Server
INFO Server configuration loaded
INFO MCP server starting address=0.0.0.0:8080/mcp

The MCP endpoint is now available at http://localhost:8080/mcp.

Step 4: Test the Server

The project includes Python test scripts. First, install dependencies:

cd tests
pip install -r requirements.txt

Test the connection:

# List available tools
python test_list_tools.py

# Generate an image
python test_generate_image.py "A futuristic cityscape at sunset"

# Edit an image
python test_edit_image.py "Make it more vibrant" "https://example.com/image.jpg"

Step 5: Integrate into Your Workflow Tools

Integration with Cursor

Add the MCP server to your Cursor settings:

{
  "mcpServers": {
    "genai-mcp": {
      "url": "http://localhost:8080/mcp",
      "transport": "http"
    }
  }
}

Integration with Claude Desktop

Claude Desktop launches MCP servers as local stdio processes, so a remote HTTP endpoint needs a stdio-to-HTTP bridge; one option is the mcp-remote npm package. Edit Claude Desktop’s config (e.g., ~/Library/Application Support/Claude/claude_desktop_config.json on macOS):

{
  "mcpServers": {
    "genai-mcp": {
      "command": "npx",
      "args": ["mcp-remote", "http://localhost:8080/mcp"]
    }
  }
}

Integration with Dify

In Dify, add a custom tool:

  • Name: Gemini Image Generator
  • Method: POST
  • URL: http://your-server:8080/mcp
  • Headers: Content-Type: application/json
  • Body:
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "gemini_generate_image",
    "arguments": {
      "prompt": "{{prompt}}"
    }
  }
}

Integration with n8n

Create an HTTP Request node:

  • Method: POST
  • URL: http://localhost:8080/mcp
  • Headers:
{
  "Content-Type": "application/json"
}
  • Body (JSON):
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "gemini_generate_image",
    "arguments": {
      "prompt": "{{ $json.prompt }}"
    }
  }
}

Integration with Custom Applications

Simple Python example:

import requests

def generate_image(prompt: str, mcp_url: str = "http://localhost:8080/mcp"):
    """Generate an image using the MCP server."""
    payload = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
            "name": "gemini_generate_image",
            "arguments": {"prompt": prompt}
        }
    }

    # Image generation can be slow; match the server's timeout setting.
    response = requests.post(mcp_url, json=payload, timeout=120)
    response.raise_for_status()
    return response.json()
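
An editing helper follows the same pattern. The tool name gemini_edit_image and the image_url argument key are assumptions modeled on the test scripts above; confirm the real schema via tools/list:

```python
import requests

def build_edit_payload(prompt: str, image_url: str) -> dict:
    """Build the tools/call request for the edit tool.

    The tool name and argument keys mirror the generation example and the
    test scripts; confirm them against tools/list on your server.
    """
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
            "name": "gemini_edit_image",
            "arguments": {"prompt": prompt, "image_url": image_url},
        },
    }

def edit_image(prompt: str, image_url: str,
               mcp_url: str = "http://localhost:8080/mcp") -> dict:
    """Edit an existing image via the MCP server."""
    response = requests.post(mcp_url,
                             json=build_edit_payload(prompt, image_url),
                             timeout=120)
    response.raise_for_status()
    return response.json()
```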