Building DevRel Copilot: A Technical Deep Dive into AI-Powered Developer Relations

Published: March 18, 2026 at 01:29 PM EDT
3 min read
Source: Dev.to

Introduction

Developer Relations (DevRel) blends engineering, content creation, and community building. To streamline the workflow of writing blog posts, planning events, and building demos, we created DevRel Copilot—a command center that leverages Google’s Gemini 3.1 Pro to generate structured, actionable strategies.

Technical Architecture

The application is built with React 19 and Vite, emphasizing a clean, modular structure.

  • State Management – Centralized in App.tsx.
  • AI Logic – Encapsulated in a dedicated library (src/lib).

Multi‑Provider Support

While Gemini remains the default, users can plug in their own keys for OpenAI, OpenRouter, or local Ollama instances. Because OpenRouter and Ollama both expose OpenAI-compatible endpoints, the openai SDK provides a unified interface for all non‑Gemini models.

// src/lib/aiClient.ts
import { GoogleGenAI } from '@google/genai';
import OpenAI from 'openai';

function getAIClient(settings: AppSettings) {
  const provider = settings.activeProvider;
  if (provider === 'gemini') {
    return new GoogleGenAI({
      apiKey: settings.providers.gemini.apiKey || process.env.GEMINI_API_KEY,
    });
  }
  // OpenRouter and Ollama speak the OpenAI wire format, so one client covers all three.
  return new OpenAI({
    apiKey: settings.providers[provider].apiKey,
    baseURL: settings.providers[provider].baseUrl,
    // Required because the key is used from client-side code.
    dangerouslyAllowBrowser: true,
  });
}
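The provider-specific details (which SDK to construct, which base URL to fall back to) can also be factored into a small pure helper so the branching logic stays testable. The sketch below is an assumption about how that could look — `resolveProviderConfig` and the `AppSettings` shape are illustrative names, not the project's actual code — using the commonly documented default endpoints for each OpenAI-compatible service:

```typescript
// Hypothetical helper illustrating the provider dispatch above.
// resolveProviderConfig and the AppSettings shape are assumptions.
type ProviderName = 'gemini' | 'openai' | 'openrouter' | 'ollama';

interface ProviderSettings {
  apiKey?: string;
  baseUrl?: string;
}

interface AppSettings {
  activeProvider: ProviderName;
  providers: Record<ProviderName, ProviderSettings>;
}

// Returns constructor options for the active provider, falling back to
// the usual default base URLs for OpenAI-compatible endpoints.
function resolveProviderConfig(settings: AppSettings) {
  const name = settings.activeProvider;
  const cfg = settings.providers[name] ?? {};
  if (name === 'gemini') {
    return { sdk: 'google' as const, apiKey: cfg.apiKey ?? '' };
  }
  const defaults: Record<string, string> = {
    openai: 'https://api.openai.com/v1',
    openrouter: 'https://openrouter.ai/api/v1',
    ollama: 'http://localhost:11434/v1',
  };
  return {
    sdk: 'openai' as const,
    apiKey: cfg.apiKey ?? 'ollama', // Ollama ignores the key, but the SDK requires one
    baseURL: cfg.baseUrl ?? defaults[name],
  };
}
```

Keeping this resolution pure means the branching can be unit-tested without instantiating either SDK.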

Structured Responses with JSON Schema

A major challenge with LLMs is obtaining consistent, machine‑readable data. We solved this by using the responseSchema feature in the @google/genai SDK, ensuring the model always returns a valid JSON object that matches the UI’s expectations.

// src/lib/gemini.ts
import { Type } from '@google/genai';

const schemas = {
  content: {
    type: Type.OBJECT,
    properties: {
      contentPlan: { type: Type.STRING, description: "High-level content plan" },
      twitterThread: { type: Type.ARRAY, items: { type: Type.STRING } },
      // ... more properties
    },
    required: ["contentPlan", "twitterThread" /*, ...*/],
  },
  // ... other module schemas
};

By defining these schemas, the AI’s response can be directly mapped to React components without complex parsing or error‑prone regex.
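On the consuming side, a schema-constrained response reduces to a `JSON.parse` plus a light shape check before the data reaches the components. The sketch below shows what that could look like; the model name and the `parseContentStrategy` helper are assumptions for illustration, not the project's actual code:

```typescript
// Sketch of consuming a schema-constrained response (assumed helper names).
// The SDK call itself would look roughly like:
//   const response = await ai.models.generateContent({
//     model: 'gemini-2.5-flash',            // assumed model name
//     contents: prompt,
//     config: {
//       responseMimeType: 'application/json',
//       responseSchema: schemas.content,    // schema from above
//     },
//   });
//   const strategy = parseContentStrategy(response.text ?? '');

interface ContentStrategy {
  contentPlan: string;
  twitterThread: string[];
}

// With responseSchema in place the reply is guaranteed-parseable JSON,
// so a parse plus a required-key check replaces regex-based extraction.
function parseContentStrategy(raw: string): ContentStrategy {
  const data = JSON.parse(raw);
  for (const key of ['contentPlan', 'twitterThread']) {
    if (!(key in data)) throw new Error(`missing required key: ${key}`);
  }
  return data as ContentStrategy;
}
```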

Regenerating Specific Sections

A key UX feature is the ability to regenerate only a chosen part of the strategy (e.g., just the Twitter thread) while keeping the rest intact.

// src/lib/regeneration.ts
export async function regenerateSection(
  params: GenerateParams,
  sectionKey: string,
  existingData: any
) {
  const prompt = `
    ... (Product Context) ...
    Existing Strategy: ${JSON.stringify(existingData, null, 2)}
    Task: Regenerate ONLY the "${sectionKey}" section. Provide a new, improved version.
  `;
  // ... API call with a schema that includes only the requested sectionKey
}

This “contextual regeneration” maintains consistency across the strategy while delivering fresh ideas for the targeted section.
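The "schema that includes only the requested sectionKey" mentioned in the comment could be derived mechanically from the full module schema, so regeneration never drifts out of sync with the main definitions. The helper below is a sketch of that idea under assumed names (`buildSectionSchema`, `ObjectSchema`), not the project's actual implementation:

```typescript
// Hypothetical helper: narrow a full module schema down to one section,
// so the model returns only the field being regenerated.
interface ObjectSchema {
  type: string;
  properties: Record<string, unknown>;
  required: string[];
}

function buildSectionSchema(full: ObjectSchema, sectionKey: string): ObjectSchema {
  if (!(sectionKey in full.properties)) {
    throw new Error(`unknown section: ${sectionKey}`);
  }
  return {
    type: full.type,
    // Keep only the requested property and mark it required.
    properties: { [sectionKey]: full.properties[sectionKey] },
    required: [sectionKey],
  };
}
```

Deriving the narrowed schema from the full one means a schema change in one place automatically flows through to regeneration.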

UI Design

We aimed for a professional‑grade tool rather than a simple chatbot, using Tailwind CSS and Motion for a sleek, responsive interface.

  • Fixed Sidebar – Allows instant switching between modules (Content, Events, Demos).
  • State‑Driven Panel – Swaps OutputPanel content based on the active tab.
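A state-driven panel typically boils down to a lookup from the active tab to a panel specification. The module names below mirror the sidebar items; everything else (the `PanelSpec` shape, titles, hints) is an assumption for illustration:

```typescript
// Sketch of the tab-to-panel mapping that drives OutputPanel.
// Tab values mirror the sidebar modules; the specs are illustrative.
type ModuleTab = 'content' | 'events' | 'demos';

interface PanelSpec {
  title: string;
  emptyHint: string;
}

const PANELS: Record<ModuleTab, PanelSpec> = {
  content: { title: 'Content Strategy', emptyHint: 'Generate a content plan to get started.' },
  events: { title: 'Event Plan', emptyHint: 'Generate an event strategy to get started.' },
  demos: { title: 'Demo Ideas', emptyHint: 'Generate demo concepts to get started.' },
};

// OutputPanel reads the active tab from state and renders the matching spec.
function panelFor(tab: ModuleTab): PanelSpec {
  return PANELS[tab];
}
```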

Input Validation & Loading State

To ensure high‑quality AI outputs, the “Generate” button is disabled until core product details are provided. A custom loading state with a progress indicator manages user expectations during the 5‑10 second generation window.

// src/components/InputPanel.tsx
const handleGenerateClick = () => {
  if (!data.product_name.trim() || !data.product_description.trim()) {
    setShowValidation(true);
    return;
  }
  onGenerate();
};
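The article does not show the progress logic itself. A common pattern for an unpredictable 5‑10 second wait — an assumption here, not necessarily what the project does — is an asymptotic estimate that rises quickly at first and levels off below 100% until the response actually arrives:

```typescript
// Hypothetical progress estimator for an open-ended wait.
// 1 - e^(-t/T) climbs fast early, then flattens; capping at 99 keeps the
// bar from claiming completion before the API responds.
function estimateProgress(elapsedMs: number, expectedMs = 8000): number {
  return Math.min(99, Math.round((1 - Math.exp(-elapsedMs / expectedMs)) * 100));
}
```

Driven from a `setInterval` tick, this gives users a sense of motion without ever stalling at a hard-coded percentage.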

Key Takeaways

  • Prompt Engineering – The system prompt (“You are an expert Developer Relations strategist…”) sets the right tone and avoids generic marketing fluff.
  • JSON Schema – Using responseSchema eliminated ~90% of parsing errors typical with LLM outputs.
  • Contextual Regeneration – Supplying the existing strategy during regeneration markedly improves consistency.

DevRel Copilot is more than a wrapper around an LLM; it’s a purpose‑built tool that understands the specific needs of DevRel professionals. By combining structured AI outputs with a modern React frontend, we’ve created a powerful MVP that can be easily extended with new modules and features.


Source code: (replace with the actual repository URL)
