Build AI-powered React apps using Genkit

Published: (December 7, 2025 at 01:27 AM EST)
5 min read
Source: Dev.to

Introduction

I’ve spent the last few months working with various AI frameworks, and Genkit has become my go‑to for production applications. It’s Google’s open‑source framework for building AI workflows, and it solves many of the headaches that arise when wiring up LLMs to real applications.

If you’ve tried building AI features into production apps, you’ve probably run into the same issues:

  • Prompt spaghetti – prompts start as simple strings, then grow into multi‑line templates with context injection, and eventually end up scattered across services with no clear ownership.
  • Debugging black holes – something goes wrong in the AI pipeline. Was it the prompt? The model? The context you passed in? Without proper observability, tracking down the issue is a nightmare.
  • Integration mess – you need to call an external API, pass that data to an LLM, maybe do some post‑processing, then return a structured response. This quickly becomes ugly without a clean abstraction.

Genkit addresses these problems by giving you a flow primitive—think of it as a function that can contain multiple steps (API calls, LLM invocations, transformations) with built‑in logging and a dev UI for inspection.
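
To make the idea concrete, here is a minimal sketch of what a flow looks like (using the current genkit package and its Google AI plugin; the flow name and prompt are placeholders, and ai.run is simply how you wrap arbitrary code as a named, traceable step):

import { genkit, z } from 'genkit';
import { googleAI, gemini15Flash } from '@genkit-ai/googleai';

const ai = genkit({ plugins: [googleAI()] });

// A flow is a typed async function; the ai.run() step and the generate()
// call each show up as separate steps in the Genkit Dev UI trace.
export const greetingFlow = ai.defineFlow(
  { name: 'greetingFlow', inputSchema: z.object({ name: z.string() }) },
  async ({ name }) => {
    const cleaned = await ai.run('normalize-name', async () => name.trim());
    const result = await ai.generate({
      model: gemini15Flash,
      prompt: `Write a one-line greeting for ${cleaned}.`,
    });
    return result.text;
  }
);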

We’ll create a city analyzer that:

  1. Takes a city name from the frontend.
  2. Fetches real weather data from an external API.
  3. Passes that data to Gemini for analysis.
  4. Returns both raw data and AI‑generated insights.

Nothing revolutionary, but it demonstrates the pattern you’ll use for most real‑world AI features.

Backend Setup

mkdir genkit-backend && cd genkit-backend
npm init -y
npm install genkit @genkit-ai/googleai express cors
npm install -D typescript ts-node @types/node @types/express genkit-cli
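
Both the weather lookup and the Google AI plugin are configured through environment variables. Something like the following works; current versions of the googleAI plugin look for GEMINI_API_KEY (you can also pass an apiKey option to googleAI() directly), and WEATHER_API_KEY is whatever the code below reads:

export WEATHER_API_KEY="your-api-ninjas-key"
export GEMINI_API_KEY="your-google-ai-studio-key"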

Flow Implementation (src/flows/analyzeCity.ts)

// src/flows/analyzeCity.ts
import { genkit, z } from 'genkit';
import { googleAI, gemini15Flash } from '@genkit-ai/googleai';

// The googleAI plugin picks up a Gemini API key from the environment
// (or you can pass apiKey to googleAI() explicitly).
const ai = genkit({
  plugins: [googleAI()],
});

interface WeatherData {
  temp: number;
  humidity: number;
  wind_speed: number;
}

async function fetchWeather(city: string): Promise<WeatherData> {
  const response = await fetch(
    `https://api.api-ninjas.com/v1/weather?city=${encodeURIComponent(city)}`,
    {
      headers: { 'X-Api-Key': process.env.WEATHER_API_KEY! },
    }
  );

  if (!response.ok) {
    throw new Error(`Weather API failed: ${response.status}`);
  }

  return response.json();
}

export const analyzeCityFlow = ai.defineFlow(
  {
    name: 'analyzeCityFlow',
    inputSchema: z.object({ city: z.string() }),
  },
  async ({ city }) => {
    // Step 1: Get external data
    const weather = await fetchWeather(city);

    // Step 2: Generate AI analysis
    const result = await ai.generate({
      model: gemini15Flash,
      prompt: `Given the current weather in ${city}:
- Temperature: ${weather.temp}°C
- Humidity: ${weather.humidity}%
- Wind Speed: ${weather.wind_speed} km/h

Write 2‑3 sentences describing what it's like there right now and any practical suggestions for someone visiting today.`,
    });

    return {
      city,
      weather,
      analysis: result.text,
    };
  }
);

Express Server (src/server.ts)

// src/server.ts
import express from 'express';
import cors from 'cors';
import { analyzeCityFlow } from './flows/analyzeCity';

const app = express();
app.use(cors());
app.use(express.json());

app.post('/api/analyze-city', async (req, res) => {
  try {
    const result = await analyzeCityFlow(req.body);
    res.json(result);
  } catch (error) {
    console.error('Flow failed:', error);
    res.status(500).json({ error: 'Analysis failed' });
  }
});

const PORT = process.env.PORT || 4000;
app.listen(PORT, () => {
  console.log(`Server running on port ${PORT}`);
});
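
To run the server during development, ts-node from the dev dependencies is enough, assuming Node 18+ so that the built-in fetch used in fetchWeather is available:

npx ts-node src/server.ts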

You can test flows in isolation by running the app through the Genkit CLI (installed above as a dev dependency):

npx genkit start -- npx ts-node src/server.ts

This spins up the local Developer UI where you can invoke flows with test inputs and inspect each step, saving hours of debugging.

Frontend Setup

npm create vite@latest genkit-react -- --template react-ts
cd genkit-react
npm install

React App (src/App.tsx)

// src/App.tsx
import { useState } from 'react';

interface AnalysisResult {
  city: string;
  weather: {
    temp: number;
    humidity: number;
    wind_speed: number;
  };
  analysis: string;
}

function App() {
  const [city, setCity] = useState('');
  const [result, setResult] = useState<AnalysisResult | null>(null);
  const [loading, setLoading] = useState(false);
  const [error, setError] = useState<string | null>(null);

  const handleAnalyze = async () => {
    if (!city.trim()) return;

    setLoading(true);
    setError(null);

    try {
      const response = await fetch('http://localhost:4000/api/analyze-city', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ city }),
      });

      if (!response.ok) throw new Error('Request failed');

      setResult(await response.json());
    } catch {
      setError('Failed to analyze city. Check if the backend is running.');
    } finally {
      setLoading(false);
    }
  };

  return (
    <div style={{ maxWidth: 600, margin: '0 auto', padding: 20 }}>
      <h2>City Weather Analyzer</h2>

      <div style={{ display: 'flex', gap: 8, marginBottom: 12 }}>
        <input
          value={city}
          onChange={e => setCity(e.target.value)}
          placeholder="Enter a city name"
          onKeyDown={e => e.key === 'Enter' && handleAnalyze()}
          style={{ flex: 1, padding: 8 }}
        />
        <button onClick={handleAnalyze} disabled={loading}>
          {loading ? 'Analyzing...' : 'Analyze'}
        </button>
      </div>

      {error && <p style={{ color: 'red' }}>{error}</p>}

      {result && (
        <div>
          <h3>{result.city}</h3>
          <p>{result.analysis}</p>
          <pre>{JSON.stringify(result.weather, null, 2)}</pre>
        </div>
      )}
    </div>
  );
}

export default App;

Why This Setup Works

Separation of Concerns

The React app knows nothing about LLMs or external APIs—it simply calls an endpoint and renders the response. Swapping Gemini for GPT‑4 or changing the data source requires no frontend changes.

Testability

Flows are just async functions. They can be unit‑tested, mocked, and verified independently of the UI.
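
For example, because analyzeCityFlow is directly callable, a quick smoke test doesn't need Express at all (the file name here is hypothetical, and it assumes the same environment variables are set):

// scripts/smoke.ts
import { analyzeCityFlow } from '../src/flows/analyzeCity';

async function main() {
  // Invoke the flow exactly as the Express route does, just without HTTP.
  const result = await analyzeCityFlow({ city: 'Lisbon' });
  console.log(result.analysis);
}

main().catch(err => {
  console.error(err);
  process.exit(1);
});

Run it with npx ts-node scripts/smoke.ts; for real unit tests you would stub fetch and the model call with your test runner of choice.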

Observability Out of the Box

The Genkit dev UI shows exactly what went into each step and what came out, making production debugging feasible.

Incremental Complexity

Start with a single‑step flow and add steps as needed (caching, moderation, etc.). The architecture scales with your requirements.
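
As one example of an extra step, a small in-memory cache in front of the weather call keeps repeat requests for the same city from hitting the external API. A sketch, dropped into analyzeCity.ts next to fetchWeather (the TTL is arbitrary):

// Naive per-process cache for weather lookups; swap for Redis or similar in production.
const weatherCache = new Map<string, { data: WeatherData; expires: number }>();
const WEATHER_TTL_MS = 10 * 60 * 1000; // 10 minutes

async function fetchWeatherCached(city: string): Promise<WeatherData> {
  const key = city.toLowerCase();
  const hit = weatherCache.get(key);
  if (hit && hit.expires > Date.now()) return hit.data;

  const data = await fetchWeather(city);
  weatherCache.set(key, { data, expires: Date.now() + WEATHER_TTL_MS });
  return data;
}

The flow's first step would then call fetchWeatherCached instead of fetchWeather.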

Where to Go From Here

This example barely scratches the surface. In a production setting you’d likely add:

  • Input validation and rate limiting (sketched after this list)
  • Caching for external API calls
  • Streaming responses for better UX
  • Authentication
  • Robust error handling with retries
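
For instance, the first two items can be handled at the Express layer with a couple of well-known packages: zod for validation and express-rate-limit for throttling. Neither is part of the setup above, so treat this as a sketch that replaces the earlier route handler:

import rateLimit from 'express-rate-limit';
import { z } from 'zod';

// Cap each client at 20 analysis requests per minute.
app.use('/api/analyze-city', rateLimit({ windowMs: 60_000, max: 20 }));

const bodySchema = z.object({ city: z.string().min(1).max(100) });

app.post('/api/analyze-city', async (req, res) => {
  const parsed = bodySchema.safeParse(req.body);
  if (!parsed.success) {
    res.status(400).json({ error: 'A non-empty city name is required' });
    return;
  }
  // ...then call analyzeCityFlow(parsed.data) exactly as before
});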

Genkit also supports retrieval‑augmented generation (RAG), evaluations, and deployment to Firebase or Cloud Run. See the official Genkit documentation for detailed guidance.

If you’re building AI features into production apps and are tired of ad‑hoc solutions, give Genkit a serious look. The learning curve is minimal for anyone comfortable with TypeScript, and the structure it provides pays dividends as your AI logic grows more complex.
