Setting Up Your Platform Prereqs - Build AI Platforms From Scratch #2

Published: December 12, 2025 at 04:25 PM EST
4 min read
Source: Dev.to

šŸ“ŗ View this module with video & slides

Quick Reminder

  • This course isn’t about writing code. A code repository in the External Resources section contains common patterns for calling third‑party LLM APIs, parsing responses, counting tokens, and assessing cost.
  • For a more focused learning environment, take the course on nicholasmgoldstein.com/courses.
  • The course is free; you can support the creator by liking, subscribing, or checking out AI platforms like Emstrata (emstrata.com).
  • Feedback is welcome—comment with topics you’d like covered more thoroughly or any unclear points.

Picking an LLM

Choosing early prevents later paralysis.

  • LLM Strengths and Learning: Different models have distinct strengths and textual voices. The best way to learn them is by iterating on system prompts rather than relying solely on benchmarks.
  • Recommended LLMs: The author's personal favorite is Claude (Anthropic), versions 3.7–4.0. ChatGPT (OpenAI) and Gemini (Google) are also solid options. All typically require a credit card on file; see each provider's pricing page for details.
  • Multi‑LLM Strategy: Use different models for different tasks. For example, Claude 4.0 excels at heavy analysis and rule‑following, while smaller models like Mistral handle simpler tasks well, saving money without sacrificing quality (see the routing sketch after this list).
  • Additional Tools: OpenAI and Google often bundle built‑in tools such as text‑to‑speech, transcription, and image generation, which Anthropic may lack.
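
To make the multi-model idea concrete, here is a minimal routing sketch in Go, matching the example backend stack later in this module. The task categories and model IDs are illustrative assumptions; check each provider's documentation for current model names and pricing.

// Go: illustrative multi-LLM routing
package main

import "fmt"

// TaskComplexity is an assumed way of labeling work before sending it to a model.
type TaskComplexity int

const (
    SimpleTask  TaskComplexity = iota // summaries, reformatting, short extractions
    ComplexTask                       // heavy analysis, strict rule-following
)

// pickModel routes a task to a model ID. The IDs below are placeholders;
// confirm current names and pricing on each provider's docs.
func pickModel(c TaskComplexity) string {
    switch c {
    case ComplexTask:
        return "claude-sonnet-4-5" // stronger model for analysis-heavy work
    default:
        return "mistral-small-latest" // cheaper model for routine tasks
    }
}

func main() {
    fmt.Println(pickModel(SimpleTask))  // mistral-small-latest
    fmt.Println(pickModel(ComplexTask)) // claude-sonnet-4-5
}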

The Choice is Yours

API workbench links

  • Anthropic –
  • OpenAI –
  • Google –
  • Mistral –

Anatomy of an AI Request

How building in the workbench looks (a code sketch of the same anatomy follows the list):

  • Prompt Library System: Most workbenches let you save prompts for testing. Adopt a naming convention for high‑volume usage.
  • Model Selection: Click the model name (e.g., claude-sonnet-4-5…) to switch between available models.
  • System Prompt: The static instruction sent to the LLM; defines behavior and context.
  • User Request: The data sent for a specific instance (the term ā€œUserā€ can be misleading outside chatbot contexts).
  • Expected Response: The API’s LLM output, which you can test and iterate on before integrating.
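
The same anatomy carries over from the workbench to an API call. Below is a rough Go sketch of a request to Anthropic's Messages API; the model ID, prompt text, and max_tokens value are placeholders, and the endpoint and headers should be confirmed against Anthropic's current API documentation.

// Go: anatomy of an AI request (system prompt + user request -> response)
package main

import (
    "bytes"
    "encoding/json"
    "fmt"
    "io"
    "net/http"
    "os"
)

// message mirrors the "User Request" part of the anatomy: role + content.
type message struct {
    Role    string `json:"role"`
    Content string `json:"content"`
}

// request bundles the model, the static system prompt, and the per-call messages.
type request struct {
    Model     string    `json:"model"`
    MaxTokens int       `json:"max_tokens"`
    System    string    `json:"system"`   // System Prompt: defines behavior and context
    Messages  []message `json:"messages"` // User Request: the data for this specific call
}

func main() {
    body, _ := json.Marshal(request{
        Model:     "claude-sonnet-4-5", // placeholder model ID
        MaxTokens: 1024,
        System:    "You are a concise assistant that answers in one sentence.",
        Messages:  []message{{Role: "user", Content: "What is a system prompt?"}},
    })

    req, _ := http.NewRequest("POST", "https://api.anthropic.com/v1/messages", bytes.NewReader(body))
    req.Header.Set("x-api-key", os.Getenv("ANTHROPIC_API_KEY")) // never hard-code keys
    req.Header.Set("anthropic-version", "2023-06-01")
    req.Header.Set("content-type", "application/json")

    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        fmt.Println("request failed:", err)
        return
    }
    defer resp.Body.Close()

    // Expected Response: the raw JSON you would test and iterate on.
    raw, _ := io.ReadAll(resp.Body)
    fmt.Println(string(raw))
}

Running this prints the raw JSON response, which typically includes the generated text along with token usage counts you can feed into cost tracking.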

Intermediate Techniques

System Prompt Variables & Message Pairs

  • System Prompt Variables: Wrap camelCase strings in {{...}} to create variables inside your system prompt. Useful for customization and modularity (see the sketch below).
  • Message Pairs: Commonly used for chat history, they add context to a request and maintain conversation continuity across interactions.

Note: The author prefers to systematize user requests rather than rely heavily on prompt variables. Both approaches have merit.
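
As a rough illustration of both techniques, the sketch below substitutes {{camelCase}} variables into a system prompt and assembles message pairs for chat history. The variable names, product name, and message struct are assumptions made for the example, not part of any provider's SDK.

// Go: prompt variable substitution and message pairs
package main

import (
    "fmt"
    "strings"
)

// message is the same role/content shape most chat APIs expect.
type message struct {
    Role    string
    Content string
}

// fillVariables replaces {{key}} placeholders in a system prompt.
func fillVariables(prompt string, vars map[string]string) string {
    for key, val := range vars {
        prompt = strings.ReplaceAll(prompt, "{{"+key+"}}", val)
    }
    return prompt
}

func main() {
    system := "You are a support agent for {{productName}}. Always answer in a {{tone}} tone."
    system = fillVariables(system, map[string]string{
        "productName": "Emstrata",
        "tone":        "friendly",
    })

    // Message pairs: prior user/assistant turns give the model conversation context.
    history := []message{
        {Role: "user", Content: "How do I reset my password?"},
        {Role: "assistant", Content: "Go to Settings > Security and choose Reset Password."},
        {Role: "user", Content: "And if I no longer have access to my email?"}, // the new request
    }

    fmt.Println(system)
    for _, m := range history {
        fmt.Printf("%s: %s\n", m.Role, m.Content)
    }
}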

Thoughts on Frontend/Backend

  • Security Warning: Never expose API keys on the frontend. All third‑party LLM calls should be made from the backend to protect your financial accounts. Set proper spending limits and keep keys secret (a minimal backend sketch follows this section).
  • Tech Stack (example):
    • Frontend – TypeScript + Vite
    • Mobile – Flutter
    • Backend – Go + Gin
  • AI‑Powered IDEs: Tools like Cursor dramatically boost productivity, especially for non‑coders.

The course won’t focus on code, but concrete steps for a basic frontend and backend are provided for you to iterate on.
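
As a minimal sketch of the security point above, assuming the Go + Gin stack listed, the backend below exposes a single route, reads the API key from an environment variable, and is the only place that talks to the LLM provider. The route path and the callLLM helper are hypothetical; the frontend and mobile apps would call this route instead of the provider directly.

// Go + Gin: keep LLM API keys on the backend only
package main

import (
    "net/http"
    "os"

    "github.com/gin-gonic/gin"
)

type generateRequest struct {
    Prompt string `json:"prompt"`
}

// callLLM is a hypothetical helper: it would build the provider request
// (as in the earlier sketch) using the server-side key and return the text.
func callLLM(apiKey, prompt string) (string, error) {
    return "stubbed response for: " + prompt, nil
}

func main() {
    apiKey := os.Getenv("ANTHROPIC_API_KEY") // keep keys in env vars, never in frontend code

    r := gin.Default()
    r.POST("/api/generate", func(c *gin.Context) {
        var req generateRequest
        if err := c.ShouldBindJSON(&req); err != nil {
            c.JSON(http.StatusBadRequest, gin.H{"error": "invalid request body"})
            return
        }
        out, err := callLLM(apiKey, req.Prompt)
        if err != nil {
            c.JSON(http.StatusBadGateway, gin.H{"error": "LLM call failed"})
            return
        }
        c.JSON(http.StatusOK, gin.H{"text": out})
    })

    r.Run(":8080") // frontend and mobile apps call this server, never the provider
}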

Learn While Building

  • AI can help newcomers learn coding patterns, but avoid ā€œvibe‑coding.ā€ Understand what the AI writes and why it works.
  • In Cursor, ask the agent to ā€œAdd comments explaining what each line of code does in depth. I’m new to coding.ā€ Then read those comments to deepen your understanding.

Frontend Setup

# TypeScript/Vite
npm create vite@latest my-project-name -- --template vanilla-ts
cd my-project-name
npm install
npm run dev

For detailed instructions, see setup-instructions.txt in the code repository.

Backend Setup

# Go/Gin
mkdir my-project-name
cd my-project-name
go mod init github.com/yourusername/my-project-name
go get -u github.com/gin-gonic/gin

Additional Go setup details are in setup-instructions.txt.
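
To confirm the install worked, you can drop a minimal main.go into the project; this is essentially Gin's standard quickstart, and the port is an arbitrary choice.

// Go + Gin: minimal server to verify the setup
package main

import "github.com/gin-gonic/gin"

func main() {
    r := gin.Default()                    // router with logging and recovery middleware
    r.GET("/ping", func(c *gin.Context) { // simple health-check route
        c.JSON(200, gin.H{"message": "pong"})
    })
    r.Run(":8080") // visit http://localhost:8080/ping to confirm the server responds
}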

Database Setup

# PostgreSQL & GORM
# Install PostgreSQL first, then:
createdb mydb

# In your Go project:
go get -u gorm.io/gorm
go get -u gorm.io/driver/postgres
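
To confirm the database wiring, here is a minimal GORM connection sketch; the DSN values (user, password, host) are placeholders for your local PostgreSQL install, and the Note model is just an example table.

// Go + GORM: connect to PostgreSQL and run a smoke test
package main

import (
    "fmt"
    "log"

    "gorm.io/driver/postgres"
    "gorm.io/gorm"
)

// Note is an example model; GORM maps it to a "notes" table.
type Note struct {
    ID   uint `gorm:"primaryKey"`
    Body string
}

func main() {
    // Placeholder DSN: adjust user, password, and host for your local install.
    dsn := "host=localhost user=postgres password=postgres dbname=mydb port=5432 sslmode=disable"

    db, err := gorm.Open(postgres.Open(dsn), &gorm.Config{})
    if err != nil {
        log.Fatal("failed to connect to database: ", err)
    }

    // Create the table (if missing) and insert one row as a smoke test.
    if err := db.AutoMigrate(&Note{}); err != nil {
        log.Fatal("migration failed: ", err)
    }
    db.Create(&Note{Body: "hello from GORM"})

    var count int64
    db.Model(&Note{}).Count(&count)
    fmt.Println("notes in database:", count)
}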

Mobile App Setup

# Flutter
flutter create my_project_name
cd my_project_name
flutter run

Install the Flutter SDK before running these commands.

The Main Takeaway

The purpose of this setup is to provide a ready‑to‑use environment—frontend, backend, database, and mobile—so you can focus on building. Even if you’re new to coding, you don’t need to master every line now; AI‑powered IDEs will generate code, while your role is to understand, oversee, troubleshoot, and architect the system.

Continue Learning

External Resources

  • Emstrata – Platform for immersive narrative experiences using AI‑generated storylines.
  • PLATO5 – Social engine that turns online connections into real‑world friendships with AI‑enhanced conversations.