DEV Track Spotlight: Build real-time full-stack Generative AI applications (DEV310)
Source: Dev.to
Introduction
“This is the only slide that we have,” announced Salih Gueler as he kicked off DEV310, the final session of re:Invent’s first day. What followed was pure live coding: building a complete full‑stack Generative AI application from scratch in under an hour.
Salih Gueler (Senior Developer Advocate, AWS) and Matt Goldberg (Senior Principal Engineer, AWS Developer Tools) demonstrated a modern developer workflow: building with Kiro (an agentic IDE), deploying with AWS CDK, integrating Amazon Bedrock, and leveraging the newly announced AWS MCP server.
The application was a gaming platform inspired by Salih’s personal need: “I play a lot of games and one thing that I was missing was a platform to exchange games or sell games or see what my friends played. I love Goodreads… I wanted to have a version of that for games.”
The Evolution of AI‑Assisted Development
Before diving into the demo, Salih reflected on how rapidly AI‑assisted development has evolved.
- Two years ago – Developers marveled at autocomplete creating simple functions like “two plus two” in Java.
- Last year (2024) – Developers copied and pasted entire applications from ChatGPT or Claude.
- Today (2025) – We’re living in the age of agents. AI systems can build, deploy, and iterate on applications with minimal human intervention.
“Agents are as smart as you tell them to be. If I just sit down here and say build me a game platform, they will probably build the ugliest and most unstructured app you will ever see. But if you tell more and more information, they’ll understand what you mean.”
Building the Front End with Kiro
Starting with Context
Salih began by creating a React application with Vite, but immediately addressed a common challenge: large language models have cutoff dates and may not know the latest library versions.
Solution: Model Context Protocol (MCP) servers. He added the Context7 MCP server to provide up‑to‑date documentation for Shadcn UI, which wasn’t in the LLMs’ knowledge base.
The prompt was simple but specific:
“Create a React application with Vite, use Tailwind CSS for styling, and leverage Shadcn UI for components.”
Kiro fetched the latest Shadcn UI docs via Context7 and built the project structure.
The Power of Steering Files
One of Kiro’s most valuable features is steering files—rules that define boundaries for the application. Salih’s steering files included:
- React projects: Always ensure `npm run build` runs successfully.
- TypeScript projects: Never use the `any` type.
These guardrails keep the agent aligned with best practices and catch errors early. When an error occurred, the steering file flagged it immediately, and Kiro automatically fixed it.
Iterative Development
Salih broke the development into small, manageable steps:
- Create the basic React project.
- Add CSS and UI components.
- Create TypeScript interfaces and mock data (see the sketch below).
- Build layout components (header, footer, responsive design).
- Create authentication page.
- Add AI chat functionality.
- Style and polish the UI.
Each iteration took roughly 2–3 minutes. The key insight: break prompts into small steps rather than asking the agent to do everything at once.
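To make step 3 concrete, here is a minimal sketch of what those interfaces and mock data could look like. The `Game` shape and every field name are illustrative assumptions, not the exact types generated on stage.

```typescript
// Illustrative only: the Game shape and field names are assumptions, not the demo's actual types.
export interface Game {
  id: string;
  title: string;
  platform: 'PC' | 'PlayStation' | 'Xbox' | 'Switch';
  owner: string;              // friend who owns this copy
  availableForTrade: boolean; // supports the "exchange or sell games" idea
  rating?: number;            // 1-5, Goodreads-style review score
}

export const mockGames: Game[] = [
  { id: '1', title: 'Example RPG', platform: 'PC', owner: 'salih', availableForTrade: true, rating: 5 },
  { id: '2', title: 'Example Racer', platform: 'Switch', owner: 'matt', availableForTrade: false },
];
```

Mock data like this lets the later UI steps proceed before any backend exists.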
Describing What You Want
When creating the authentication page, Salih didn’t just say “create a login page.” He provided detailed guidance:
- Center the card layout.
- Use specific colors for the signup button.
- Include email and password fields.
- Ensure responsive design.
“Everything that I explained here… is coming from what I have envisioned.”
The more specific the vision, the better the agent can execute it.
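As a rough illustration of how that level of detail translates into code, a generated login card might resemble the sketch below. It assumes the standard Shadcn UI project layout (components under `@/components/ui/...`); the markup, labels, and the purple signup color are placeholders, not the demo's actual output.

```tsx
// Illustrative sketch only; classes, labels, and colors are placeholders.
import { Button } from '@/components/ui/button';
import { Card, CardContent, CardFooter, CardHeader, CardTitle } from '@/components/ui/card';
import { Input } from '@/components/ui/input';
import { Label } from '@/components/ui/label';

export function LoginPage() {
  return (
    // Center the card both vertically and horizontally, as the prompt requested
    <div className="flex min-h-screen items-center justify-center p-4">
      <Card className="w-full max-w-sm">
        <CardHeader>
          <CardTitle>Sign in</CardTitle>
        </CardHeader>
        <CardContent className="space-y-4">
          <div className="space-y-2">
            <Label htmlFor="email">Email</Label>
            <Input id="email" type="email" placeholder="you@example.com" />
          </div>
          <div className="space-y-2">
            <Label htmlFor="password">Password</Label>
            <Input id="password" type="password" />
          </div>
        </CardContent>
        <CardFooter className="flex flex-col gap-2">
          <Button className="w-full">Sign in</Button>
          {/* The prompt asked for a specific signup color; purple is a stand-in */}
          <Button className="w-full bg-purple-600 hover:bg-purple-700">Sign up</Button>
        </CardFooter>
      </Card>
    </div>
  );
}
```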
Deploying to AWS with CDK
The MCP Server Approach
Matt Goldberg took over to demonstrate deployment using his custom MCP server. Instead of breaking deployment into many tiny steps, the MCP server returned a comprehensive prompt that guided the agent through a full deployment workflow, including a checklist:
- Make the code base agent‑friendly.
- Generate AWS CDK infrastructure.
- Deploy to AWS.
- Set up a CI/CD pipeline (for production).
Best Practice #1 – Agent‑Friendly Code Base
Matt created an `agents.md` file to give guidance to AI agents working on the project, for example:
- References to design documentation in `docs/design/component.md`.
- Git commit message guidelines.
- Skills for common tasks (like finding logs in CloudWatch).
This documentation helps agents understand project context and diagnose issues independently. When errors occur during deployment, the agent knows to check CloudWatch logs rather than guessing.
Best Practice #2 – Infrastructure as Code
Using AWS CDK ensures the agent has complete context about existing infrastructure. Without IaC, an agent might add resources via the AWS CLI, leaving the code base unaware of those resources.
The generated CDK code included the following (see the sketch below):
- S3 bucket for front‑end assets.
- CloudFront distribution for content delivery.
- Proper IAM resource policies.
- Tags for organization and cost tracking.
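A minimal sketch of such a stack, assuming TypeScript and a recent aws-cdk-lib (v2); construct IDs, tag values, and the use of origin access control are assumptions rather than the session's exact output:

```typescript
import { CfnOutput, RemovalPolicy, Stack, StackProps, Tags } from 'aws-cdk-lib';
import * as cloudfront from 'aws-cdk-lib/aws-cloudfront';
import * as origins from 'aws-cdk-lib/aws-cloudfront-origins';
import * as s3 from 'aws-cdk-lib/aws-s3';
import { Construct } from 'constructs';

export class FrontendStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // Private S3 bucket that holds the built front-end assets
    const siteBucket = new s3.Bucket(this, 'SiteBucket', {
      blockPublicAccess: s3.BlockPublicAccess.BLOCK_ALL,
      removalPolicy: RemovalPolicy.DESTROY,
      autoDeleteObjects: true,
    });

    // CloudFront distribution in front of the bucket; origin access control
    // generates the bucket resource policy that lets only CloudFront read objects
    const distribution = new cloudfront.Distribution(this, 'SiteDistribution', {
      defaultRootObject: 'index.html',
      defaultBehavior: {
        origin: origins.S3BucketOrigin.withOriginAccessControl(siteBucket),
        viewerProtocolPolicy: cloudfront.ViewerProtocolPolicy.REDIRECT_TO_HTTPS,
      },
    });

    // Tags for organization and cost tracking (values are placeholders)
    Tags.of(this).add('project', 'game-platform');
    Tags.of(this).add('environment', 'preview');

    new CfnOutput(this, 'SiteUrl', { value: `https://${distribution.distributionDomainName}` });
  }
}
```

In this sketch the origin access control covers the "proper IAM resource policies" item: the bucket stays private, and only the distribution is allowed to read from it.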
Best Practice #3 – Deterministic Deployment Scripts
Matt created a `deploy.sh` script to guarantee consistent deployments across environments. A simplified version:

```bash
#!/usr/bin/env bash
if [ "$ENVIRONMENT" = "preview" ]; then
  # Use hot-swap fallback for faster local development
  cdk deploy --hotswap-fallback
else
  # Use full CloudFormation deployment for production
  cdk deploy
fi
```
- Hot‑swap fallback pushes changes such as Lambda function code and S3 assets directly, without waiting for a full CloudFormation deployment, which makes it ideal for rapid local iteration.
- For production, a full CloudFormation deployment provides consistency and correctness.
By encapsulating deployment logic in a script, the agent calls the script rather than constructing ad‑hoc commands, ensuring deterministic behavior.
Best Practice #4 – CI/CD Pipelines
Although not demonstrated due to time constraints, Matt emphasized the importance of CI/CD pipelines when working with agents:
- Automated testing to catch bugs.
- Security scanning for vulnerabilities.
- Cost analysis to prevent runaway expenses.
- Code review requirements.
These safeguards ensure that even if an agent generates problematic code, it won’t reach production.
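CDK Pipelines is one natural way to add those safeguards to a CDK project like this one; the sketch below is hypothetical and was not part of the session. The repository name, branch, and commands are placeholders, and `CodePipelineSource.gitHub` assumes a GitHub token is already stored in Secrets Manager.

```typescript
import { Stack, StackProps } from 'aws-cdk-lib';
import { CodePipeline, CodePipelineSource, ShellStep } from 'aws-cdk-lib/pipelines';
import { Construct } from 'constructs';

export class PipelineStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    new CodePipeline(this, 'Pipeline', {
      synth: new ShellStep('Synth', {
        // Repository and branch are placeholders
        input: CodePipelineSource.gitHub('my-org/game-platform', 'main'),
        commands: [
          'npm ci',
          'npm run build', // same guardrail as the steering file: the build must pass
          'npm test',      // automated testing before anything is deployed
          'npx cdk synth',
        ],
      }),
    });
  }
}
```

Application stages, security scans, and manual approval steps would then be added to the pipeline before anything reaches the production account.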
Integrating Amazon Bedrock
With the application deployed, Matt integrated Amazon Bedrock to power the AI chat functionality. He used a prompt (generated by another agent) that included a detailed checklist:
- Add Bedrock SDK to the project.
- Create a Lambda function for Bedrock integration.
- Update CDK infrastructure.
- Connect front‑end to the new API.
- Test the integration.
The agent worked through each step, updating the CDK stack to include a new Lambda function with proper IAM permissions, wiring it to the front‑end, and verifying the chat feature.
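The handler itself wasn't shown line by line, but a minimal sketch of such a Lambda function in TypeScript might look like this, assuming the AWS SDK v3 Bedrock Runtime client and an API Gateway proxy integration; the model ID and the request/response shape are assumptions.

```typescript
import { BedrockRuntimeClient, ConverseCommand } from '@aws-sdk/client-bedrock-runtime';
import type { APIGatewayProxyHandler } from 'aws-lambda';

const client = new BedrockRuntimeClient({});

// Placeholder model ID; any text model enabled for the account in Bedrock works
const MODEL_ID = 'anthropic.claude-3-haiku-20240307-v1:0';

export const handler: APIGatewayProxyHandler = async (event) => {
  const { message = '' } = JSON.parse(event.body ?? '{}');

  // Forward the user's chat message to Bedrock via the Converse API
  const response = await client.send(
    new ConverseCommand({
      modelId: MODEL_ID,
      messages: [{ role: 'user', content: [{ text: message }] }],
    }),
  );

  const reply = response.output?.message?.content?.[0]?.text ?? '';
  return { statusCode: 200, body: JSON.stringify({ reply }) };
};
```

The "proper IAM permissions" piece amounts to granting this function `bedrock:InvokeModel` on the chosen model, which CDK can express with a single policy statement on the Lambda's role.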