From Prompts to Action: My Journey Through the Google & Kaggle AI Agents Bootcamp

Published: December 14, 2025 at 09:03 PM EST
5 min read
Source: Dev.to

This is a submission for the Google AI Agents Writing Challenge: Learning Reflections

It turns out, if you can write a Python function, you can build an agent. Below is my deep dive into the code, concepts, and tools that made this journey accessible, featuring my capstone project: Jarbest.

The Awakening: Hello, Agent

Coming from a non‑developer background, I always imagined software as a “bricklayer”—rigidly following a blueprint. Day 1 introduced me to the Agent: a system that acts more like a film director. It doesn’t just predict text; it has a Brain (the model), Hands (tools), and a Nervous System (orchestration) to autonomously perceive, reason, and act.

An agent operates in a continuous loop—Mission → Scan → Think → Act → Observe—constantly adapting its plan to solve problems. This framework demystified the magic: I wasn’t just coding a chatbot; I was building a system with agency to execute multi‑step missions.
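
Stripped of jargon, that loop is just a while loop wrapped around a model and its tools. Here is a conceptual sketch; the model, tools, and mission objects are illustrative placeholders, not actual ADK classes:

# Conceptual sketch of the agent loop: Mission -> Scan -> Think -> Act -> Observe.
# `model`, `tools`, and `mission` are illustrative placeholders, not real ADK objects.
def run_agent(model, tools, mission, max_steps=10):
    observations = []  # everything the agent has "seen" so far
    for _ in range(max_steps):
        context = {"mission": mission, "observations": observations}  # Scan
        decision = model.decide(context)  # Think: pick a tool call or finish
        if decision.is_done:
            return decision.answer
        result = tools[decision.tool_name](**decision.args)  # Act
        observations.append(result)  # Observe, then loop and re-plan
    return "Stopped: step limit reached before the mission was complete."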

The “Aha!” Moment: It’s Just Python & the “USB Port” for AI

Day 2 was a revelation. Models are just “Brains”—pattern predictors that cannot see or act. To be useful, they need Tools: the “Eyes” and “Hands” that let them fetch data or execute actions.

Connecting every tool to every model quickly becomes a nightmare (the N × M problem). Enter the Model Context Protocol (MCP).

Think of MCP as the USB port for AI. Before USB you needed a specific cable for every device. MCP lets you plug any tool into any agent using a standard connection.

The Code: Giving the Agent “Hands”

In my project, Jarbest (an accessible personal companion), I needed an agent that could check bank account balances. Instead of writing a custom connector, I used MCP to “plug in” a secure banking server.

# Finance Agent: Manages the bank's transactions via an MCP toolset.
# Import paths follow the Google ADK Python package layout; they may vary
# slightly between ADK versions.
from google.adk.agents import Agent
from google.adk.tools.mcp_tool.mcp_toolset import MCPToolset, StreamableHTTPConnectionParams

finance_agent = Agent(
    name="finance_agent",
    description="An agent that can help with banking operations like checking balances...",
    # This toolset connects to a secure internal banking server
    tools=[
        MCPToolset(
            connection_params=StreamableHTTPConnectionParams(
                url=f"{BANK_MCP_URL.rstrip('/')}/mcp",  # BANK_MCP_URL is defined in the project's config
            )
        )
    ],
)

Why this matters (and the danger)

The agent reads these tool definitions and knows exactly when to use them. If a user asks “Can I afford this pizza?”, it knows it must first call check_balance.
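
Where does that “knowing” come from? From the tool’s own description. Here is a hypothetical check_balance tool to show the idea; the name, fields, and hard‑coded value are illustrative, not the real banking server’s API:

# Hypothetical tool (illustrative only). The docstring is what the agent reads
# when deciding whether this tool is relevant to the user's request.
def check_balance(account_id: str) -> dict:
    """Return the current available balance for the given bank account.

    Call this before any purchase decision to confirm the user can afford it.
    """
    # The real MCP banking server would query the actual banking backend here.
    return {"account_id": account_id, "available_balance_usd": 5.00}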

However, the notes warned us: Using MCP is like plugging in a random USB drive found on the street. It could be a legitimate tool, or a “Tool Shadow” (a malicious copy). That’s why in Jarbest I implemented a strict Application‑Layer Gateway (via hard‑coded allowlists)—ensuring the agent can only connect to my specific, internal MCP banking server, preventing it from ever “plugging in” to an untrusted source.
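
The gateway itself is not exotic. A minimal sketch of the allowlist idea, with an illustrative URL rather than Jarbest’s real configuration:

# Minimal allowlist sketch (the URL is illustrative, not Jarbest's real config).
ALLOWED_MCP_SERVERS = {
    "https://bank.internal.example.com",  # the one trusted banking server
}

def safe_mcp_url(url: str) -> str:
    """Refuse to connect to any MCP server that is not explicitly allowlisted."""
    base = url.rstrip("/").removesuffix("/mcp")
    if base not in ALLOWED_MCP_SERVERS:
        raise ValueError(f"Blocked untrusted MCP server: {url}")
    return url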

Deep Dive: The Brain (Memory)

Day 3 was where things got sophisticated. A chatbot forgets you the moment you close the tab. An agent remembers.

For Jarbest, which is designed for elderly users who value consistency, memory is critical. If “Grandma Jane” asks for her “usual order,” the agent shouldn’t ask “What is that?”; it should know.

Here’s how I implemented the “Brain” in my root agent:

# Root agent: the user-facing Jarbest agent, with long-term memory wired in.
# Import paths may vary slightly between ADK versions.
from google.adk.agents import Agent
from google.adk.tools import load_memory  # built-in ADK tool for recalling stored memories

root_agent = Agent(
    name='root_agent',
    instruction="""
    You are Jarbest...
    Memory: Use the load_memory tool to recall past conversations and preferences
    (e.g., "ordering the usual").
    """,
    tools=[load_memory],  # <--- This single line gives the agent a "brain"
    after_agent_callback=auto_save_to_memory,  # project callback that auto-saves every interaction
)

The Non‑Developer Perspective

Think of load_memory like giving the agent a filing cabinet. When Grandma Jane says “Order me some food,” the agent thinks: “I need to check if she has a preference,” opens the cabinet (load_memory), finds “Likes Large Pepperoni Pizza,” and acts on it. Watching this thought process in real‑time was mind‑blowing.
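
One wiring detail that tripped me up: load_memory only finds something if the runner has a memory service attached. A minimal sketch using ADK’s in‑memory services (a production build would swap in a persistent backend, and the app name here is just a placeholder):

# Wiring sketch: load_memory needs a memory service on the Runner to read from.
# Uses ADK's in-memory services; a real deployment would use persistent ones.
from google.adk.runners import Runner
from google.adk.sessions import InMemorySessionService
from google.adk.memory import InMemoryMemoryService

runner = Runner(
    agent=root_agent,
    app_name="jarbest",  # placeholder app name
    session_service=InMemorySessionService(),
    memory_service=InMemoryMemoryService(),  # the "filing cabinet" backend
)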

The “Squeeze”: Debugging the Black Box

Day 4 taught us that “it works” isn’t enough. You need to know why it works. When building a safety‑focused agent like Jarbest, I couldn’t afford “hallucinations.”

Exploring the Agent Observability labs, I learned to trace the agent’s reasoning steps. When my agent refused to order a pizza, I could look at the trace and see:

  • User: “Order a pizza.”
  • Tool Call: check_balance → returned $5.00.
  • Reasoning: “Pizza costs $20. User has $5. Result: Unsafe.”
  • Response: “I cannot complete this order because your balance is too low.”

Seeing that raw reasoning log felt like looking into the matrix. It transformed the LLM from a mysterious oracle into a logical, debuggable software component. I realized I wasn’t just “prompting” anymore; I was engineering logic.
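
You do not need a full tracing stack to get that visibility. A lightweight way to surface the same information is a pair of tool callbacks that log every call and its result; this is a sketch in the style of ADK’s callback hooks, and the exact signatures may differ between versions:

# Lightweight tracing sketch: log every tool call and its result.
# Written in the style of ADK's tool callbacks; signatures may vary by version.
import logging

from google.adk.agents import Agent

logger = logging.getLogger("jarbest.trace")

def log_tool_call(tool, args, tool_context):
    logger.info("Tool call: %s(%s)", tool.name, args)
    return None  # returning None lets the real tool run unchanged

def log_tool_result(tool, args, tool_context, tool_response):
    logger.info("Tool result from %s: %s", tool.name, tool_response)
    return None  # returning None keeps the original response

observed_agent = Agent(
    name="observed_finance_agent",  # illustrative wrapper, not the real agent
    description="Finance agent variant with tool-call logging enabled.",
    before_tool_callback=log_tool_call,
    after_tool_callback=log_tool_result,
)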

The Ecosystem: Agents Talking to Agents (A2A)

Day 5 introduced the Agent‑to‑Agent (A2A) Protocol. This is where I moved from building a single assistant to building a team.

My “Purchaser Agent” doesn’t know how to make pizza. Instead, it connects to a completely separate “Pizza Shop Agent” (simulating a third‑party vendor).

# Creating a client-side proxy for a remote agent over the A2A protocol.
# Import paths may vary slightly between ADK versions.
from google.adk.agents import Agent
from google.adk.agents.remote_a2a_agent import RemoteA2aAgent
from google.adk.tools.agent_tool import AgentTool

pizza_agent_proxy = RemoteA2aAgent(
    name="pizza_agent",
    # The "Agent Card" acts like a business card for discovery
    agent_card="http://localhost:10000/.well-known/agent-card.json",
    description="Remote pizza agent from external vendor...",
)

purchaser_agent = Agent(
    name="purchaser_agent",
    instruction="Your goal is to help the user find and buy items.",
    tools=[AgentTool(pizza_agent_proxy)],  # <--- Treating another agent as a tool
)

The Cool Idea

The “Agent Card” isn’t just a technical manifest; it’s a completely new way for businesses to interact.

  • For SMBs (Small to Medium Businesses): Instead of constantly maintaining and documenting complex APIs for developers, you can expose an Agent Card that any compatible AI agent can discover and use as a plug‑and‑play service.
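
To make that concrete, here is roughly what the discovery document served at /.well-known/agent-card.json contains. This is an illustrative, trimmed example expressed as a Python dict, not the actual vendor’s card:

# Illustrative, trimmed agent card (not the real vendor's file): the document
# served at /.well-known/agent-card.json that other agents use for discovery.
pizza_agent_card = {
    "name": "pizza_agent",
    "description": "Takes and tracks pizza orders for an external vendor.",
    "url": "http://localhost:10000",
    "version": "1.0.0",
    "capabilities": {"streaming": False},
    "skills": [
        {
            "id": "order_pizza",
            "name": "Order pizza",
            "description": "Place a pizza order given a size and toppings.",
            "tags": ["food", "ordering"],
        }
    ],
}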