Advent of AI - Day 12: MCP Sampling
Source: Dev.to
I’ve edited this post, but AI helped. These are meant to be quick posts for the Advent of AI.
If I’m doing one of these each day, I don’t have time to spend a couple of hours on each post.
The Advent of AI series leverages Goose, an open‑source AI agent.
If you’ve never heard of it, check it out!
Goose – an open‑source, extensible AI agent that goes beyond code suggestions.
Install, execute, edit, and test with any LLM.
Goose
Your local, extensible, open‑source AI agent that automates engineering tasks
Goose is an on‑machine AI agent that can automate complex development tasks from start to finish.
Beyond simple code suggestions, Goose can:
- Create entire projects from scratch
- Write, run, and debug code automatically
- Orchestrate workflows and pipelines
- Interact with external APIs autonomously
Whether you’re prototyping, refactoring, or managing large engineering pipelines, Goose adapts to your workflow and executes tasks with precision.
Key Features
- LLM‑agnostic – works with any large language model and supports multi‑model configurations for optimal performance and cost.
- MCP integration – seamless connection to MCP servers.
- Cross‑platform – available as a desktop application and a CLI tool.
- Extensible – plug‑in architecture lets you add custom capabilities.
Get Started
# Install the CLI (example for macOS/Linux)
curl -fsSL https://example.com/goose/install.sh | bash
# Verify installation
goose --version
Visit the documentation for detailed setup instructions, configuration guides, and API references.
Goose – the ultimate AI assistant for developers who want to move faster and focus on innovation.
The Challenge: Democracy by AI
Day 12 threw a Winter Festival mascot crisis at me. The Festival Committee spent three hours arguing about whether their mascot should be a snowman, penguin, polar bear, ice fairy, or yeti. Classic committee deadlock.
The challenge was to use the Council of Mine extension to get nine AI personalities to debate and vote on the decision. This teaches you about MCP sampling, which I’ll explain next.
What Actually Happened
I fired up Goose and tried to use Council of Mine. Even though the extension was installed and enabled, Goose couldn't see it. I had to disable and re-enable the extension several times before it finally appeared. Once it was working, I ran multiple debates:
- Mascot choice – spoiler: the yeti won
- Mascot name – we went with “Yuki”
- Origin story – monks in the mountains
- Personality traits – jovial, powerful, and wise
- Festival duration – expanded to five days
The council gave genuinely different perspectives each time. The Devil’s Advocate consistently pushed back on popular choices. The Systems Thinker worried about scalability. The Optimist found upsides in everything.
The MCP Sampling Thing
Here’s what makes this interesting beyond “AI voting on stuff.” Council of Mine doesn’t have its own LLM; it uses MCP sampling to ask the AI you’re already connected to for help.
Angie Jones (@techgirl1908) recently wrote about MCP sampling: "MCP Sampling: When Your Tools Need to Think" (Block Open Source, Dec 9). Check it out!
Normal vs. MCP Sampling Flow
| Flow | Description |
|---|---|
| Normal MCP | You talk to Goose → Goose calls an MCP tool → The tool returns data. |
| MCP Sampling | You talk to Goose → Goose calls an MCP tool → The tool asks Goose’s LLM for help → The tool processes that response and returns it. |
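To make the second flow concrete, here's a sketch of what a sampling request looks like on the wire. The `sampling/createMessage` method and its field names come from the MCP specification; the prompt text is made up for illustration.

```typescript
// Sketch of an MCP sampling request (JSON-RPC params), following the
// MCP specification's sampling/createMessage method. The server sends
// this to the client, which forwards it to whatever LLM the user has
// configured — the server never talks to a model directly.
const samplingRequest = {
  method: "sampling/createMessage",
  params: {
    messages: [
      {
        role: "user",
        content: { type: "text", text: "Argue for the yeti as festival mascot." },
      },
    ],
    // Each council member gets its own system prompt like this one.
    systemPrompt: "You are the Devil's Advocate on a nine-member council.",
    maxTokens: 500,
  },
};

console.log(JSON.stringify(samplingRequest, null, 2));
```

The key point: the request carries no API key and no model name. The client decides which model answers.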
The Council of Mine extension defines nine personality system prompts. When you start a debate it:
- Makes nine separate sampling calls (one per council member) with each personality prepended to the prompt.
- Makes another nine calls for voting.
- Makes one final call to synthesize the results.
Total: 19 LLM calls per debate, all routed through whatever AI model you have configured in Goose.
Why This Matters
Sampling lets MCP servers be intelligent without managing their own API keys, model selection, or LLM infrastructure. The server becomes an orchestrator, not an AI application.
Possible Use‑Cases
- Code review with multiple expert viewpoints.
- Documentation analyzer that explains concepts at different user levels.
- Search tool that intelligently filters and ranks results.
- Debate simulator for product decisions (the very example used here).
The Council of Mine repo includes examples of:
- Structuring sampling requests.
- Handling different LLM response formats.
- Protecting against prompt injection.
If you’re building MCP servers, it’s worth studying how they manage the nine distinct personalities.
Build Your Own MCP Server with Sampling
Building your own MCP server with sampling was a bonus challenge. I didn't get around to it due to time constraints, but I opened an issue in my TypeScript MCP template repo a couple of weeks ago.
Issue #55 – Feature: Implement MCP Sampling Support (opened by nickytonline on Dec 04, 2025)
Implement a bidirectional sampling pattern that allows MCP servers to request AI‑generated content from connected clients for intelligent judgment tasks like summarization, sentiment analysis, and content generation.
References
- MCP Sampling Blog Post by Angie Jones
- MCP TypeScript SDK
- MCP Sampling Specification
- `Server.createMessage()` documentation
The Takeaway
MCP sampling opens a new category of tools. You’re no longer just exposing data—you’re orchestrating intelligent behavior using whatever AI the user brings.
Tip: Extensions still have rough edges. Keep the disable/enable trick in your back pocket.
Want to try it yourself?
- Council of Mine docs:
- MCP sampling guide:
- Advent of AI challenge (Challenge 12):
Stay in touch via my socials at nickyt.online.
Photo by Laura Peruchi on Unsplash.