Giving Your AI the Right Context with Model Context Protocol (MCP)
Source: Dev.to
Introduction
Nowadays, pretty much everyone works with AI one way or another—whether it’s writing code, debugging, or designing infrastructure. Large language models (LLMs) have pushed our productivity to a new level, but we can make them even more efficient by giving them the right context. That’s where the Model Context Protocol (MCP) comes in.
What is the Model Context Protocol?
MCP, designed by Anthropic, defines a standard way for AI models to discover and call external tools. Your application becomes an MCP server that advertises its capabilities (search, fetch, create, etc.). The model can then invoke those tools to obtain the exact context it needs, eliminating manual copy‑pasting of data schemas or JSON blobs.
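Concretely, tool discovery happens over JSON-RPC. When a client connects, it sends a tools/list request (message shape per the MCP specification; the tool shown is the one built later in this article):

```json
{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
```

The server replies with its tool definitions, which is all the model needs to decide when and how to call them:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "search_books",
        "description": "Search the library catalog. Returns matching books.",
        "inputSchema": {
          "type": "object",
          "properties": {
            "author": {"type": "string", "description": "Filter by author name."}
          }
        }
      }
    ]
  }
}
```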
Building a Simple MCP Server in Go
Below is a minimal example of an MCP server that provides a library catalog with two tools:
- search_books – search the catalog.
- get_book – retrieve full details for a book by its ID.
MCP Server Configuration (.mcp.json)
{
  "mcpServers": {
    "library": {
      "command": "./mcp-library",
      "args": []
    }
  }
}
Place this file in the root of your workspace. It tells the AI client (e.g., Claude Code) how to start the MCP server.
Server Implementation (Go)
package main

import (
	"context"
	"encoding/json"
	"log"

	"github.com/anthropic/mcp-go"
)

func main() {
	server := mcp.NewServer(&mcp.Implementation{
		Name:    "library-mcp",
		Version: "1.0.0",
	}, nil)

	// Tool: search_books
	server.AddTool(
		&mcp.Tool{
			Name:        "search_books",
			Description: "Search the library catalog. Returns matching books.",
			InputSchema: json.RawMessage(`{
				"type": "object",
				"properties": {
					"title": {"type": "string", "description": "Filter by book title."},
					"author": {"type": "string", "description": "Filter by author name."}
				}
			}`),
		},
		func(ctx context.Context, req *mcp.CallToolRequest) (*mcp.CallToolResult, error) {
			args := parseArgs(req)
			result, err := searchBooks(ctx, str(args, "title"), str(args, "author"))
			return textResult(result, err)
		},
	)

	// Tool: get_book
	server.AddTool(
		&mcp.Tool{
			Name:        "get_book",
			Description: "Get full details for a book by its ID, including loan history.",
			InputSchema: json.RawMessage(`{
				"type": "object",
				"properties": {
					"id": {"type": "string", "description": "The book ID."}
				},
				"required": ["id"]
			}`),
		},
		func(ctx context.Context, req *mcp.CallToolRequest) (*mcp.CallToolResult, error) {
			args := parseArgs(req)
			result, err := getBook(ctx, str(args, "id"))
			return textResult(result, err)
		},
	)

	if err := server.Run(context.Background(), &mcp.StdioTransport{}); err != nil {
		log.Fatal(err)
	}
}
Each tool declares its name, description, and input schema—the information the model sees. When the model needs to find books by Tolkien, it calls search_books with {"author":"Tolkien"}; the server performs the lookup and returns the results.
Using the Server with Claude Code
- Run the MCP server (./mcp-library).
- Open Claude Code in the same project directory.
- Ask a question, e.g., “Are there any books by Tolkien?”
Claude detects the search_books tool, invokes it with the appropriate filter, and returns the matching books—no extra prompting required.
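Under the hood, that invocation is a tools/call JSON-RPC message (shape per the MCP specification; the id value is illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "search_books",
    "arguments": {"author": "Tolkien"}
  }
}
```

The server's response carries the tool output as content, which the model reads as additional context before answering.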
Extending the Example
The demo uses hard‑coded data for simplicity. To make it production‑ready, replace the mock functions (searchBooks, getBook) with real database queries or API calls. The same pattern works for any backend you need to expose to an LLM.
Conclusion
MCP bridges the gap between what a model can do and what it knows about your specific context. By exposing your data and operations through a standard protocol, you let the model explore your system directly, saving time and reducing friction. The protocol is open, SDKs exist for multiple languages, and integration with tools like Claude Code is already available.
If you’re building anything where an LLM could benefit from knowing your data—and that’s most things these days—MCP is worth exploring.
The full source code for this example is available here.