Building an MCP Server in Python
Author: Venkata Sudhakar
The Model Context Protocol (MCP) is an open standard that lets AI assistants connect to external tools and data sources through a common interface. Instead of each AI application building custom integrations for every tool, MCP defines a universal protocol: an MCP server exposes tools and resources, and any MCP client (Claude Desktop, Cursor, LangChain, and others) can connect to it and use those tools without any custom glue code. Building an MCP server means your tools become immediately available to any AI that speaks MCP.

The Python mcp library (from Anthropic) makes building a server straightforward. You create a FastMCP server instance, decorate Python functions with @mcp.tool() to expose them as callable tools, and optionally decorate data-fetching functions with @mcp.resource() to expose structured data. The server communicates over stdio (for local tools) or SSE/HTTP (for remote servers). When an AI client calls your tool, MCP handles the JSON serialisation, argument validation, and error propagation automatically.

The example below builds a complete MCP server for a data migration assistant that exposes three tools: row count comparison, CDC lag check, and schema validation.
Adding a resource and running the server
Configuring in Claude Desktop (claude_desktop_config.json)
{
  "mcpServers": {
    "migration-assistant": {
      "command": "python",
      "args": ["/path/to/migration_server.py"]
    }
  }
}
# After restarting Claude Desktop, the tools appear automatically:
# - compare_row_counts
# - check_cdc_lag
# - validate_schema
# - migration://status resource
# Claude can now call these tools in conversation:
# User: "Are we ready to cut over the orders table?"
# Claude: [calls compare_row_counts("orders")] -> 3 rows missing, not ready
# Claude: [calls check_cdc_lag("migration-cdc")] -> 2s lag, still catching up
# Claude: "The orders table has 3 missing rows and CDC is still catching up.
# Wait for both to resolve before cutover."
MCP transforms your Python functions into universally usable AI tools with minimal code. The same server works with Claude Desktop, Cursor, any LangChain MCP client, and any other MCP-compatible host without modification. For production deployments, run the server over an HTTP-based transport such as SSE (transport="sse") instead of stdio, so it persists as a standalone service rather than being spawned as a per-session subprocess, and add authentication so that only authorised clients can reach your tools.