
BigQuery MCP Server - Querying Data with ADK Agents

Author: Venkata Sudhakar

BigQuery is Google's fully managed, serverless data warehouse. In this tutorial we build a custom MCP server that wraps BigQuery using FastMCP, then connect an ADK agent to it so the agent can answer natural-language questions about warehouse data by calling the server's tools.

Architecture Overview

The MCP server exposes BigQuery operations as tools. The ADK agent calls those tools using MCPToolset over stdio transport. This keeps the BigQuery credentials and client code inside the server, while the agent focuses purely on reasoning.
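Under the hood, the stdio transport is newline-delimited JSON-RPC 2.0 exchanged over the subprocess's stdin and stdout. Simplified (ids abbreviated and the initial handshake omitted), the agent-to-server traffic looks roughly like this:

```json
{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
{"jsonrpc": "2.0", "id": 2, "method": "tools/call",
 "params": {"name": "get_sales_summary",
            "arguments": {"start_date": "2025-01-01", "end_date": "2025-01-31"}}}
```

MCPToolset handles this exchange for you: it calls tools/list at startup to discover the tools, and issues tools/call whenever the agent decides to use one.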

# bigquery_mcp_server.py
from mcp.server.fastmcp import FastMCP
from google.cloud import bigquery
import os

mcp = FastMCP("shopmax-bigquery")
client = bigquery.Client(project=os.environ["GCP_PROJECT_ID"])

@mcp.tool()
def get_sales_summary(start_date: str, end_date: str) -> dict:
    """Return order count and total revenue grouped by city and product category."""
    sql = """
        SELECT city, product_category,
               COUNT(*) AS orders,
               SUM(amount_rs) AS total_rs
        FROM `shopmax.orders`
        WHERE order_date BETWEEN @start AND @end
        GROUP BY city, product_category
        ORDER BY total_rs DESC
        LIMIT 20
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("start", "DATE", start_date),
            bigquery.ScalarQueryParameter("end", "DATE", end_date),
        ]
    )
    rows = client.query(sql, job_config=job_config).result()
    return {"rows": [dict(row) for row in rows]}


@mcp.tool()
def get_top_products(limit: int = 10) -> dict:
    """Return the top N products by total units sold."""
    sql = """
        SELECT product_name, SUM(quantity) AS units_sold
        FROM `shopmax.order_items`
        GROUP BY product_name
        ORDER BY units_sold DESC
        LIMIT @limit
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("limit", "INT64", limit),
        ]
    )
    rows = client.query(sql, job_config=job_config).result()
    return {"products": [dict(row) for row in rows]}


if __name__ == "__main__":
    mcp.run(transport="stdio")

ADK Agent Connecting to the MCP Server

# agent.py
import asyncio
from google.adk.agents import LlmAgent
from google.adk.tools.mcp_tool.mcp_toolset import MCPToolset, StdioServerParameters
from google.adk.sessions import InMemorySessionService
from google.adk.runners import Runner
from google.genai import types

APP_NAME = "bigquery-agent"
USER_ID = "analyst-1"
SESSION_ID = "session-1"

async def main():
    tools, exit_stack = await MCPToolset.from_server(
        connection_params=StdioServerParameters(
            command="python",
            args=["bigquery_mcp_server.py"],
        )
    )

    agent = LlmAgent(
        model="gemini-2.0-flash",
        name="bigquery_analyst",  # ADK agent names must be valid identifiers (no spaces)
        instruction="You are a data analyst. Use the available tools to answer questions about sales data.",
        tools=tools,
    )

    session_service = InMemorySessionService()
    session = await session_service.create_session(
        app_name=APP_NAME, user_id=USER_ID, session_id=SESSION_ID
    )

    runner = Runner(
        agent=agent,
        app_name=APP_NAME,
        session_service=session_service,
    )

    query = "What are the top 5 cities by revenue for January 2025?"
    content = types.Content(role="user", parts=[types.Part(text=query)])

    async for event in runner.run_async(
        user_id=USER_ID, session_id=SESSION_ID, new_message=content
    ):
        if event.is_final_response():
            print(event.content.parts[0].text)

    await exit_stack.aclose()

asyncio.run(main())

Running the Example

pip install google-adk google-cloud-bigquery mcp
export GCP_PROJECT_ID=your-project-id
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json
python agent.py

Key Points

  • The MCP server runs as a subprocess via stdio - no network port needed
  • Parameterized queries via ScalarQueryParameter prevent SQL injection
  • ADK discovers the tools automatically from the server at startup
  • The BigQuery client credentials stay inside the server process
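The injection point is easy to see with a quick sketch. The helper below is hypothetical and deliberately unsafe: it splices user input directly into the SQL text, which is exactly what ScalarQueryParameter avoids, since parameter values travel alongside the query as typed data rather than as part of the query string.

```python
# Hypothetical, deliberately unsafe helper: splices user input into SQL text.
def unsafe_sales_sql(start_date: str) -> str:
    return f"SELECT * FROM `shopmax.orders` WHERE order_date >= '{start_date}'"

# A malicious "date" breaks out of the string literal and widens the filter.
malicious = "2025-01-01' OR '1'='1"
print(unsafe_sales_sql(malicious))
# With ScalarQueryParameter("start", "DATE", value), BigQuery parses the query
# once and binds the value separately, so input can never alter the SQL.
```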
