ADK Agent Triggering Databricks Workflows
Author: Venkata Sudhakar
Databricks Workflows are scheduled or on-demand orchestration pipelines that run notebooks, Python scripts, and dbt projects against Delta Lake data. Rather than requiring engineers to trigger these jobs manually from the Databricks UI, an ADK agent can accept a plain-language instruction and fire the appropriate job via the Databricks Jobs REST API, then poll for completion and report the result.
ShopMax India runs nightly ETL jobs, weekly sales reports, and on-demand inventory reconciliation workflows in Databricks. The operations team can now ask an ADK agent to trigger any of these jobs in natural language, monitor progress, and receive a summary when the run completes - without needing access to the Databricks workspace UI.
The example below shows how to create ADK tools that trigger and monitor Databricks Jobs for ShopMax India's operations workflows.
It gives the following output:
Operations agent ready.
Registered jobs: ['inventory_reconciliation', 'weekly_sales_report',
'customer_data_refresh', 'demand_forecast_retrain']
FunctionTools registered: trigger_job, check_job_status
Databricks Jobs API: https://adb-1234567890.azuredatabricks.net/api/2.1/jobs
The example below shows a ShopMax India operations manager triggering an inventory reconciliation job and checking its status through the ADK agent interface.
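To report status back in the style shown below, the agent needs to turn a raw `runs/get` response into a short human-readable summary. A minimal sketch of such a helper (the field names follow the Jobs 2.1 API; the message wording is illustrative, not ShopMax India's actual format):

```python
def summarize_run(run: dict) -> str:
    """Condense a Databricks runs/get response into a one-line status.

    Expects the Jobs 2.1 shape: run["state"]["life_cycle_state"], plus
    result_state once the run has terminated.
    """
    run_id = run["run_id"]
    state = run.get("state", {})
    life_cycle = state.get("life_cycle_state", "UNKNOWN")
    if life_cycle == "TERMINATED":
        result = state.get("result_state", "UNKNOWN")
        return f"Job run {run_id} finished with result {result}."
    message = state.get("state_message", "")
    summary = f"Job run {run_id} status: {life_cycle}"
    if message:
        summary += f" ({message})"
    return summary
```

The agent's LLM layer can then expand this one-liner into the conversational replies shown in the transcript.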
It gives the following output:
Manager: Trigger the inventory reconciliation job for the Mumbai warehouse.
Agent: Triggered job "inventory_reconciliation" with parameter city=Mumbai.
Run ID: 77841 | Status: TRIGGERED
You can monitor progress by asking me to check run 77841.
Manager: Check the status of run ID 77841.
Agent: Job run 77841 status:
Life cycle state: RUNNING
Progress: Notebook task 2 of 3 executing (warehouse stock validation)
No errors detected so far.
Manager: Run the weekly sales report job with city parameter set to Bangalore.
Agent: Triggered job "weekly_sales_report" with parameter city=Bangalore.
Run ID: 77855 | Status: TRIGGERED
Expected completion: ~12 minutes based on recent run history.
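Rather than making the manager ask repeatedly, the agent can also poll `runs/get` until the run reaches a terminal life-cycle state and only then report the result. A minimal polling sketch; the fetch function is injected so the loop can be exercised without a live workspace, and the terminal-state names follow the Jobs 2.1 API:

```python
import time
from typing import Callable

# Life-cycle states after which a Databricks run will not change again.
TERMINAL_STATES = {"TERMINATED", "SKIPPED", "INTERNAL_ERROR"}


def wait_for_completion(
    run_id: int,
    fetch_state: Callable[[int], dict],
    poll_seconds: float = 30.0,
    max_polls: int = 60,
) -> dict:
    """Poll a run's state until it reaches a terminal life-cycle state.

    fetch_state is any callable returning the run's state dict, e.g. a
    thin wrapper around GET /api/2.1/jobs/runs/get?run_id=<id>.
    """
    for _ in range(max_polls):
        state = fetch_state(run_id)
        if state.get("life_cycle_state") in TERMINAL_STATES:
            return state
        time.sleep(poll_seconds)
    raise TimeoutError(f"Run {run_id} still active after {max_polls} polls")
```

Exposing this as a third tool would let the agent block on long-running jobs; keeping the poll interval around 30 seconds stays well within typical Databricks API rate limits.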
ADK agents as Databricks Workflow orchestrators give ShopMax India operations teams a natural language control plane for their data pipelines. Engineers retain full control over job definitions in Databricks while business users trigger and monitor runs without needing workspace access - bridging the gap between data engineering and day-to-day operations management.