Vertex AI Pipelines with ADK
Author: Venkata Sudhakar
Vertex AI Pipelines is a managed workflow orchestration service that runs sequences of steps as a directed acyclic graph (DAG). Combining it with ADK lets you build automated pipelines where agent reasoning is one step in a larger data processing or ML workflow. A nightly pipeline might extract new customer reviews from BigQuery, run them through a Gemini structured-output agent to classify sentiment and extract issues, write the structured results back to BigQuery, and trigger an alert if negative sentiment exceeds a threshold. The entire workflow runs on a schedule with no human intervention and full observability in the GCP Console.

Vertex AI Pipelines uses the Kubeflow Pipelines SDK (KFP), where you define components as Python functions decorated with @dsl.component and wire them into a pipeline with @dsl.pipeline. Each component runs in its own isolated container. ADK agent calls fit naturally as pipeline components: a component receives data, runs an ADK agent query, and outputs the result to the next step. The pipeline is compiled to a YAML spec and submitted to Vertex AI, which handles scheduling, retries, and logging automatically.

The example below builds a nightly product-review processing pipeline that reads new reviews, runs them through a Gemini classification agent, writes structured results to BigQuery, and sends a summary report, all orchestrated as a Vertex AI Pipeline that runs on a schedule.
Components 3 and 4 handle writing the results to BigQuery and sending the daily report.
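Their core logic might look like the following plain-Python sketch. In the real components these functions would be wrapped with @dsl.component and the rows would go through the BigQuery client; the function names and the threshold of 10 are assumptions based on the sample output below.

```python
def count_negatives(classified_rows):
    """Count rows the agent labeled negative (component 3's side output)."""
    return sum(1 for row in classified_rows if row["sentiment"] == "negative")

def alert_message(negative_count, threshold=10):
    """Component 4: alert only when negatives exceed the threshold."""
    if negative_count > threshold:
        return f"ALERT: {negative_count} negative reviews (threshold {threshold})"
    return f"All good: {negative_count} negative reviews (below threshold {threshold})"

# 47 classified reviews, 8 of them negative, as in the sample run:
rows = ([{"id": i, "sentiment": "negative"} for i in range(8)]
        + [{"id": i, "sentiment": "positive"} for i in range(8, 47)])
print(alert_message(count_negatives(rows)))
# -> All good: 8 negative reviews (below threshold 10)
```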
When the pipeline runs on Vertex AI, it produces the following output:
Pipeline submitted: projects/my-project/locations/us-central1/pipelineJobs/nightly-review-123
[Vertex AI Pipelines executes each component in order]
Component 1 - extract_reviews: Extracted 47 reviews
Component 2 - classify_reviews: Classified 47 reviews
Component 3 - write_to_bigquery: Inserted 47 rows | 8 negative reviews
Component 4 - send_alert: All good: 8 negative reviews (below threshold 10)
Pipeline Status: SUCCEEDED
Duration: 4m 32s
# Schedule this pipeline to run nightly via Vertex AI Pipelines scheduler:
# GCP Console > Vertex AI > Pipelines > Schedules > Create Schedule
# Or programmatically using aiplatform.PipelineJobSchedule
The pipeline now runs automatically every night at 2am IST:
Schedule created: projects/my-project/locations/us-central1/schedules/abc123
# Pipeline runs every night: 2am IST = 8:30pm UTC
# Each run processes previous day reviews automatically
# All run history visible in GCP Console > Vertex AI > Pipelines
# Failed runs auto-retry and trigger alerts via Cloud Monitoring
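A quick sanity check on the timezone math in those comments, followed by a hedged sketch of programmatic schedule creation (the job variable, display names, and template path are assumptions):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# 2am IST (UTC+5:30) lands at 8:30pm UTC the previous day.
run_at = datetime(2025, 1, 2, 2, 0, tzinfo=ZoneInfo("Asia/Kolkata"))
print(run_at.astimezone(timezone.utc).strftime("%Y-%m-%d %H:%M"))
# -> 2025-01-01 20:30, i.e. cron "30 20 * * *" in UTC

# Schedule-creation sketch (needs GCP credentials, so left commented),
# assuming google-cloud-aiplatform's PipelineJob.create_schedule:
# from google.cloud import aiplatform
# job = aiplatform.PipelineJob(
#     display_name="nightly-reviews",
#     template_path="nightly_review_pipeline.yaml",
# )
# job.create_schedule(cron="30 20 * * *",
#                     display_name="nightly-reviews-2am-ist")
```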
Vertex AI Pipelines with ADK is the right architecture when you need: scheduled batch processing of large datasets, reproducible and auditable AI workflows, automatic retry and failure handling at the step level, and integration with the broader GCP data ecosystem (BigQuery, Cloud Storage, Pub/Sub). For real-time agent workflows use Agent Engine (Tutorial 320). For scheduled batch AI processing use Vertex AI Pipelines. The two complement each other: Agent Engine for interactive user-facing agents, Vertex AI Pipelines for back-office automation that runs while your team sleeps.