
ADK Retrieval-Augmented Generation Patterns

Author: Venkata Sudhakar

Retrieval-Augmented Generation (RAG) lets an ADK agent answer questions using your own documents rather than relying only on the model's training data. For ShopMax India, this means the agent can answer questions about specific warranty terms, return policies, or technical specifications by retrieving the exact relevant section from internal documents before generating a response.

The pattern uses Vertex AI Vector Search (or any vector store) as a tool. When the agent receives a query, it first retrieves the most relevant document chunks, then uses those chunks as grounding context to produce an accurate, citation-backed answer. The example below simulates the retrieval layer with an in-memory vector store using cosine similarity.
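Below is a minimal, self-contained sketch of that simulated retrieval layer. The document texts, the toy term-frequency "embedding", and the `search_policy_docs` tool name are illustrative stand-ins: a real deployment would embed chunks with text-embedding-004 and register the tool on an ADK agent rather than computing similarity over word counts.

```python
import math
from collections import Counter

# Illustrative policy chunks standing in for ShopMax India's internal documents.
DOCS = {
    "returns": ("ShopMax India offers a 10-day return window for electronics in "
                "original unopened condition. Defective products qualify for "
                "return within 30 days regardless of condition."),
    "emi": ("ShopMax India offers 0% EMI on orders above Rs 10,000 for 3, 6, and "
            "12 month tenures on HDFC, ICICI, and SBI credit cards. A processing "
            "fee of Rs 199 applies per transaction."),
    "delivery": ("Standard delivery takes 3-5 business days. Bangalore is "
                 "eligible for express next-day delivery for a Rs 99 charge."),
}

def embed(text: str) -> Counter:
    """Toy embedding: a lowercase term-frequency vector.
    In production this would be a call to text-embedding-004."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Pre-compute the "index" once, as a vector store would.
INDEX = {doc_id: embed(text) for doc_id, text in DOCS.items()}

def search_policy_docs(query: str) -> dict:
    """Tool: return the most relevant chunk and its similarity score.
    An ADK agent would call this before generating its grounded answer,
    e.g. Agent(model="gemini-2.0-flash", tools=[search_policy_docs], ...)."""
    q = embed(query)
    doc_id, score = max(((d, cosine(q, v)) for d, v in INDEX.items()),
                        key=lambda item: item[1])
    return {"doc_id": doc_id, "score": round(score, 3), "chunk": DOCS[doc_id]}

if __name__ == "__main__":
    result = search_policy_docs("What is the ShopMax India return policy?")
    print(result["doc_id"], result["score"])
    print(result["chunk"])
```

The retrieved chunk (plus its score, useful for audit logging) is what the agent receives as grounding context before it generates the final answer.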


Running the example gives the following output:

Q: What is ShopMax India return policy?
A: Based on ShopMax India policy documents: You have a 10-day return window for
electronics in original unopened condition. Defective products qualify for return
within 30 days regardless of condition.

Q: Can I get EMI on a Rs 15,000 purchase?
A: Yes. ShopMax India offers 0% EMI on orders above Rs 10,000 for 3, 6, and 12
month tenures on HDFC, ICICI, and SBI credit cards. A processing fee of Rs 199
applies per transaction.

Q: How long does delivery take to Bangalore?
A: Standard delivery takes 3-5 business days. Bangalore is eligible for express
next-day delivery for Rs 99 additional charge.

Each response is grounded in the retrieved document chunk. The agent did not fabricate policy details; it retrieved the exact relevant passage and summarised it. The cosine similarity score shows which document was most relevant to each query and can be logged for audit purposes.

For ShopMax India production, replace the in-memory vector store with Vertex AI Vector Search. Upload your policy PDFs, FAQ pages, and product manuals to Cloud Storage, chunk them into 200-500 token segments, embed each with text-embedding-004, and index them in Vertex AI Vector Search. The search_policy_docs tool then calls the Vector Search endpoint instead of computing cosine similarity locally, giving you millisecond retrieval across millions of documents.
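The chunking step can be sketched as a simple overlapping splitter. This version approximates token counts with word counts for illustration; a production pipeline would use the embedding model's tokenizer to hit the 200-500 token range precisely, and the Vector Search upsert calls are omitted here.

```python
def chunk_text(text: str, max_tokens: int = 400, overlap: int = 50) -> list[str]:
    """Split a document into overlapping chunks of roughly max_tokens words.
    Overlap keeps a sentence that straddles a boundary retrievable from
    either chunk. Word count is a rough proxy for token count."""
    words = text.split()
    chunks = []
    step = max_tokens - overlap
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + max_tokens]))
        if start + max_tokens >= len(words):
            break  # last window already covers the tail
    return chunks

# Each chunk would then be embedded (e.g. with text-embedding-004)
# and upserted into the Vertex AI Vector Search index.
sample = "word " * 1000
pieces = chunk_text(sample.strip())
```

With `max_tokens=400` and `overlap=50`, a 1000-word document yields three overlapping chunks, each small enough to embed and retrieve as a single grounding passage.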


 
  


  