Gemini API with LangChain
Author: Venkata Sudhakar
LangChain is a popular framework for building LLM-powered applications from composable chains, memory, and retrieval components. Gemini can serve as the LLM backend in any LangChain pipeline via the langchain-google-genai package, giving you access to Gemini models through familiar LangChain abstractions; ShopMax India uses this to build its product FAQ chain.

The langchain-google-genai package provides ChatGoogleGenerativeAI, a drop-in LangChain chat model backed by Gemini. It supports streaming, conversation memory, system prompts, and integration with standard LangChain chains and runnables. The example below shows how to configure Gemini as the LLM backend in a LangChain conversation chain.
It gives the following output:
ShopMax India carries a wide range of TV brands including Samsung, LG, Sony, OnePlus, and TCL. We stock everything from budget 32-inch HD TVs to premium 75-inch OLED and QLED models. Would you like recommendations for a specific budget or screen size?
The example below shows a LangChain RAG chain that retrieves product information from a vector store before answering customer queries.
It gives the following output:
The LG OLED 65 inch TV has the longest warranty at 2 years. It is priced at Rs 1,20,000 with free delivery across India. The Samsung 55 inch and Sony Bravia both come with a 1-year warranty.
ShopMax India uses LangChain with Gemini to power its customer FAQ system. Teams already familiar with LangChain can migrate to Gemini by changing only the LLM backend while retaining all existing chain logic, memory configuration, and retrieval pipelines, making this the fastest path to Gemini adoption in existing LangChain projects.