LangChain LCEL - Building Chains with the Pipe Operator
Author: Venkata Sudhakar
LangChain Expression Language (LCEL) is the modern way to compose LangChain components. Instead of instantiating chain classes such as LLMChain or RetrievalQA, you connect components with the | pipe operator, much like Unix pipes. Each component in an LCEL chain is a Runnable: it exposes a standard invoke() interface, so any Runnable can be connected to any other, with the output of one step becoming the input of the next. This makes chains readable, composable, and easy to modify.

The core building blocks are ChatPromptTemplate (formats a prompt from variables), ChatOpenAI (the LLM), StrOutputParser (extracts the text from the LLM response), and RunnablePassthrough (passes the input through unchanged, which is useful for injecting context). You combine them with | and invoke the entire chain with chain.invoke({"variable": "value"}). LCEL chains also support streaming, async, batching, and LangSmith tracing out of the box, with no extra code.

The examples below build chains of increasing complexity with LCEL, from a simple prompt | LLM | parser chain to a RAG-style chain that formats context and a question together before calling the LLM.
It gives the following output:
Simple chain: Kafka consumer offsets track how far a consumer group has read
in each partition, enabling resumption from the correct position after a restart.
Chat chain: CDC replication lag is the delay in seconds between a change
happening in the source database and the consumer processing that change event.
It gives the following output:
RAG chain: CDC lag is the seconds between a change happening in the source
database and the consumer processing that change event.
Streaming:
A Kafka partition is an ordered, immutable log of messages within a topic
that enables parallel processing by multiple consumers.
LCEL vs the old chain classes: LCEL replaced LLMChain, ConversationalRetrievalChain, RetrievalQA, and similar classes. They still work but are considered legacy; use LCEL for all new code. The main advantage is transparency: you can see exactly what each step does, replace any step, and insert a RunnableLambda anywhere to run arbitrary Python logic in the pipeline. Any Python function wrapped in RunnableLambda becomes a composable chain step.