The summarization middleware automatically condenses older conversation history when token limits are approached. The `summarizationMiddleware` factory takes configuration options for the summarization behavior and returns a middleware instance that can be passed to `createAgent`:
import { createAgent, summarizationMiddleware } from "langchain";
import { ChatOpenAI } from "@langchain/openai";

// `getWeather` is assumed to be a tool defined elsewhere.
const model = new ChatOpenAI({ model: "gpt-4o" });

const agent = createAgent({
  llm: model,
  tools: [getWeather],
  middleware: [
    summarizationMiddleware({
      // Model used to generate the summaries
      model: new ChatOpenAI({ model: "gpt-4o" }),
      // Summarize once the conversation exceeds roughly this many tokens
      maxTokensBeforeSummary: 4000,
      // Number of most recent messages to keep verbatim
      messagesToKeep: 20,
    }),
  ],
});
The middleware monitors message token counts and, once the configured threshold is reached, automatically summarizes older messages while preserving the most recent ones. It maintains context continuity by ensuring that AI/tool message pairs remain together, so an AI message that issues tool calls is never separated from the tool messages that answer it.
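For intuition, here is a minimal, hypothetical sketch of that pairing rule; it is not the library's implementation. The function name `adjustCutoffToKeepToolPairs` and the cutoff convention (messages before the cutoff index are summarized, the rest are kept) are assumptions made for illustration, using the message classes from `@langchain/core`.

  import {
    AIMessage,
    HumanMessage,
    ToolMessage,
    type BaseMessage,
  } from "@langchain/core/messages";

  // Given a proposed cutoff (messages before it get summarized, messages from it
  // onward are kept), walk the cutoff backward so it never lands on a ToolMessage.
  // This keeps an AI message's tool calls and their ToolMessage results together
  // in the kept portion of the history.
  function adjustCutoffToKeepToolPairs(messages: BaseMessage[], cutoff: number): number {
    let adjusted = cutoff;
    while (adjusted > 0 && messages[adjusted] instanceof ToolMessage) {
      adjusted -= 1;
    }
    return adjusted;
  }

  // Example: the proposed cutoff points at a ToolMessage, so it is walked back
  // to include the AI message that issued the tool call.
  const history: BaseMessage[] = [
    new HumanMessage("What's the weather in Paris?"),
    new AIMessage({
      content: "",
      tool_calls: [{ name: "getWeather", args: { city: "Paris" }, id: "call_1" }],
    }),
    new ToolMessage({ content: "18°C and sunny", tool_call_id: "call_1" }),
    new AIMessage("It's 18°C and sunny in Paris."),
  ];
  console.log(adjustCutoffToKeepToolPairs(history, 2)); // -> 1

In this sketch, a cutoff of 2 would have split the tool call from its result, so the boundary moves back to index 1 and the whole AI/tool exchange is kept out of the summary.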