```python
create_summarization_tool_middleware(
    model: str | BaseChatModel,
    backend: BACKEND_TYPES,
) -> SummarizationToolMiddleware
```

| Name | Type | Description |
|---|---|---|
| `model`* | `str \| BaseChatModel` | Chat model instance, or a model string (e.g. `"openai:gpt-5.4"`). |
| `backend`* | `BACKEND_TYPES` | Backend instance or factory for persisting conversation history. |
Create a `SummarizationToolMiddleware` with model-aware defaults.

Convenience factory: builds a `SummarizationMiddleware` via `create_summarization_middleware` and wraps it in a `SummarizationToolMiddleware`, saving a step and accepting a model string.

Only the tool layer is registered; the wrapped `SummarizationMiddleware` is the engine the tool calls into, not a middleware that runs on its own. The agent gains:
- a `compact_conversation` tool to compact its own context window

For automatic summarization at the trigger threshold, also register a `SummarizationMiddleware`. `create_deep_agent` adds one by default, so dropping `create_summarization_tool_middleware(...)` into its `middleware=[...]` gives you both layers; they share state via the `_summarization_event` key.
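The two layers coordinate through that shared state key rather than calling each other directly. A toy sketch of the pattern (the key name `_summarization_event` comes from the docs above; the classes and state dict here are invented for illustration and are not deepagents' actual implementation):

```python
# Toy sketch: two middleware layers coordinating via one shared state key.
# Only the key name "_summarization_event" is from the docs above; the
# classes below are hypothetical stand-ins for the tool and auto layers.

class ToolLayer:
    """On-demand compaction, triggered by the agent calling its tool."""

    def compact(self, state: dict) -> None:
        state["_summarization_event"] = {"source": "tool"}


class AutoLayer:
    """Threshold-driven compaction; defers if the tool layer already ran."""

    def maybe_compact(self, state: dict) -> None:
        if state.get("_summarization_event") is None:
            state["_summarization_event"] = {"source": "auto"}


state: dict = {}
ToolLayer().compact(state)        # agent compacts on demand
AutoLayer().maybe_compact(state)  # auto layer sees the event and defers
print(state["_summarization_event"]["source"])  # prints "tool"
```

Because both layers read and write the same key, the automatic layer can observe compactions the tool layer performed, which is the point of sharing state.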
Example:

Using the default `StateBackend`:

```python
from deepagents import create_deep_agent
from deepagents.backends import StateBackend
from deepagents.middleware.summarization import (
    create_summarization_tool_middleware,
)

model = "openai:gpt-5.4"
agent = create_deep_agent(
    model=model,
    middleware=[
        create_summarization_tool_middleware(model, StateBackend),
    ],
)
```
Using a custom backend instance (e.g., a Daytona sandbox):

```python
from daytona import Daytona
from deepagents import create_deep_agent
from deepagents.middleware.summarization import (
    create_summarization_tool_middleware,
)
from langchain_daytona import DaytonaSandbox

sandbox = Daytona().create()
backend = DaytonaSandbox(sandbox=sandbox)

model = "openai:gpt-5.4"
agent = create_deep_agent(
    model=model,
    backend=backend,
    middleware=[
        create_summarization_tool_middleware(model, backend),
    ],
)
```