Model fallback middleware for agents.
Initialize a chat model from any supported provider using a unified interface.
Two main use cases:

1. Fixed model: specify the model up front when creating it.
2. Configurable model: specify the model at runtime via config. This makes it
   easy to switch between models/providers without changing your code.

Requires the integration package for the chosen model provider to be installed.
See the model_provider parameter below for specific package names
(e.g., pip install langchain-openai).
Refer to the provider integration's API reference
for supported model parameters to use as **kwargs.
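A minimal sketch of both use cases follows; the model identifiers and the temperature setting are illustrative, not prescribed by this module.

```python
from langchain.chat_models import init_chat_model

# Use case 1: fixed model, chosen when the object is created.
fixed = init_chat_model("openai:gpt-4o", temperature=0)  # model name is illustrative
fixed.invoke("What is your name?")

# Use case 2: configurable model, chosen per-invocation via config.
configurable = init_chat_model(temperature=0)
configurable.invoke(
    "What is your name?",
    config={"configurable": {"model": "gpt-4o"}},
)
configurable.invoke(
    "What is your name?",
    config={"configurable": {"model": "claude-3-5-sonnet-latest"}},
)
```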
Base middleware class for an agent.
Subclass this and implement any of the defined methods to customize agent behavior between steps in the main agent loop.
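As a hedged illustration of subclassing, the sketch below overrides a single hook. The hook name (before_model), its signature, and the create_agent wiring are assumptions based on the middleware interface described here; check them against the installed version.

```python
from typing import Any

from langchain.agents import create_agent
from langchain.agents.middleware import AgentMiddleware, AgentState


class MessageCountLogger(AgentMiddleware):
    """Log the conversation size before every model call."""

    def before_model(self, state: AgentState, runtime) -> dict[str, Any] | None:
        # Assumed hook: runs between steps, just before the model is invoked.
        print(f"About to call the model with {len(state['messages'])} messages")
        return None  # No state update.


agent = create_agent(
    model="openai:gpt-4o",  # illustrative model name
    tools=[],
    middleware=[MessageCountLogger()],
)
```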
State schema for the agent.
Model request information for the agent.
Response from model execution including messages and optional structured output.
The result will usually contain a single AIMessage, but may include an additional
ToolMessage if the model used a tool for structured output.
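A hedged sketch tying the request and response objects together: a middleware hook receives the ModelRequest, delegates to a handler, and gets back the ModelResponse. The hook name (wrap_model_call) and the handler-callback shape are assumptions to verify against the installed version.

```python
from langchain.agents.middleware import AgentMiddleware, ModelRequest, ModelResponse


class RetryOnce(AgentMiddleware):
    """Retry the model call a single time if the first attempt raises."""

    def wrap_model_call(self, request: ModelRequest, handler) -> ModelResponse:
        # handler(request) executes the model call and returns a ModelResponse
        # whose result usually holds one AIMessage (plus a ToolMessage when a
        # tool was used for structured output).
        try:
            return handler(request)
        except Exception:
            return handler(request)  # one retry with the same request
```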
Automatic fallback to alternative models on errors.
Retries failed model calls with alternative models in sequence until one
succeeds or all models are exhausted. The primary model is specified in create_agent.
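A minimal usage sketch, assuming ModelFallbackMiddleware accepts the fallback models as positional arguments in the order they should be tried; the model identifiers are illustrative.

```python
from langchain.agents import create_agent
from langchain.agents.middleware import ModelFallbackMiddleware

fallbacks = ModelFallbackMiddleware(
    "openai:gpt-4o-mini",                  # tried first if the primary model errors
    "anthropic:claude-3-5-sonnet-latest",  # tried next if that also errors
)

agent = create_agent(
    model="openai:gpt-4o",  # primary model; illustrative name
    tools=[],
    middleware=[fallbacks],
)
```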