LangChain OpenRouter integration, providing OpenRouter chat models for LangChain.

Model profile data is stored in `profile_augmentations.toml`; all edits to profile data should be made in that file.
OpenRouter is a unified API that provides access to hundreds of models from multiple providers (OpenAI, Anthropic, Google, Meta, etc.).
Install `langchain-openrouter` and set the `OPENROUTER_API_KEY` environment variable:

```bash
pip install -U langchain-openrouter
export OPENROUTER_API_KEY="your-api-key"
```

Completion parameters:

| Param | Type | Description |
|---|---|---|
| `model` | `str` | Model name, e.g. `'openai/gpt-4o-mini'`. |
| `temperature` | `float \| None` | Sampling temperature. |
| `max_tokens` | `int \| None` | Maximum number of tokens to generate. |
Client parameters:

| Param | Type | Description |
|---|---|---|
| `api_key` | `str \| None` | OpenRouter API key; read from the `OPENROUTER_API_KEY` environment variable if not set. |
| `base_url` | `str \| None` | Base URL for API requests. |
| `timeout` | `int \| None` | Request timeout. |
| `app_url` | `str \| None` | Your app's URL, used by OpenRouter for attribution. |
| `app_title` | `str \| None` | Your app's title, used by OpenRouter for attribution. |
| `max_retries` | `int` | Max retries (default 2). Set to 0 to disable. |
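As a sketch of how the completion and client parameters above might be combined, the snippet below passes them to the constructor. The keyword names come from the tables; the specific values (timeout, retry count, app URL and title) are illustrative assumptions, not recommended settings.

```python
from langchain_openrouter import ChatOpenRouter

# Illustrative sketch: parameter names come from the tables above,
# the values are placeholder assumptions.
model = ChatOpenRouter(
    model="openai/gpt-4o-mini",      # provider/model slug routed by OpenRouter
    temperature=0.2,                 # sampling temperature
    max_tokens=512,                  # cap on generated tokens
    # api_key is read from OPENROUTER_API_KEY when omitted
    timeout=30,                      # request timeout
    max_retries=2,                   # retry failed requests; 0 disables retries
    app_url="https://example.com",   # app attribution on openrouter.ai
    app_title="My LangChain app",    # app attribution on openrouter.ai
)
```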
Instantiate:

```python
from langchain_openrouter import ChatOpenRouter

model = ChatOpenRouter(
    model="anthropic/claude-sonnet-4-5",
    temperature=0,
    # api_key="...",
    # openrouter_provider={"order": ["Anthropic"]},
)
```

See https://openrouter.ai/docs for platform documentation.
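Assuming `ChatOpenRouter` exposes the standard LangChain chat model interface (the `invoke` and `stream` methods inherited from `BaseChatModel`), usage looks like any other LangChain chat model; the messages below are illustrative.

```python
from langchain_openrouter import ChatOpenRouter

# Sketch assuming the standard LangChain chat model interface
# (BaseChatModel's invoke/stream methods).
model = ChatOpenRouter(model="anthropic/claude-sonnet-4-5", temperature=0)

messages = [
    ("system", "You are a concise assistant."),
    ("human", "Summarize what OpenRouter does in one sentence."),
]

response = model.invoke(messages)  # returns an AIMessage
print(response.content)

# Stream the same prompt token by token.
for chunk in model.stream(messages):
    print(chunk.content, end="", flush=True)
```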