This package contains the LangChain.js integrations for OpenRouter.
```bash
npm install @langchain/openrouter @langchain/core
```
This package, along with the main LangChain package, depends on @langchain/core.
If you are using this package with other LangChain packages, you should make sure that all of the packages depend on the same instance of @langchain/core.
You can do so by adding appropriate fields to your project's package.json like this:
```json
{
  "name": "your-project",
  "version": "0.0.0",
  "dependencies": {
    "@langchain/openrouter": "^0.0.1",
    "@langchain/core": "^1.0.0"
  },
  "resolutions": {
    "@langchain/core": "^1.0.0"
  },
  "overrides": {
    "@langchain/core": "^1.0.0"
  },
  "pnpm": {
    "overrides": {
      "@langchain/core": "^1.0.0"
    }
  }
}
```
The field you need depends on the package manager you're using: yarn reads `resolutions`, npm reads `overrides`, and pnpm reads `pnpm.overrides`. We recommend adding all three to maximize compatibility.
This package contains the ChatOpenRouter class, which is the recommended way to interface with any model available on OpenRouter. Pass any OpenRouter model identifier (e.g. "anthropic/claude-4-sonnet", "openai/gpt-4o") as the model param.
Set the necessary environment variable (or pass it in via the constructor):
```bash
export OPENROUTER_API_KEY=your-api-key
```
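To pass the key via the constructor instead, a minimal sketch — the `apiKey` field name here is an assumption following the convention of other LangChain chat model integrations; check the constructor typings:

```typescript
import { ChatOpenRouter } from "@langchain/openrouter";

// `apiKey` is assumed to follow the convention of other
// LangChain chat model integrations.
const model = new ChatOpenRouter({
  model: "openai/gpt-4o",
  apiKey: "your-api-key",
});
```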
Then initialize and invoke the model:
```typescript
import { ChatOpenRouter } from "@langchain/openrouter";

const model = new ChatOpenRouter({
  model: "openai/gpt-4o",
});

const response = await model.invoke([{ role: "user", content: "Hello world!" }]);
```
Streaming is supported as well:

```typescript
import { ChatOpenRouter } from "@langchain/openrouter";

const model = new ChatOpenRouter({
  model: "openai/gpt-4o",
});

const stream = await model.stream([{ role: "user", content: "Hello world!" }]);

for await (const chunk of stream) {
  console.log(chunk.content);
}
```
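Streamed chunks are `AIMessageChunk`s; if you also need the full message after streaming, they can be aggregated with `concat`. A minimal sketch using the `@langchain/core` message API:

```typescript
import { ChatOpenRouter } from "@langchain/openrouter";
import type { AIMessageChunk } from "@langchain/core/messages";

const model = new ChatOpenRouter({ model: "openai/gpt-4o" });
const stream = await model.stream([{ role: "user", content: "Hello world!" }]);

// Accumulate streamed chunks into a single message.
let full: AIMessageChunk | undefined;
for await (const chunk of stream) {
  full = full === undefined ? chunk : full.concat(chunk);
}
console.log(full?.content);
```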
Tool calling works with models that support it on OpenRouter:

```typescript
import { ChatOpenRouter } from "@langchain/openrouter";
import { tool } from "@langchain/core/tools";
import { z } from "zod";

const adder = tool(async ({ a, b }) => `${a + b}`, {
  name: "add",
  description: "Add two numbers",
  schema: z.object({ a: z.number(), b: z.number() }),
});

const model = new ChatOpenRouter({
  model: "openai/gpt-4o",
}).bindTools([adder]);

const response = await model.invoke("What is 2 + 3?");
```
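Continuing from the example above, the model's reply lists requested calls in `response.tool_calls`. A minimal sketch of executing them and returning the results to the model (this loop is illustrative, not part of the package):

```typescript
import { HumanMessage, type BaseMessage } from "@langchain/core/messages";

const messages: BaseMessage[] = [new HumanMessage("What is 2 + 3?")];
const aiMessage = await model.invoke(messages);
messages.push(aiMessage);

// Invoking a tool with a ToolCall yields a ToolMessage carrying
// the matching tool call id.
for (const toolCall of aiMessage.tool_calls ?? []) {
  messages.push(await adder.invoke(toolCall));
}

// Let the model compose a final answer from the tool results.
const finalResponse = await model.invoke(messages);
console.log(finalResponse.content);
```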
For structured output, pass a zod schema to `withStructuredOutput`:

```typescript
import { ChatOpenRouter } from "@langchain/openrouter";
import { z } from "zod";

const model = new ChatOpenRouter({
  model: "openai/gpt-4o",
});

const structured = model.withStructuredOutput(
  z.object({
    answer: z.string(),
    confidence: z.number(),
  })
);

const response = await structured.invoke("What is the capital of France?");
```
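The result is a plain object parsed against the schema rather than a message:

```typescript
// `response` is typed from the zod schema above.
console.log(response.answer); // e.g. "Paris"
console.log(response.confidence);
```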
OpenRouter supports model routing, provider preferences, and plugins:
```typescript
import { ChatOpenRouter } from "@langchain/openrouter";

const model = new ChatOpenRouter({
  model: "openai/gpt-4o",
  // Fallback candidates, tried in order if the primary model fails.
  models: ["openai/gpt-4o", "anthropic/claude-4-sonnet"],
  route: "fallback",
  // Provider routing preferences.
  provider: {
    allow_fallbacks: true,
  },
  // Prompt transforms, e.g. middle-out compression for long prompts.
  transforms: ["middle-out"],
});
```
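OpenRouter's web search plugin is enabled by sending a plugin entry with id `"web"` in the request. Assuming ChatOpenRouter forwards a `plugins` array to the request body, mirroring OpenRouter's REST schema (an assumption — check the constructor typings), a minimal sketch:

```typescript
import { ChatOpenRouter } from "@langchain/openrouter";

// Sketch: assumes the constructor accepts a `plugins` array that is
// passed through to OpenRouter's request body.
const model = new ChatOpenRouter({
  model: "openai/gpt-4o",
  plugins: [{ id: "web", max_results: 3 }],
});

const response = await model.invoke("What happened in the news today?");
```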
To develop the @langchain/openrouter package, you'll need to follow these instructions:
Install dependencies and build the package:

```bash
pnpm install
pnpm build
```
Or from the repo root:
```bash
pnpm build --filter @langchain/openrouter
```
Test files should live within a `tests/` directory inside `src/`. Unit tests should end in `.test.ts` and integration tests should end in `.int.test.ts`:
```bash
pnpm test
pnpm test:int
```
Run the linter & formatter to ensure your code is up to standard:
```bash
pnpm lint && pnpm format
```
If you add a new file to be exported, either import & re-export it from `src/index.ts`, or add it to the `exports` field in `package.json` and run `pnpm build` to generate the new entrypoint.
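For example, to re-export a new module from the package index (the module name here is hypothetical):

```typescript
// src/index.ts
export * from "./chat_models.js"; // hypothetical module
```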
Exported configuration types cover:

- Per-call options for ChatOpenRouter.
- Shared fields that can be set at construction time or overridden per-call.
- Constructor parameters for ChatOpenRouter.
- Plugin configuration for OpenRouter plugins (e.g. web search).