# ChatOpenRouter

> **Class** in `langchain_openrouter`

📖 [View in docs](https://reference.langchain.com/python/langchain-openrouter/chat_models/ChatOpenRouter)

OpenRouter chat model integration.

OpenRouter is a unified API that provides access to hundreds of models from
multiple providers (OpenAI, Anthropic, Google, Meta, etc.).

???+ info "Setup"

    Install `langchain-openrouter` and set the `OPENROUTER_API_KEY`
    environment variable.

    ```bash
    pip install -U langchain-openrouter
    ```

    ```bash
    export OPENROUTER_API_KEY="your-api-key"
    ```

??? info "Key init args — completion params"

    | Param | Type | Description |
    | ----- | ---- | ----------- |
    | `model` | `str` | Model name, e.g. `'openai/gpt-4o-mini'`. |
    | `temperature` | `float \| None` | Sampling temperature. |
    | `max_tokens` | `int \| None` | Max tokens to generate. |

??? info "Key init args — client params"

    | Param | Type | Description |
    | ----- | ---- | ----------- |
    | `api_key` | `str \| None` | OpenRouter API key. If not passed, read from the `OPENROUTER_API_KEY` env var. |
    | `base_url` | `str \| None` | Base URL for API requests. |
    | `timeout` | `int \| None` | Request timeout in seconds. |
    | `app_url` | `str \| None` | App URL for attribution. |
    | `app_title` | `str \| None` | App title for attribution. |
    | `app_categories` | `list[str] \| None` | Marketplace attribution categories. |
    | `max_retries` | `int` | Max retries (default `2`). Set to `0` to disable. |

??? info "Instantiate"

    ```python
    from langchain_openrouter import ChatOpenRouter

    model = ChatOpenRouter(
        model="anthropic/claude-sonnet-4-5",
        temperature=0,
        # api_key="...",
        # openrouter_provider={"order": ["Anthropic"]},
    )
    ```

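??? info "Invoke"

    A minimal invocation sketch (assumes `OPENROUTER_API_KEY` is set; the model name and messages are illustrative, not part of this reference):

    ```python
    from langchain_openrouter import ChatOpenRouter

    model = ChatOpenRouter(model="openai/gpt-4o-mini", temperature=0)

    # Messages can be passed as (role, content) tuples.
    messages = [
        ("system", "You are a helpful assistant that translates English to French."),
        ("human", "I love programming."),
    ]
    response = model.invoke(messages)
    print(response.content)
    ```

    `invoke` returns an `AIMessage`; use `stream` for incremental chunks or `batch` for multiple inputs, as with any `BaseChatModel`.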
See the [OpenRouter documentation](https://openrouter.ai/docs) for platform details.

## Signature

```python
ChatOpenRouter()
```

## Extends

- `BaseChatModel`

## Properties

- `client`
- `openrouter_api_key`
- `openrouter_api_base`
- `app_url`
- `app_title`
- `app_categories`
- `request_timeout`
- `max_retries`
- `model_name`
- `model`
- `temperature`
- `max_tokens`
- `max_completion_tokens`
- `top_p`
- `frequency_penalty`
- `presence_penalty`
- `seed`
- `stop`
- `n`
- `streaming`
- `stream_usage`
- `model_kwargs`
- `reasoning`
- `openrouter_provider`
- `route`
- `plugins`
- `model_config`
- `lc_secrets`

## Methods

- [`build_extra()`](https://reference.langchain.com/python/langchain-openrouter/chat_models/ChatOpenRouter/build_extra)
- [`validate_environment()`](https://reference.langchain.com/python/langchain-openrouter/chat_models/ChatOpenRouter/validate_environment)
- [`is_lc_serializable()`](https://reference.langchain.com/python/langchain-openrouter/chat_models/ChatOpenRouter/is_lc_serializable)
- [`bind_tools()`](https://reference.langchain.com/python/langchain-openrouter/chat_models/ChatOpenRouter/bind_tools)
- [`with_structured_output()`](https://reference.langchain.com/python/langchain-openrouter/chat_models/ChatOpenRouter/with_structured_output)

---

[View source on GitHub](https://github.com/langchain-ai/langchain/blob/fb6ab993a73180538f6cca876b3c85d46c08845f/libs/partners/openrouter/langchain_openrouter/chat_models.py#L85)