# ChatAmazonNova

> **Class** in `langchain_amazon_nova`

📖 [View in docs](https://reference.langchain.com/python/langchain-amazon-nova/chat_models/ChatAmazonNova)

Amazon Nova chat model integration.

Amazon Nova models are OpenAI-compatible and accessed via the OpenAI SDK
pointed at Nova's endpoint.

## Signature

```python
ChatAmazonNova()
```

## Description

**Setup:**

Install `langchain-amazon-nova`:

```shell
pip install langchain-amazon-nova
```

Set environment variables:

```shell
export NOVA_API_KEY="your-api-key"
export NOVA_BASE_URL="https://api.nova.amazon.com/v1"
```

Key init args — completion:

- `model` (`str`): Name of the Nova model to use.
- `temperature` (`float`): Sampling temperature.
- `max_tokens` (`Optional[int]`): Maximum number of tokens to generate.
- `max_completion_tokens` (`Optional[int]`): Maximum tokens in the completion (OpenAI-compatible parameter).
- `top_p` (`Optional[float]`): Nucleus sampling threshold.
- `reasoning_effort` (`Optional[Literal["low", "medium", "high"]]`): Reasoning effort level for reasoning models.
- `metadata` (`Optional[Dict[str, Any]]`): Request metadata for tracking.
- `stream_options` (`Optional[Dict[str, bool]]`): Stream options (e.g., `include_usage`).
- `system_tools` (`Optional[List[Literal["nova_grounding", "nova_code_interpreter"]]]`): System tools to enable (`"nova_grounding"`, `"nova_code_interpreter"`).

**See the official documentation for additional parameters and details:**

https://nova.amazon.com/dev/documentation

Key init args — client:

- `api_key` (`Optional[SecretStr]`): Nova API key. If not passed in, it is read from the `NOVA_API_KEY` env var.
- `base_url` (`Optional[str]`): Base URL for API requests. Defaults to the Nova endpoint from `NOVA_BASE_URL`.

**Instantiate:**

```python
from langchain_amazon_nova import ChatAmazonNova

llm = ChatAmazonNova(
    model="nova-pro-v1",
    temperature=0.7,
    max_tokens=2048,
)
```

**Invoke:**

```python
messages = [
    ("system", "You are a helpful assistant."),
    ("human", "What is the capital of France?"),
]
llm.invoke(messages)
```

```python
AIMessage(content='The capital of France is Paris.', ...)
```

**Stream:**

```python
for chunk in llm.stream(messages):
    print(chunk.content, end="", flush=True)
```

**Async:**

```python
await llm.ainvoke(messages)
```

**Tool calling:**

```python
from pydantic import BaseModel, Field


class GetWeather(BaseModel):
    """Get the weather for a location."""

    location: str = Field(..., description="City name")


llm_with_tools = llm.bind_tools([GetWeather])
llm_with_tools.invoke("What's the weather in Paris?")
```

## Extends

- `BaseChatModel`

## Properties

- `model_name`
- `temperature`
- `max_tokens`
- `max_completion_tokens`
- `top_p`
- `reasoning_effort`
- `metadata`
- `stream_options`
- `system_tools`
- `api_key`
- `base_url`
- `timeout`
- `max_retries`
- `streaming`
- `client`
- `async_client`
- `model_config`
- `lc_secrets`

## Methods

- [`validate_environment()`](https://reference.langchain.com/python/langchain-amazon-nova/chat_models/ChatAmazonNova/validate_environment)
- [`bind_tools()`](https://reference.langchain.com/python/langchain-amazon-nova/chat_models/ChatAmazonNova/bind_tools)
- [`with_structured_output()`](https://reference.langchain.com/python/langchain-amazon-nova/chat_models/ChatAmazonNova/with_structured_output)

---

[View source on GitHub](https://github.com/amazon-nova-api/langchain-amazon-nova/blob/213daad519b5c69124a9029884d7f593944bd35f/libs/amazon_nova/langchain_amazon_nova/chat_models.py#L104)