# ChatDeepSeek

> **Class** in `langchain_deepseek`

📖 [View in docs](https://reference.langchain.com/python/langchain-deepseek/chat_models/ChatDeepSeek)

DeepSeek chat model integration for accessing models hosted on DeepSeek's API.

## Signature

```python
ChatDeepSeek()
```

## Description

**Setup:**

Install `langchain-deepseek` and set environment variable `DEEPSEEK_API_KEY`.

```bash
pip install -U langchain-deepseek
export DEEPSEEK_API_KEY="your-api-key"
```

**Key init args — completion params:**

- `model`: Name of the DeepSeek model to use, e.g. `'deepseek-chat'`.
- `temperature`: Sampling temperature.
- `max_tokens`: Max number of tokens to generate.

**Key init args — client params:**

- `timeout`: Timeout for requests.
- `max_retries`: Max number of retries.
- `api_key`: DeepSeek API key. If not passed in, it will be read from the env var `DEEPSEEK_API_KEY`.

See the full list of supported init args and their descriptions in the params section.

**Instantiate:**

```python
from langchain_deepseek import ChatDeepSeek

model = ChatDeepSeek(
    model="...",
    temperature=0,
    max_tokens=None,
    timeout=None,
    max_retries=2,
    # api_key="...",
    # other params...
)
```

**Invoke:**

```python
messages = [
    ("system", "You are a helpful translator. Translate the user sentence to French."),
    ("human", "I love programming."),
]
model.invoke(messages)
```

**Stream:**

```python
for chunk in model.stream(messages):
    print(chunk.text, end="")
```

Chunks can also be accumulated into a single message:

```python
stream = model.stream(messages)
full = next(stream)
for chunk in stream:
    full += chunk
full
```

**Async:**

```python
await model.ainvoke(messages)

# stream:
# async for chunk in model.astream(messages):
#     print(chunk.text, end="")

# batch:
# await model.abatch([messages])
```

**Tool calling:**

```python
from pydantic import BaseModel, Field

class GetWeather(BaseModel):
    '''Get the current weather in a given location'''

    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")

class GetPopulation(BaseModel):
    '''Get the current population in a given location'''

    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")

model_with_tools = model.bind_tools([GetWeather, GetPopulation])
ai_msg = model_with_tools.invoke("Which city is hotter today and which is bigger: LA or NY?")
ai_msg.tool_calls
```

See `ChatDeepSeek.bind_tools()` method for more.
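Each entry in `tool_calls` is a plain dict with `name` and `args` keys (plus `id` and `type`), so a dispatch loop needs no special machinery. A minimal offline sketch, using hypothetical handler functions and example tool-call values rather than a live model response:

```python
def get_weather(location: str) -> str:
    # Hypothetical stand-in for a real weather lookup.
    return f"Sunny in {location}"

def get_population(location: str) -> str:
    # Hypothetical stand-in for a real population lookup.
    return f"No data for {location}"

# Map tool names (as declared via bind_tools) to local handlers.
handlers = {"GetWeather": get_weather, "GetPopulation": get_population}

# Shape mirrors ai_msg.tool_calls (example values, not a live response).
tool_calls = [
    {"name": "GetWeather", "args": {"location": "Los Angeles, CA"}},
    {"name": "GetPopulation", "args": {"location": "New York, NY"}},
]

results = [handlers[call["name"]](**call["args"]) for call in tool_calls]
# ['Sunny in Los Angeles, CA', 'No data for New York, NY']
```

In a full agent loop, each result would be sent back to the model as a `ToolMessage` keyed by the tool call's `id`.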

**Structured output:**

```python
from typing import Optional

from pydantic import BaseModel, Field

class Joke(BaseModel):
    '''Joke to tell user.'''

    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline to the joke")
    rating: int | None = Field(description="How funny the joke is, from 1 to 10")

structured_model = model.with_structured_output(Joke)
structured_model.invoke("Tell me a joke about cats")
```

See `ChatDeepSeek.with_structured_output()` for more.

**Token usage:**

```python
ai_msg = model.invoke(messages)
ai_msg.usage_metadata
```
```python
{'input_tokens': 28, 'output_tokens': 5, 'total_tokens': 33}
```

**Response metadata:**

```python
ai_msg = model.invoke(messages)
ai_msg.response_metadata
```

## Extends

- `BaseChatOpenAI`

## Properties

- `model_name`
- `api_key`
- `api_base`
- `model_config`
- `lc_secrets`

## Methods

- [`validate_environment()`](https://reference.langchain.com/python/langchain-deepseek/chat_models/ChatDeepSeek/validate_environment)
- [`bind_tools()`](https://reference.langchain.com/python/langchain-deepseek/chat_models/ChatDeepSeek/bind_tools)
- [`with_structured_output()`](https://reference.langchain.com/python/langchain-deepseek/chat_models/ChatDeepSeek/with_structured_output)

---

[View source on GitHub](https://github.com/langchain-ai/langchain/blob/6fb37dba71da807af60aa7b909f71f0625a666bf/libs/partners/deepseek/langchain_deepseek/chat_models.py#L47)