langchain-deepseek

Modules:

  • chat_models: DeepSeek chat models.

Classes:

  • ChatDeepSeek: DeepSeek chat model integration to access models hosted in DeepSeek's API.

ChatDeepSeek

Bases: BaseChatOpenAI

DeepSeek chat model integration to access models hosted in DeepSeek's API.

Setup

Install langchain-deepseek and set the environment variable DEEPSEEK_API_KEY.

.. code-block:: bash

pip install -U langchain-deepseek
export DEEPSEEK_API_KEY="your-api-key"

Key init args — completion params:

  • model (str): Name of the DeepSeek model to use, e.g. "deepseek-chat".
  • temperature (float): Sampling temperature.
  • max_tokens (Optional[int]): Max number of tokens to generate.

Key init args — client params:

  • timeout (Optional[float]): Timeout for requests.
  • max_retries (int): Max number of retries.
  • api_key (Optional[str]): DeepSeek API key. If not passed in, it will be read from the env var DEEPSEEK_API_KEY.

See the full list of supported init args and their descriptions in the params section.

Instantiate

.. code-block:: python

from langchain_deepseek import ChatDeepSeek

llm = ChatDeepSeek(
    model="...",
    temperature=0,
    max_tokens=None,
    timeout=None,
    max_retries=2,
    # api_key="...",
    # other params...
)
Invoke

.. code-block:: python

messages = [
    ("system", "You are a helpful translator. Translate the user sentence to French."),
    ("human", "I love programming."),
]
llm.invoke(messages)
Stream

.. code-block:: python

for chunk in llm.stream(messages):
    print(chunk.text, end="")

.. code-block:: python

stream = llm.stream(messages)
full = next(stream)
for chunk in stream:
    full += chunk
full
Async

.. code-block:: python

await llm.ainvoke(messages)

# stream:
# async for chunk in llm.astream(messages):

# batch:
# await llm.abatch([messages])
Tool calling

.. code-block:: python

from pydantic import BaseModel, Field


class GetWeather(BaseModel):
    '''Get the current weather in a given location'''

    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")


class GetPopulation(BaseModel):
    '''Get the current population in a given location'''

    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")


llm_with_tools = llm.bind_tools([GetWeather, GetPopulation])
ai_msg = llm_with_tools.invoke("Which city is hotter today and which is bigger: LA or NY?")
ai_msg.tool_calls
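
The exact calls depend on the model's response; illustratively, ai_msg.tool_calls has the shape:

.. code-block:: python

# Illustrative only; names, args, and ids vary with the model response.
[
    {
        "name": "GetWeather",
        "args": {"location": "Los Angeles, CA"},
        "id": "call_abc123",
        "type": "tool_call",
    },
    {
        "name": "GetPopulation",
        "args": {"location": "New York, NY"},
        "id": "call_def456",
        "type": "tool_call",
    },
]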

See ChatDeepSeek.bind_tools() method for more.

Structured output

.. code-block:: python

from typing import Optional

from pydantic import BaseModel, Field


class Joke(BaseModel):
    '''Joke to tell user.'''

    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline to the joke")
    rating: Optional[int] = Field(default=None, description="How funny the joke is, from 1 to 10")


structured_llm = llm.with_structured_output(Joke)
structured_llm.invoke("Tell me a joke about cats")
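
Because schema here is a Pydantic class, the output is a Joke instance; illustratively (the joke itself varies per response):

.. code-block:: python

# Illustrative only; the generated text varies.
Joke(
    setup="Why was the cat sitting on the computer?",
    punchline="To keep an eye on the mouse!",
    rating=7,
)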

See ChatDeepSeek.with_structured_output() for more.

Token usage

.. code-block:: python

ai_msg = llm.invoke(messages)
ai_msg.usage_metadata

.. code-block:: python

{"input_tokens": 28, "output_tokens": 5, "total_tokens": 33}

Response metadata

.. code-block:: python

    ai_msg = llm.invoke(messages)
    ai_msg.response_metadata
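
The exact keys mirror the underlying Chat Completions response; an illustrative (not real) result might contain entries such as:

.. code-block:: python

# Illustrative only; keys and values depend on the API response.
{
    "model_name": "deepseek-chat",
    "finish_reason": "stop",
    "token_usage": {
        "prompt_tokens": 28,
        "completion_tokens": 5,
        "total_tokens": 33,
    },
}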

Methods:

  • with_structured_output: Model wrapper that returns outputs formatted to match the given schema.

Attributes:

  • model_name (str): The name of the model.
  • api_key (Optional[SecretStr]): DeepSeek API key.
  • api_base (str): DeepSeek API base URL.
  • lc_secrets (dict[str, str]): A map of constructor argument names to secret ids.

model_name (class attribute, instance attribute)

model_name: str = Field(alias='model')

The name of the model

api_key (class attribute, instance attribute)

api_key: Optional[SecretStr] = Field(
    default_factory=secret_from_env(
        "DEEPSEEK_API_KEY", default=None
    )
)

DeepSeek API key

api_base (class attribute, instance attribute)

api_base: str = Field(
    default_factory=from_env(
        "DEEPSEEK_API_BASE", default=DEFAULT_API_BASE
    )
)

DeepSeek API base URL
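
As a minimal sketch (the URL below is a placeholder, not a real endpoint), the base URL can be overridden at construction time, or via the DEEPSEEK_API_BASE env var, e.g. to target an OpenAI-compatible proxy:

.. code-block:: python

# Assumes an OpenAI-compatible endpoint; the URL is illustrative.
llm = ChatDeepSeek(
    model="deepseek-chat",
    api_base="https://my-proxy.example.com/v1",
)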

lc_secrets (property)

lc_secrets: dict[str, str]

A map of constructor argument names to secret ids.
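
For ChatDeepSeek this maps the api_key constructor argument to its environment variable; illustratively (assumed shape, not verified output):

.. code-block:: python

llm.lc_secrets
# {'api_key': 'DEEPSEEK_API_KEY'}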

with_structured_output

with_structured_output(
    schema: Optional[_DictOrPydanticClass] = None,
    *,
    method: Literal[
        "function_calling", "json_mode", "json_schema"
    ] = "function_calling",
    include_raw: bool = False,
    strict: Optional[bool] = None,
    **kwargs: Any
) -> Runnable[LanguageModelInput, _DictOrPydantic]

Model wrapper that returns outputs formatted to match the given schema.

Parameters:

schema (Optional[_DictOrPydanticClass], default: None)

    The output schema. Can be passed in as:

      • an OpenAI function/tool schema,
      • a JSON Schema,
      • a TypedDict class (support added in 0.1.20),
      • or a Pydantic class.

    If schema is a Pydantic class, the model output will be a Pydantic instance of that class and the model-generated fields will be validated by the Pydantic class. Otherwise the model output will be a dict and will not be validated. See langchain_core.utils.function_calling.convert_to_openai_tool for more on how to properly specify types and descriptions of schema fields when specifying a Pydantic or TypedDict class.

method (Literal['function_calling', 'json_mode', 'json_schema'], default: 'function_calling')

    The method for steering model generation, one of:

      • 'function_calling': uses DeepSeek's tool-calling features (https://api-docs.deepseek.com/guides/function_calling).
      • 'json_mode': uses DeepSeek's JSON mode feature (https://api-docs.deepseek.com/guides/json_mode).

    Behavior changed in 0.1.3: added support for 'json_mode'.

include_raw (bool, default: False)

    If False, only the parsed structured output is returned, and any error that occurs during model output parsing is raised. If True, both the raw model response (a BaseMessage) and the parsed model response are returned, and any error that occurs during output parsing is caught and returned as well. The final output is always a dict with the keys 'raw', 'parsed', and 'parsing_error'.

strict (Optional[bool], default: None)

    Whether to enable strict schema adherence when generating the function call. This parameter is included for compatibility with other chat models and, if specified, is passed to the Chat Completions API in accordance with the OpenAI API specification. However, the DeepSeek API may ignore it.

kwargs (Any)

    Additional keyword args aren't supported.

Returns:

Runnable[LanguageModelInput, _DictOrPydantic]

    A Runnable that takes the same inputs as a langchain_core.language_models.chat.BaseChatModel.

    If include_raw is False and schema is a Pydantic class, the Runnable outputs an instance of schema (i.e., a Pydantic object). Otherwise, if include_raw is False, the Runnable outputs a dict.

    If include_raw is True, the Runnable outputs a dict with the keys:

      • 'raw': BaseMessage
      • 'parsed': None if there was a parsing error, otherwise the type depends on the schema as described above.
      • 'parsing_error': Optional[BaseException]
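
As a sketch of the include_raw=True shape (reusing the Joke schema from above; the joke content is illustrative):

.. code-block:: python

structured_llm = llm.with_structured_output(Joke, include_raw=True)
result = structured_llm.invoke("Tell me a joke about cats")

result["parsed"]         # Joke instance, or None if parsing failed
result["raw"]            # the underlying AIMessage returned by the model
result["parsing_error"]  # None unless parsing failed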