langchain-deepseek
Reference docs
This page contains reference documentation for DeepSeek. See the docs for conceptual guides, tutorials, and examples on using ChatDeepSeek.
langchain_deepseek
LangChain DeepSeek integration.
ChatDeepSeek
Bases: BaseChatOpenAI
DeepSeek chat model integration to access models hosted in DeepSeek's API.
Setup
Install langchain-deepseek and set the DEEPSEEK_API_KEY environment variable.
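A minimal setup sketch; the getpass prompt is just one way of providing the key, and any method of setting the environment variable works:

```python
# pip install -U langchain-deepseek
import getpass
import os

# Read the API key interactively if it is not already set in the environment.
if not os.environ.get("DEEPSEEK_API_KEY"):
    os.environ["DEEPSEEK_API_KEY"] = getpass.getpass("Enter your DeepSeek API key: ")
```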
Key init args — completion params:

- model: Name of DeepSeek model to use, e.g. 'deepseek-chat'.
- temperature: Sampling temperature.
- max_tokens: Max number of tokens to generate.

Key init args — client params:

- timeout: Timeout for requests.
- max_retries: Max number of retries.
- api_key: DeepSeek API key. If not passed in, will be read from env var DEEPSEEK_API_KEY.

See full list of supported init args and their descriptions in the params section.
Instantiate
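A minimal instantiation sketch using the init args listed above; the parameter values are illustrative, not requirements:

```python
from langchain_deepseek import ChatDeepSeek

model = ChatDeepSeek(
    model="deepseek-chat",
    temperature=0,
    max_tokens=None,
    timeout=None,
    max_retries=2,
    # api_key="...",  # defaults to the DEEPSEEK_API_KEY env var
)
```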
Invoke
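An invocation sketch, assuming the `model` instance from the Instantiate example; the messages are illustrative:

```python
messages = [
    ("system", "You are a helpful translator. Translate the user sentence to French."),
    ("human", "I love programming."),
]
ai_msg = model.invoke(messages)
print(ai_msg.content)
```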
Stream
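A streaming sketch, again assuming the same `model`; chunks arrive incrementally as message chunks:

```python
for chunk in model.stream("Write a haiku about the sea."):
    # Print each chunk's text as it arrives.
    print(chunk.content, end="", flush=True)
```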
Async
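An async sketch using ainvoke (astream and abatch follow the same pattern), assuming the same `model`:

```python
import asyncio


async def main() -> None:
    ai_msg = await model.ainvoke("Briefly explain what DeepSeek is.")
    print(ai_msg.content)


asyncio.run(main())
```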
Tool calling
```python
from pydantic import BaseModel, Field


class GetWeather(BaseModel):
    '''Get the current weather in a given location'''

    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")


class GetPopulation(BaseModel):
    '''Get the current population in a given location'''

    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")


model_with_tools = model.bind_tools([GetWeather, GetPopulation])
ai_msg = model_with_tools.invoke("Which city is hotter today and which is bigger: LA or NY?")
ai_msg.tool_calls
```
See ChatDeepSeek.bind_tools() method for more.
Structured output
```python
from pydantic import BaseModel, Field


class Joke(BaseModel):
    '''Joke to tell user.'''

    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline to the joke")
    rating: int | None = Field(default=None, description="How funny the joke is, from 1 to 10")


structured_model = model.with_structured_output(Joke)
structured_model.invoke("Tell me a joke about cats")
```
See ChatDeepSeek.with_structured_output() for more.
Token usage
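Token counts are reported on the response's usage_metadata; a sketch assuming the `model` from the Instantiate example:

```python
ai_msg = model.invoke("Hello!")
# usage_metadata reports the token counts for the call:
# input_tokens, output_tokens, and total_tokens.
print(ai_msg.usage_metadata)
```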
| METHOD | DESCRIPTION |
|---|---|
| validate_environment | Validate necessary environment vars and client params. |
| bind_tools | Bind tool-like objects to this chat model. |
| with_structured_output | Model wrapper that returns outputs formatted to match the given schema. |
model_name (class-attribute, instance-attribute)

The name of the model.
api_key (class-attribute, instance-attribute)

```python
api_key: SecretStr | None = Field(
    default_factory=secret_from_env("DEEPSEEK_API_KEY", default=None)
)
```

DeepSeek API key.
api_base (class-attribute, instance-attribute)

DeepSeek API base URL.
validate_environment

```python
validate_environment() -> Self
```

Validate necessary environment vars and client params.
bind_tools

```python
bind_tools(
    tools: Sequence[dict[str, Any] | type | Callable | BaseTool],
    *,
    tool_choice: dict | str | bool | None = None,
    strict: bool | None = None,
    parallel_tool_calls: bool | None = None,
    **kwargs: Any,
) -> Runnable[LanguageModelInput, AIMessage]
```

Bind tool-like objects to this chat model.

Overrides parent to use beta endpoint when strict=True.
| PARAMETER | DESCRIPTION |
|---|---|
| tools | A list of tool definitions to bind to this chat model. TYPE: `Sequence[dict[str, Any] \| type \| Callable \| BaseTool]` |
| tool_choice | Which tool to require the model to call. TYPE: `dict \| str \| bool \| None` |
| strict | If True, uses beta API for strict schema validation. TYPE: `bool \| None` |
| parallel_tool_calls | Set to False to disable parallel tool calls. TYPE: `bool \| None` |
| **kwargs | Additional parameters passed to the parent bind_tools. TYPE: `Any` |

| RETURNS | DESCRIPTION |
|---|---|
| `Runnable[LanguageModelInput, AIMessage]` | A Runnable that takes same inputs as a chat model. |
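A short sketch of strict binding, reusing the GetWeather schema and `model` from the tool-calling example earlier on this page; availability of strict mode depends on the DeepSeek endpoint you use:

```python
# strict=True routes the request through the beta endpoint for strict
# schema validation; all object properties must be marked as required.
strict_model = model.bind_tools([GetWeather], strict=True)
ai_msg = strict_model.invoke("What's the weather like in Boston today?")
ai_msg.tool_calls
```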
with_structured_output

```python
with_structured_output(
    schema: _DictOrPydanticClass | None = None,
    *,
    method: Literal[
        "function_calling", "json_mode", "json_schema"
    ] = "function_calling",
    include_raw: bool = False,
    strict: bool | None = None,
    **kwargs: Any,
) -> Runnable[LanguageModelInput, _DictOrPydantic]
```

Model wrapper that returns outputs formatted to match the given schema.
| PARAMETER | DESCRIPTION |
|---|---|
| schema | The output schema. Can be passed in as an OpenAI function/tool schema, a JSON Schema, a TypedDict class, or a Pydantic class. If a Pydantic class is given, the parsed output will be an instance of that class. TYPE: `_DictOrPydanticClass \| None` |
| method | The method for steering model generation, one of: 'function_calling', 'json_mode', or 'json_schema'. TYPE: `Literal['function_calling', 'json_mode', 'json_schema']` |
| include_raw | If False, only the parsed structured output is returned; if an error occurs during model output parsing it will be raised. If True, both the raw model response and the parsed response are returned; if an error occurs during output parsing it will be caught and returned as well. In that case the final output is always a dict with keys 'raw', 'parsed', and 'parsing_error'. TYPE: `bool` |
| strict | Whether to enable strict schema adherence when generating the function call. When set to True, the model output is constrained to match the provided schema exactly. Note: DeepSeek's strict mode requires all object properties to be marked as required in the schema. TYPE: `bool \| None` |
| kwargs | Additional keyword args aren't supported. TYPE: `Any` |
| RETURNS | DESCRIPTION |
|---|---|
| `Runnable[LanguageModelInput, _DictOrPydantic]` | A Runnable that takes the same inputs as a chat model and returns outputs matching the given schema. If include_raw is False and schema is a Pydantic class, the output is an instance of that class; otherwise it is a dict. If include_raw is True, the output is a dict with keys 'raw', 'parsed', and 'parsing_error'. |
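A sketch of include_raw=True, reusing the Joke schema and `model` from the structured output example above; the result is a dict rather than a parsed object:

```python
structured_model = model.with_structured_output(Joke, include_raw=True)
result = structured_model.invoke("Tell me a joke about cats")
# result["raw"] is the underlying AIMessage, result["parsed"] is a Joke
# instance (or None if parsing failed), and result["parsing_error"] holds
# any exception raised while parsing.
result["parsed"]
```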