LangChain Reference
    Python › langchain-classic › chat_models
    Module · Since v1.0

    chat_models

    Chat models are a variation on language models.

    While chat models use language models under the hood, the interface they expose is different: rather than a "text in, text out" API, they expose an interface where chat messages are the inputs and outputs.
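To make the message-based interface concrete, here is a minimal, self-contained sketch in plain Python. The classes below are toy stand-ins invented for illustration; the real message types are `HumanMessage` and `AIMessage` in `langchain_core.messages`, and real chat models implement a much richer interface:

```python
from dataclasses import dataclass

# Toy stand-ins for chat message types; the real ones live in
# langchain_core.messages.
@dataclass
class HumanMessage:
    content: str

@dataclass
class AIMessage:
    content: str

class EchoChatModel:
    """A fake chat model: a list of messages in, one message out."""

    def invoke(self, messages: list) -> AIMessage:
        # Respond to the most recent message in the conversation.
        last = messages[-1].content
        return AIMessage(content=f"You said: {last}")

model = EchoChatModel()
reply = model.invoke([HumanMessage(content="hello")])
print(reply.content)  # You said: hello
```

The point is the shape of the API: `invoke` consumes a sequence of typed messages and produces a message, rather than mapping a string to a string.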

    Functions

    function
    is_interactive_env

    Determine if running within IPython or Jupyter.
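The check itself can be very small. A plausible sketch (the real implementation may differ) relies on `sys.ps2`, the secondary prompt that interactive interpreters such as IPython define and plain scripts do not:

```python
import sys

def is_interactive_env() -> bool:
    """Determine if running within IPython or Jupyter.

    Interactive interpreters define the secondary prompt ``sys.ps2``;
    scripts run non-interactively do not, so its presence is a cheap
    heuristic for an interactive session.
    """
    return hasattr(sys, "ps2")

print(is_interactive_env())  # False when run as a script
```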

    function
    init_chat_model

    Initialize a chat model from any supported provider using a unified interface.

    Two main use cases:

    1. Fixed model – specify the model upfront and get back a ready-to-use chat model.
    2. Configurable model – specify parameters (including the model name) at runtime via config, making it easy to switch between models/providers without changing your code.
    Note

    Requires the integration package for the chosen model provider to be installed.

    See the model_provider parameter below for specific package names (e.g., pip install langchain-openai).

    Refer to the provider integration's API reference for supported model parameters to use as **kwargs.

    Modules

    • gigachat
    • javelin_ai_gateway
    • fake
    • cohere
    • litellm
    • meta
    • google_palm
    • hunyuan
    • mlflow_ai_gateway
    • azureml_endpoint
    • minimax
    • openai
    • pai_eas_endpoint
    • baichuan
    • mlflow
    • anyscale
    • ernie
    • everlyai
    • human
    • fireworks
    • vertexai
    • ollama
    • volcengine_maas
    • bedrock
    • databricks
    • konko
    • tongyi
    • yandex
    • jinachat
    • anthropic
    • baidu_qianfan_endpoint
    • azure_openai
    • promptlayer_openai
    • base
    View source on GitHub