LangChain Reference
Python › langchain-classic › embeddings › base › init_embeddings
Function · Since v1.0

    init_embeddings

    Initialize an embeddings model from a model name and optional provider.

Note

The integration package for the chosen model provider must be installed (e.g. langchain-openai for OpenAI models).

    init_embeddings(
      model: str,
      *,
      provider: str | None = None,
      **kwargs: Any = {}
    ) -> Embeddings | Runnable[Any, list[float]]
    Example Usage
    # Using a model string
    model = init_embeddings("openai:text-embedding-3-small")
    model.embed_query("Hello, world!")
    
    # Using explicit provider
    model = init_embeddings(model="text-embedding-3-small", provider="openai")
    model.embed_documents(["Hello, world!", "Goodbye, world!"])
    
    # With additional parameters
    model = init_embeddings("openai:text-embedding-3-small", api_key="sk-...")
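Whichever provider is selected, the returned object exposes the standard Embeddings interface: embed_query returns a single vector for one text, and embed_documents returns one vector per input text. A minimal stand-in illustrating that contract (FakeEmbeddings and its vector values are invented for illustration; real providers return fixed-dimension vectors from their model):

```python
# Sketch of the Embeddings contract that init_embeddings returns.
# FakeEmbeddings is a made-up stand-in, not a real provider integration.
class FakeEmbeddings:
    def embed_query(self, text: str) -> list[float]:
        # Real providers return a fixed-dimension float vector;
        # here we fake one from simple text statistics.
        return [float(len(text)), float(text.count(" "))]

    def embed_documents(self, texts: list[str]) -> list[list[float]]:
        # One vector per input document.
        return [self.embed_query(t) for t in texts]

emb = FakeEmbeddings()
vec = emb.embed_query("Hello, world!")
vecs = emb.embed_documents(["Hello, world!", "Goodbye, world!"])
```

Code written against this interface works the same whether the model behind it is an OpenAI, Ollama, or any other supported embedding model.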

    Used in Docs

    • Evaluate a complex agent
    • How to use a custom store
    • Memory
    • Persistence
    • What's new in LangChain v1

    Parameters

model (str), required

    Name of the model to use.

    Can be either:

    • A model string like "openai:text-embedding-3-small"
    • Just the model name if the provider is specified separately or can be inferred.

    See supported providers under the provider arg description.

provider (str | None), default None

Optional explicit provider name. If not specified, the provider is inferred from the model string passed as the model argument.

    Supported providers:

    • openai -> langchain-openai
    • azure_openai -> langchain-openai
    • bedrock -> langchain-aws
    • cohere -> langchain-cohere
    • google_genai -> langchain-google-genai
    • google_vertexai -> langchain-google-vertexai
    • huggingface -> langchain-huggingface
    • mistralai -> langchain-mistralai
    • ollama -> langchain-ollama
**kwargs (Any), default {}

Additional model-specific parameters passed to the embedding model. These vary by provider; see the provider-specific documentation for details.

    View source on GitHub