# init_embeddings

> **Function** in `langchain_classic`

📖 [View in docs](https://reference.langchain.com/python/langchain-classic/embeddings/base/init_embeddings)

Initialize an embeddings model from a model name and optional provider.

!!! note
    Must have the integration package corresponding to the model provider
    installed.

## Signature

```python
init_embeddings(
    model: str,
    *,
    provider: str | None = None,
    **kwargs: Any,
) -> Embeddings | Runnable[Any, list[float]]
```

## Description

???+ note "Example Usage"

    ```python
    # Using a model string
    model = init_embeddings("openai:text-embedding-3-small")
    model.embed_query("Hello, world!")

    # Using explicit provider
    model = init_embeddings(model="text-embedding-3-small", provider="openai")
    model.embed_documents(["Hello, world!", "Goodbye, world!"])

    # With additional parameters
    model = init_embeddings("openai:text-embedding-3-small", api_key="sk-...")
    ```

!!! version-added "Added in `langchain` 0.3.9"

## Parameters

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `model` | `str` | Yes | Name of the model to use. Can be either:<br>- A full model string such as `"openai:text-embedding-3-small"`<br>- Just the model name, if the provider is specified separately or can be inferred<br>See supported providers under the `provider` description. |
| `provider` | `str \| None` | No | Optional explicit provider name. If not specified, the provider is parsed from the `model` string. Supported providers:<br>- `openai` -> [`langchain-openai`](https://docs.langchain.com/oss/python/integrations/providers/openai)<br>- `azure_ai` -> [`langchain-azure-ai`](https://docs.langchain.com/oss/python/integrations/providers/microsoft)<br>- `azure_openai` -> [`langchain-openai`](https://docs.langchain.com/oss/python/integrations/providers/openai)<br>- `bedrock` -> [`langchain-aws`](https://docs.langchain.com/oss/python/integrations/providers/aws)<br>- `cohere` -> [`langchain-cohere`](https://docs.langchain.com/oss/python/integrations/providers/cohere)<br>- `google_genai` -> [`langchain-google-genai`](https://docs.langchain.com/oss/python/integrations/providers/google)<br>- `google_vertexai` -> [`langchain-google-vertexai`](https://docs.langchain.com/oss/python/integrations/providers/google)<br>- `huggingface` -> [`langchain-huggingface`](https://docs.langchain.com/oss/python/integrations/providers/huggingface)<br>- `mistralai` -> [`langchain-mistralai`](https://docs.langchain.com/oss/python/integrations/providers/mistralai)<br>- `ollama` -> [`langchain-ollama`](https://docs.langchain.com/oss/python/integrations/providers/ollama)<br>(default: `None`) |
| `**kwargs` | `Any` | No | Additional model-specific parameters passed to the embedding model. These vary by provider; see the provider-specific documentation for details. |

## Returns

`Embeddings | Runnable[Any, list[float]]`

An `Embeddings` instance that can generate embeddings for text.
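To illustrate the shape of that interface without calling a live provider, here is a minimal stand-in with the same two methods, `embed_query` and `embed_documents`. The hash-based vectors are purely illustrative and carry no semantic meaning; a real provider model returns learned embeddings:

```python
import hashlib

# Illustrative stub mimicking the Embeddings interface shape.
# The vectors are derived from a SHA-256 digest and are meaningless;
# this only demonstrates the call signatures and return types.
class FakeEmbeddings:
    def __init__(self, dim: int = 4) -> None:
        self.dim = dim

    def embed_query(self, text: str) -> list[float]:
        """Embed a single piece of text into a fixed-size float vector."""
        digest = hashlib.sha256(text.encode()).digest()
        return [b / 255 for b in digest[: self.dim]]

    def embed_documents(self, texts: list[str]) -> list[list[float]]:
        """Embed a batch of texts, one vector per input."""
        return [self.embed_query(t) for t in texts]

model = FakeEmbeddings()
print(len(model.embed_query("Hello, world!")))  # 4
print(len(model.embed_documents(["Hello, world!", "Goodbye, world!"])))  # 2
```

Any object returned by `init_embeddings` supports the same two calls, as shown in the usage example above.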

---

[View source on GitHub](https://github.com/langchain-ai/langchain/blob/9f232caa7a8fe1ca042a401942d5d90d54ceb1a6/libs/langchain/langchain_classic/embeddings/base.py#L130)