# init_embeddings

> **Function** in `langchain`

📖 [View in docs](https://reference.langchain.com/python/langchain/embeddings/base/init_embeddings)

Initialize an embedding model from a model name and optional provider.

!!! note

    Requires the integration package for the chosen model provider to be installed.

    See the `model_provider` parameter below for specific package names
    (e.g., `pip install langchain-openai`).

    Refer to the [provider integration's API reference](https://docs.langchain.com/oss/python/integrations/providers)
    for supported model parameters to use as `**kwargs`.

## Signature

```python
init_embeddings(
    model: str,
    *,
    provider: str | None = None,
    **kwargs: Any,
) -> Embeddings
```

## Description

???+ example

    ```python
    # pip install langchain langchain-openai

    # Using a model string
    model = init_embeddings("openai:text-embedding-3-small")
    model.embed_query("Hello, world!")

    # Using explicit provider
    model = init_embeddings(model="text-embedding-3-small", provider="openai")
    model.embed_documents(["Hello, world!", "Goodbye, world!"])

    # With additional parameters
    model = init_embeddings("openai:text-embedding-3-small", api_key="sk-...")
    ```

!!! version-added "Added in `langchain` 0.3.9"

## Parameters

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `model` | `str` | Yes | The name of the model, e.g. `'text-embedding-3-small'`. You can also specify the model and provider together in a single argument using the `'{model_provider}:{model}'` format, e.g. `'openai:text-embedding-3-small'`. |
| `provider` | `str \| None` | No | The model provider, if not specified as part of the `model` arg (see above). Supported `provider` values and their integration packages:<br>- `openai` -> [`langchain-openai`](https://docs.langchain.com/oss/python/integrations/providers/openai)<br>- `azure_ai` -> [`langchain-azure-ai`](https://docs.langchain.com/oss/python/integrations/providers/microsoft)<br>- `azure_openai` -> [`langchain-openai`](https://docs.langchain.com/oss/python/integrations/providers/openai)<br>- `bedrock` -> [`langchain-aws`](https://docs.langchain.com/oss/python/integrations/providers/aws)<br>- `cohere` -> [`langchain-cohere`](https://docs.langchain.com/oss/python/integrations/providers/cohere)<br>- `google_vertexai` -> [`langchain-google-vertexai`](https://docs.langchain.com/oss/python/integrations/providers/google)<br>- `huggingface` -> [`langchain-huggingface`](https://docs.langchain.com/oss/python/integrations/providers/huggingface)<br>- `mistralai` -> [`langchain-mistralai`](https://docs.langchain.com/oss/python/integrations/providers/mistralai)<br>- `ollama` -> [`langchain-ollama`](https://docs.langchain.com/oss/python/integrations/providers/ollama)<br>(default: `None`) |
| `**kwargs` | `Any` | No | Additional model-specific parameters passed to the embedding model. These vary by provider; refer to the specific provider's [integration reference](https://reference.langchain.com/python/integrations/) for all available parameters. |

## Returns

`Embeddings`

An `Embeddings` instance that can generate embeddings for text.
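To make the interface shape concrete without calling a real provider, here is a toy stand-in (purely illustrative, not a real embedding model): `embed_query` returns a single vector as `list[float]`, and `embed_documents` returns one vector per input text.

```python
class ToyEmbeddings:
    """Illustrative stand-in mimicking the Embeddings interface shape."""

    def __init__(self, size: int = 4):
        self.size = size

    def embed_query(self, text: str) -> list[float]:
        # Deterministic dummy vector; a real model returns learned embeddings.
        return [float(len(text))] * self.size

    def embed_documents(self, texts: list[str]) -> list[list[float]]:
        return [self.embed_query(t) for t in texts]


emb = ToyEmbeddings()
vector = emb.embed_query("Hello, world!")          # one list[float] of length 4
vectors = emb.embed_documents(["a", "bb"])         # one vector per document
```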

---

[View source on GitHub](https://github.com/langchain-ai/langchain/blob/8fec4e7ceee2c368b068c49f9fed453276e210e7/libs/langchain_v1/langchain/embeddings/base.py#L191)