# GoogleGenerativeAIEmbeddings

> **Class** in `langchain_google_genai`

📖 [View in docs](https://reference.langchain.com/python/langchain-google-genai/embeddings/GoogleGenerativeAIEmbeddings)

Google Generative AI Embeddings.

!!! warning "Text-only"

    While `gemini-embedding-2-preview` natively supports multimodal inputs
    (text, images, video, audio, and PDFs) via the Google GenAI SDK, the
    LangChain `Embeddings` interface (`embed_query` / `embed_documents`)
    currently accepts only text. For multimodal embedding use cases, use the
    Google GenAI SDK directly.

## Signature

```python
GoogleGenerativeAIEmbeddings()
```

## Description

**Setup:**

!!! version-added "Vertex AI Platform Support"

    Added in `langchain-google-genai` 4.0.0.

    `GoogleGenerativeAIEmbeddings` now supports both the **Gemini Developer
    API** and **Vertex AI Platform** as backend options.

**For Gemini Developer API** (simplest):

1. Set the `GOOGLE_API_KEY` environment variable (recommended), or
2. Pass your API key using the `google_api_key` kwarg

**For Vertex AI**:

Set `vertexai=True` and provide `project` (and optionally `location`).

**Example:**

```python
from langchain_google_genai import GoogleGenerativeAIEmbeddings

# Gemini Developer API
embeddings = GoogleGenerativeAIEmbeddings(
    model="gemini-embedding-2-preview"
)
embeddings.embed_query("What's our Q1 revenue?")

# Vertex AI
embeddings = GoogleGenerativeAIEmbeddings(
    model="gemini-embedding-2-preview",
    project="my-project",
    vertexai=True,
)
```

**Automatic backend detection** (when `vertexai=None` / unspecified):

1. If the `GOOGLE_GENAI_USE_VERTEXAI` env var is set, that value is used
2. If the `credentials` parameter is provided, Vertex AI is used
3. If the `project` parameter is provided, Vertex AI is used
4. Otherwise, the Gemini Developer API is used
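The detection order can be sketched in plain Python. This is an illustrative helper mirroring the documented rules, not the library's actual implementation; the function name `pick_backend` is hypothetical:

```python
import os

def pick_backend(vertexai=None, credentials=None, project=None):
    """Illustrative sketch of the documented backend-detection order."""
    if vertexai is not None:  # an explicit flag always wins
        return "vertexai" if vertexai else "gemini"
    env = os.environ.get("GOOGLE_GENAI_USE_VERTEXAI")
    if env is not None:  # 1. env var, if set
        return "vertexai" if env.lower() == "true" else "gemini"
    if credentials is not None:  # 2. explicit credentials imply Vertex AI
        return "vertexai"
    if project is not None:  # 3. an explicit project implies Vertex AI
        return "vertexai"
    return "gemini"  # 4. default: Gemini Developer API
```

For instance, passing only `project="my-project"` selects Vertex AI, while passing nothing (and setting no env vars) selects the Gemini Developer API.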

**Environment variables:**

| Variable | Purpose | Backend |
|----------|---------|---------|
| `GOOGLE_API_KEY` | API key (primary) | Both |
| `GEMINI_API_KEY` | API key (fallback) | Both |
| `GOOGLE_GENAI_USE_VERTEXAI` | Force Vertex AI (`true`/`false`) | Vertex AI |
| `GOOGLE_CLOUD_PROJECT` | GCP project ID | Vertex AI |
| `GOOGLE_CLOUD_LOCATION` | GCP region (default: `us-central1`) | Vertex AI |
| `HTTPS_PROXY` | HTTP/HTTPS proxy URL | Both |
| `SSL_CERT_FILE` | Custom SSL certificate file | Both |

`GOOGLE_API_KEY` is checked first for backwards compatibility. (`GEMINI_API_KEY`
was introduced later to better reflect the API's branding.)
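Putting the table together, a typical shell setup for each backend might look like the following (all values are placeholders):

```shell
# Gemini Developer API
export GOOGLE_API_KEY='your-api-key'

# Vertex AI
export GOOGLE_GENAI_USE_VERTEXAI=true
export GOOGLE_CLOUD_PROJECT='my-project'
export GOOGLE_CLOUD_LOCATION='us-central1'
```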

**Proxy configuration:**

Set these environment variables before initializing the embeddings model:

```bash
export HTTPS_PROXY='http://username:password@proxy_uri:port'
export SSL_CERT_FILE='path/to/cert.pem'  # Optional: custom SSL certificate
```

For SOCKS5 proxies or advanced proxy configuration, use the `client_args`
parameter:

```python
embeddings = GoogleGenerativeAIEmbeddings(
    model="gemini-embedding-2-preview",
    client_args={"proxy": "socks5://user:pass@host:port"},
)
```

## Extends

- `BaseModel`
- `Embeddings`

## Properties

- `client`
- `model`
- `task_type`
- `google_api_key`
- `credentials`
- `vertexai`
- `project`
- `location`
- `base_url`
- `additional_headers`
- `client_args`
- `request_options`
- `output_dimensionality`
- `model_config`

## Methods

- [`embed_documents()`](https://reference.langchain.com/python/langchain-google-genai/embeddings/GoogleGenerativeAIEmbeddings/embed_documents)
- [`embed_query()`](https://reference.langchain.com/python/langchain-google-genai/embeddings/GoogleGenerativeAIEmbeddings/embed_query)
- [`aembed_documents()`](https://reference.langchain.com/python/langchain-google-genai/embeddings/GoogleGenerativeAIEmbeddings/aembed_documents)
- [`aembed_query()`](https://reference.langchain.com/python/langchain-google-genai/embeddings/GoogleGenerativeAIEmbeddings/aembed_query)

---

[View source on GitHub](https://github.com/langchain-ai/langchain-google/blob/a3f016b2a6c4af535df275545f76fa7424aa39e5/libs/genai/langchain_google_genai/embeddings.py#L23)