# wrap_gemini

> **Function** in `langsmith`

📖 [View in docs](https://reference.langchain.com/python/langsmith/wrappers/_gemini/wrap_gemini)

Patch the Google Gen AI client to make it traceable.

!!! warning

    **BETA**: This wrapper is in beta.

## Signature

```python
wrap_gemini(
    client: C,
    *,
    tracing_extra: Optional[TracingExtra] = None,
    chat_name: str = 'ChatGoogleGenerativeAI',
) -> C
```

## Description

**Supports:**

- `generate_content` and `generate_content_stream` methods
- Sync and async clients
- Streaming and non-streaming responses
- Tool/function calling with proper UI rendering
- Multimodal inputs (text + images)
- Image generation with `inline_data` support
- Token usage tracking including reasoning tokens

**Example:**

```python
from google import genai
from google.genai import types
from langsmith import wrappers

# Use the Google Gen AI client the same way you normally would.
client = wrappers.wrap_gemini(genai.Client(api_key="your-api-key"))

# Basic text generation:
response = client.models.generate_content(
    model="gemini-2.5-flash",
    contents="Why is the sky blue?",
)
print(response.text)

# Streaming:
for chunk in client.models.generate_content_stream(
    model="gemini-2.5-flash",
    contents="Tell me a story",
):
    print(chunk.text, end="")

# Tool/Function calling:
schedule_meeting_function = {
    "name": "schedule_meeting",
    "description": "Schedules a meeting with specified attendees.",
    "parameters": {
        "type": "object",
        "properties": {
            "attendees": {"type": "array", "items": {"type": "string"}},
            "date": {"type": "string"},
            "time": {"type": "string"},
            "topic": {"type": "string"},
        },
        "required": ["attendees", "date", "time", "topic"],
    },
}

tools = types.Tool(function_declarations=[schedule_meeting_function])
config = types.GenerateContentConfig(tools=[tools])

response = client.models.generate_content(
    model="gemini-2.5-flash",
    contents="Schedule a meeting with Bob and Alice tomorrow at 2 PM.",
    config=config,
)

# Image generation:
response = client.models.generate_content(
    model="gemini-2.5-flash-image",
    contents=["Create a picture of a futuristic city"],
)

# Save generated image
from io import BytesIO
from PIL import Image

for part in response.candidates[0].content.parts:
    if part.inline_data is not None:
        image = Image.open(BytesIO(part.inline_data.data))
        image.save("generated_image.png")
```

!!! version-added "Added in `langsmith` 0.4.33"

    Initial beta release of the Google Gemini wrapper.

## Parameters

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `client` | `C` | Yes | The Google Gen AI client to patch. |
| `tracing_extra` | `Optional[TracingExtra]` | No | Extra tracing information. (default: `None`) |
| `chat_name` | `str` | No | The run name for the chat endpoint. (default: `'ChatGoogleGenerativeAI'`) |

## Returns

`C`

The patched client.

---

[View source on GitHub](https://github.com/langchain-ai/langsmith-sdk/blob/fcda9320ff067c3d3857e9e3d088fc1eb0643fc4/python/langsmith/wrappers/_gemini.py#L517)