# wrap_openai

> **Function** in `langsmith`

📖 [View in docs](https://reference.langchain.com/python/langsmith/wrappers/_openai/wrap_openai)

Patch the OpenAI client to make it traceable.

## Signature

```python
wrap_openai(
    client: C,
    *,
    tracing_extra: Optional[TracingExtra] = None,
    chat_name: str = 'ChatOpenAI',
    completions_name: str = 'OpenAI',
) -> C
```

## Description

**Supports:**

- Chat and Responses APIs
- Sync and async OpenAI clients
- `create` and `parse` methods
- With and without streaming
- `with_raw_response` API for accessing HTTP headers

**Example:**

```python
import openai
from langsmith import wrappers

# Use the OpenAI client just as you normally would.
client = wrappers.wrap_openai(openai.OpenAI())

# Chat API:
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {
        "role": "user",
        "content": "What physics breakthroughs do you predict will happen by 2300?",
    },
]
completion = client.chat.completions.create(
    model="gpt-4o-mini", messages=messages
)
print(completion.choices[0].message.content)

# Responses API:
response = client.responses.create(
    model="gpt-4o-mini",
    input=messages,
)
print(response.output_text)

# With raw response to access headers:
raw_response = client.chat.completions.with_raw_response.create(
    model="gpt-4o-mini", messages=messages
)
print(raw_response.headers)  # Access HTTP headers
completion = raw_response.parse()  # Get parsed response
```

!!! warning "Behavior changed in `langsmith` 0.3.16"

    Support for Responses API added.

!!! warning "Behavior changed in `langsmith` 0.3.x"

    Support for `with_raw_response` API added.

## Parameters

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `client` | `C` | Yes | The client to patch. |
| `tracing_extra` | `Optional[TracingExtra]` | No | Extra tracing information. (default: `None`) |
| `chat_name` | `str` | No | The run name for the chat completions endpoint. (default: `'ChatOpenAI'`) |
| `completions_name` | `str` | No | The run name for the completions endpoint. (default: `'OpenAI'`) |

## Returns

`C`

The patched client.

---

[View source on GitHub](https://github.com/langchain-ai/langsmith-sdk/blob/19dc497a3d89638e4cc35db72ea1c29cad35cbbf/python/langsmith/wrappers/_openai.py#L431)