LangChain Reference
Python › langsmith › wrappers_openai › wrap_openai
Function · Since v0.0

wrap_openai

Patch the OpenAI client to make it traceable.

wrap_openai(
  client: C,
  *,
  tracing_extra: Optional[TracingExtra] = None,
  chat_name: str = 'ChatOpenAI',
  completions_name: str = 'OpenAI'
) -> C

Supports:

  • Chat and Responses APIs
  • Sync and async OpenAI clients
  • create and parse methods
  • With and without streaming
  • with_raw_response API for accessing HTTP headers

Example:

import openai
from langsmith import wrappers

# Use the OpenAI client just as you normally would.
client = wrappers.wrap_openai(openai.OpenAI())

# Chat API:
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {
        "role": "user",
        "content": "What physics breakthroughs do you predict will happen by 2300?",
    },
]
completion = client.chat.completions.create(
    model="gpt-4o-mini", messages=messages
)
print(completion.choices[0].message.content)

# Responses API (takes `input` rather than `messages`):
response = client.responses.create(
    model="gpt-4o-mini",
    input=messages,
)
print(response.output_text)

# With raw response to access headers:
raw_response = client.chat.completions.with_raw_response.create(
    model="gpt-4o-mini", messages=messages
)
print(raw_response.headers)  # Access HTTP headers
completion = raw_response.parse()  # Get parsed response
Behavior changed in langsmith 0.3.16

Support for Responses API added.

Behavior changed in langsmith 0.3.x

Support for with_raw_response API added.

Used in Docs

  • Add metadata and tags to traces
  • Configure threads
  • Custom instrumentation
  • How to simulate multi-turn interactions
  • Log multimodal traces

Parameters

  client : C (required)
      The client to patch.

  tracing_extra : Optional[TracingExtra], default None
      Extra tracing information.

  chat_name : str, default 'ChatOpenAI'
      The run name for the chat completions endpoint.

  completions_name : str, default 'OpenAI'
      The run name for the completions endpoint.

View source on GitHub