LangChain Reference
Function · Since v0.3

wrap_anthropic

Patch the Anthropic client to make it traceable, returning the same client instance with tracing enabled.

wrap_anthropic(
  client: C,
  *,
  tracing_extra: Optional[TracingExtra] = None,
  chat_name: str = 'ChatAnthropic',
  completions_name: str = 'Anthropic'
) -> C

Example:

import anthropic
from langsmith import wrappers

client = wrappers.wrap_anthropic(anthropic.Anthropic())

# Use the Anthropic client just as you normally would:
system = "You are a helpful assistant."
messages = [
    {
        "role": "user",
        "content": "What physics breakthroughs do you predict will happen by 2300?",
    }
]
completion = client.messages.create(
    model="claude-3-5-sonnet-latest",
    messages=messages,
    max_tokens=1000,
    system=system,
)
print(completion.content)

# With raw response to access headers:
raw_response = client.messages.with_raw_response.create(
    model="claude-3-5-sonnet-latest",
    messages=messages,
    max_tokens=1000,
    system=system,
)
print(raw_response.headers)  # Access HTTP headers
message = raw_response.parse()  # Get parsed response

# You can also use the streaming context manager:
with client.messages.stream(
    model="claude-3-5-sonnet-latest",
    messages=messages,
    max_tokens=1000,
    system=system,
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
    message = stream.get_final_message()

Parameters

client : C (required)
    The client to patch.

tracing_extra : Optional[TracingExtra]
    Default: None. Extra tracing information.

chat_name : str
    Default: 'ChatAnthropic'. The run name for the messages endpoint.

completions_name : str
    Default: 'Anthropic'. The run name for the completions endpoint.
