# Client

> **Class** in `langsmith`

📖 [View in docs](https://reference.langchain.com/python/langsmith/client/Client)

Client for interacting with the LangSmith API.

## Signature

```python
Client(
    self,
    api_url: Optional[str] = None,
    *,
    api_key: Optional[str] = None,
    retry_config: Optional[Retry] = None,
    timeout_ms: Optional[Union[int, tuple[int, int]]] = None,
    web_url: Optional[str] = None,
    session: Optional[requests.Session] = None,
    auto_batch_tracing: bool = True,
    anonymizer: Optional[Callable[[dict], dict]] = None,
    hide_inputs: Optional[Union[Callable[[dict], dict], bool]] = None,
    hide_outputs: Optional[Union[Callable[[dict], dict], bool]] = None,
    hide_metadata: Optional[Union[Callable[[dict], dict], bool]] = None,
    omit_traced_runtime_info: bool = False,
    process_buffered_run_ops: Optional[Callable[[Sequence[dict]], Sequence[dict]]] = None,
    run_ops_buffer_size: Optional[int] = None,
    run_ops_buffer_timeout_ms: Optional[float] = None,
    info: Optional[Union[dict, ls_schemas.LangSmithInfo]] = None,
    api_urls: Optional[dict[str, str]] = None,
    otel_tracer_provider: Optional[TracerProvider] = None,
    otel_enabled: Optional[bool] = None,
    tracing_sampling_rate: Optional[float] = None,
    workspace_id: Optional[str] = None,
    max_batch_size_bytes: Optional[int] = None,
    headers: Optional[dict[str, str]] = None,
    tracing_error_callback: Optional[Callable[[Exception], None]] = None,
    disable_prompt_cache: bool = False,
    cache: Optional[Union[bool, PromptCache]] = None,
)
```

## Parameters

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `api_url` | `Optional[str]` | No | URL for the LangSmith API.  Defaults to the `LANGCHAIN_ENDPOINT` environment variable or `https://api.smith.langchain.com` if not set. (default: `None`) |
| `api_key` | `Optional[str]` | No | API key for the LangSmith API.  Defaults to the `LANGCHAIN_API_KEY` environment variable. (default: `None`) |
| `retry_config` | `Optional[Retry]` | No | Retry configuration for the `HTTPAdapter`. (default: `None`) |
| `timeout_ms` | `Optional[Union[int, tuple[int, int]]]` | No | Timeout for the `HTTPAdapter`.  Can also be a 2-tuple of `(connect timeout, read timeout)` to set them separately. (default: `None`) |
| `web_url` | `Optional[str]` | No | URL for the LangSmith web app.  Default is auto-inferred from the `ENDPOINT`. (default: `None`) |
| `session` | `Optional[requests.Session]` | No | The session to use for requests.  If `None`, a new session will be created. (default: `None`) |
| `auto_batch_tracing` | `bool` | No | Whether to automatically batch tracing. (default: `True`) |
| `anonymizer` | `Optional[Callable[[dict], dict]]` | No | A function applied for masking serialized run inputs and outputs, before sending to the API. (default: `None`) |
| `hide_inputs` | `Optional[Union[Callable[[dict], dict], bool]]` | No | Whether to hide run inputs when tracing with this client.  If `True`, hides the entire inputs.  If a function, applied to all run inputs when creating runs. (default: `None`) |
| `hide_outputs` | `Optional[Union[Callable[[dict], dict], bool]]` | No | Whether to hide run outputs when tracing with this client.  If `True`, hides the entire outputs.  If a function, applied to all run outputs when creating runs. (default: `None`) |
| `hide_metadata` | `Optional[Union[Callable[[dict], dict], bool]]` | No | Whether to hide run metadata when tracing with this client.  If `True`, hides the entire metadata.  If a function, applied to all run metadata when creating runs. (default: `None`) |
| `omit_traced_runtime_info` | `bool` | No | Whether to omit runtime information from traced runs.  If `True`, runtime information (SDK version, platform, Python version, etc.) will not be stored in the `extra.runtime` field of runs.  Defaults to `False`. (default: `False`) |
| `process_buffered_run_ops` | `Optional[Callable[[Sequence[dict]], Sequence[dict]]]` | No | A function applied to buffered run operations, allowing modification of the raw run dicts before they are converted to multipart and compressed. Useful for high-throughput tracing where you need to apply a rate-limited API or other costly process to the runs before they are sent to the API. Note that the buffer only flushes automatically when `run_ops_buffer_size` is reached, or when a new run is added after `run_ops_buffer_timeout_ms` has elapsed; outside these conditions it will not flush unless you call `client.flush()` manually, so be sure to do so before your code exits. (default: `None`) |
| `run_ops_buffer_size` | `Optional[int]` | No | Maximum number of run operations to collect in the buffer before applying `process_buffered_run_ops` and sending to the API.  Required when `process_buffered_run_ops` is provided. (default: `None`) |
| `run_ops_buffer_timeout_ms` | `Optional[float]` | No | Maximum time in milliseconds to wait before flushing the run ops buffer when new runs are added.  Defaults to `5000`.  Only used when `process_buffered_run_ops` is provided. (default: `None`) |
| `info` | `Optional[Union[dict, ls_schemas.LangSmithInfo]]` | No | The information about the LangSmith API.  If not provided, it will be fetched from the API. (default: `None`) |
| `api_urls` | `Optional[dict[str, str]]` | No | A dictionary of write API URLs and their corresponding API keys.  Useful for multi-tenant setups.  Data is only read from the first URL in the dictionary. However, ONLY Runs are written (`POST` and `PATCH`) to all URLs in the dictionary. Feedback, sessions, datasets, examples, annotation queues and evaluation results are only written to the first. (default: `None`) |
| `otel_tracer_provider` | `Optional[TracerProvider]` | No | Optional tracer provider for OpenTelemetry integration.  If not provided, a LangSmith-specific tracer provider will be used. (default: `None`) |
| `otel_enabled` | `Optional[bool]` | No | Whether to enable exporting traces via OpenTelemetry for this client. If not provided, falls back to the environment configuration. (default: `None`) |
| `tracing_sampling_rate` | `Optional[float]` | No | The sampling rate for tracing.  If provided, overrides the `LANGCHAIN_TRACING_SAMPLING_RATE` environment variable.  Should be a float between `0` and `1`, where `1` means trace everything and `0` means trace nothing. (default: `None`) |
| `workspace_id` | `Optional[str]` | No | The workspace ID.  Required for org-scoped API keys. (default: `None`) |
| `max_batch_size_bytes` | `Optional[int]` | No | The maximum size of a batch of runs in bytes.  If not provided, the default is set by the server. (default: `None`) |
| `headers` | `Optional[dict[str, str]]` | No | Additional HTTP headers to include in all requests.  These headers will be merged with the default headers (User-Agent, Accept, x-api-key, etc.). Custom headers will not override the default required headers. (default: `None`) |
| `tracing_error_callback` | `Optional[Callable[[Exception], None]]` | No | Optional callback function to handle errors.  Called when exceptions occur during tracing operations. (default: `None`) |
| `disable_prompt_cache` | `bool` | No | Disable prompt caching for this client. By default, prompt caching is enabled globally using a singleton cache; set this to `True` to disable caching for this specific client instance. To configure the global cache, use `configure_global_prompt_cache()`. (default: `False`) |
| `cache` | `Optional[Union[bool, PromptCache]]` | No | **[Deprecated]** Control prompt caching behavior. Use `configure_global_prompt_cache()` to configure caching, or `disable_prompt_cache=True` to disable it. `True` enables caching with the global singleton (the default behavior), `False` disables caching (equivalent to `disable_prompt_cache=True`), and a `Cache(...)`/`PromptCache(...)` instance uses that custom cache. (default: `None`) |

### Prompt caching examples

```python
from langsmith import Client, Cache, configure_global_prompt_cache

# New API (recommended)
client = Client()  # use the global cache (default)
client_no_cache = Client(disable_prompt_cache=True)  # disable caching for this client
configure_global_prompt_cache(max_size=200, ttl_seconds=7200)  # configure global cache settings

# Old `cache` parameter (deprecated but still supported)
client = Client(cache=True)   # use the global cache
client = Client(cache=False)  # disable caching

# Use a custom cache instance
my_cache = Cache(max_size=100, ttl_seconds=3600)
client = Client(cache=my_cache)
```

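The `hide_inputs`, `hide_outputs`, `hide_metadata`, and `anonymizer` parameters all accept a plain `dict -> dict` callable. As a minimal sketch of such a masking function (the `SENSITIVE_KEYS` set and `redact` helper are illustrative names, not part of the SDK), a recursive redactor might look like:

```python
from typing import Any

# Hypothetical list of keys to mask before payloads are sent to the API.
SENSITIVE_KEYS = {"api_key", "password", "ssn"}

def redact(payload: dict) -> dict:
    """Return a copy of ``payload`` with sensitive values masked."""
    def _mask(value: Any) -> Any:
        if isinstance(value, dict):
            return {
                k: "[REDACTED]" if k in SENSITIVE_KEYS else _mask(v)
                for k, v in value.items()
            }
        if isinstance(value, list):
            return [_mask(v) for v in value]
        return value

    return _mask(payload)

# Wiring it into the client (sketch):
# client = Client(hide_inputs=redact, hide_outputs=redact, anonymizer=redact)

print(redact({"question": "hi", "password": "s3cret", "meta": {"ssn": "123"}}))
# {'question': 'hi', 'password': '[REDACTED]', 'meta': {'ssn': '[REDACTED]'}}
```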
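Similarly, `process_buffered_run_ops` receives the buffered raw run dicts and must return a sequence of the same shape. A sketch under the assumption that oversized `inputs`/`outputs` payloads should be dropped before compression (`MAX_FIELD_BYTES` and `truncate_large_fields` are hypothetical names):

```python
from typing import Sequence
import json

# Hypothetical per-field size budget, in bytes of serialized JSON.
MAX_FIELD_BYTES = 1_000

def truncate_large_fields(ops: Sequence[dict]) -> Sequence[dict]:
    """Replace oversized inputs/outputs before the batch is compressed and sent."""
    processed = []
    for op in ops:
        op = dict(op)  # shallow copy; avoid mutating the buffered dicts in place
        for field in ("inputs", "outputs"):
            value = op.get(field)
            if value is not None and len(json.dumps(value)) > MAX_FIELD_BYTES:
                op[field] = {"truncated": True}
        processed.append(op)
    return processed

# Wiring it up (sketch; a buffer size is required alongside the hook):
# client = Client(
#     process_buffered_run_ops=truncate_large_fields,
#     run_ops_buffer_size=100,
#     run_ops_buffer_timeout_ms=5000,
# )
# ... trace runs ...
# client.flush()  # the buffer does not flush on exit by itself
```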
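Finally, `tracing_error_callback` is an ordinary `Exception -> None` callable. A minimal sketch that records tracing failures without interrupting the traced application (the `errors` list and `on_tracing_error` name are illustrative):

```python
import logging

logger = logging.getLogger("tracing")

# Collected tracing failures, e.g. for later inspection or metrics.
errors: list[Exception] = []

def on_tracing_error(exc: Exception) -> None:
    """Record a tracing failure; never raise from inside the callback."""
    errors.append(exc)
    logger.warning("tracing error: %s", exc)

# client = Client(tracing_error_callback=on_tracing_error)
```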
## Properties

- `tracing_sample_rate`
- `api_url`
- `api_key`
- `retry_config`
- `timeout_ms`
- `session`
- `compressed_traces`
- `otel_exporter`
- `tracing_queue`
- `workspace_id`
- `headers`
- `info`

## Methods

- [`request_with_retries()`](https://reference.langchain.com/python/langsmith/client/Client/request_with_retries)
- [`upload_dataframe()`](https://reference.langchain.com/python/langsmith/client/Client/upload_dataframe)
- [`upload_csv()`](https://reference.langchain.com/python/langsmith/client/Client/upload_csv)
- [`create_run()`](https://reference.langchain.com/python/langsmith/client/Client/create_run)
- [`batch_ingest_runs()`](https://reference.langchain.com/python/langsmith/client/Client/batch_ingest_runs)
- [`multipart_ingest()`](https://reference.langchain.com/python/langsmith/client/Client/multipart_ingest)
- [`update_run()`](https://reference.langchain.com/python/langsmith/client/Client/update_run)
- [`flush_compressed_traces()`](https://reference.langchain.com/python/langsmith/client/Client/flush_compressed_traces)
- [`flush()`](https://reference.langchain.com/python/langsmith/client/Client/flush)
- [`read_run()`](https://reference.langchain.com/python/langsmith/client/Client/read_run)
- [`read_thread()`](https://reference.langchain.com/python/langsmith/client/Client/read_thread)
- [`list_runs()`](https://reference.langchain.com/python/langsmith/client/Client/list_runs)
- [`list_threads()`](https://reference.langchain.com/python/langsmith/client/Client/list_threads)
- [`get_run_stats()`](https://reference.langchain.com/python/langsmith/client/Client/get_run_stats)
- [`get_run_url()`](https://reference.langchain.com/python/langsmith/client/Client/get_run_url)
- [`share_run()`](https://reference.langchain.com/python/langsmith/client/Client/share_run)
- [`unshare_run()`](https://reference.langchain.com/python/langsmith/client/Client/unshare_run)
- [`read_run_shared_link()`](https://reference.langchain.com/python/langsmith/client/Client/read_run_shared_link)
- [`run_is_shared()`](https://reference.langchain.com/python/langsmith/client/Client/run_is_shared)
- [`read_shared_run()`](https://reference.langchain.com/python/langsmith/client/Client/read_shared_run)
- [`list_shared_runs()`](https://reference.langchain.com/python/langsmith/client/Client/list_shared_runs)
- [`read_dataset_shared_schema()`](https://reference.langchain.com/python/langsmith/client/Client/read_dataset_shared_schema)
- [`share_dataset()`](https://reference.langchain.com/python/langsmith/client/Client/share_dataset)
- [`unshare_dataset()`](https://reference.langchain.com/python/langsmith/client/Client/unshare_dataset)
- [`read_shared_dataset()`](https://reference.langchain.com/python/langsmith/client/Client/read_shared_dataset)
- [`list_shared_examples()`](https://reference.langchain.com/python/langsmith/client/Client/list_shared_examples)
- [`list_shared_projects()`](https://reference.langchain.com/python/langsmith/client/Client/list_shared_projects)
- [`create_project()`](https://reference.langchain.com/python/langsmith/client/Client/create_project)
- [`update_project()`](https://reference.langchain.com/python/langsmith/client/Client/update_project)
- [`read_project()`](https://reference.langchain.com/python/langsmith/client/Client/read_project)
- [`has_project()`](https://reference.langchain.com/python/langsmith/client/Client/has_project)
- [`get_test_results()`](https://reference.langchain.com/python/langsmith/client/Client/get_test_results)
- [`list_projects()`](https://reference.langchain.com/python/langsmith/client/Client/list_projects)
- [`delete_project()`](https://reference.langchain.com/python/langsmith/client/Client/delete_project)
- [`create_dataset()`](https://reference.langchain.com/python/langsmith/client/Client/create_dataset)
- [`has_dataset()`](https://reference.langchain.com/python/langsmith/client/Client/has_dataset)
- [`read_dataset()`](https://reference.langchain.com/python/langsmith/client/Client/read_dataset)
- [`diff_dataset_versions()`](https://reference.langchain.com/python/langsmith/client/Client/diff_dataset_versions)
- [`read_dataset_openai_finetuning()`](https://reference.langchain.com/python/langsmith/client/Client/read_dataset_openai_finetuning)
- [`list_datasets()`](https://reference.langchain.com/python/langsmith/client/Client/list_datasets)
- [`delete_dataset()`](https://reference.langchain.com/python/langsmith/client/Client/delete_dataset)
- [`update_dataset_tag()`](https://reference.langchain.com/python/langsmith/client/Client/update_dataset_tag)
- [`list_dataset_versions()`](https://reference.langchain.com/python/langsmith/client/Client/list_dataset_versions)
- [`read_dataset_version()`](https://reference.langchain.com/python/langsmith/client/Client/read_dataset_version)
- [`clone_public_dataset()`](https://reference.langchain.com/python/langsmith/client/Client/clone_public_dataset)
- [`create_llm_example()`](https://reference.langchain.com/python/langsmith/client/Client/create_llm_example)
- [`create_chat_example()`](https://reference.langchain.com/python/langsmith/client/Client/create_chat_example)
- [`create_example_from_run()`](https://reference.langchain.com/python/langsmith/client/Client/create_example_from_run)
- [`update_examples_multipart()`](https://reference.langchain.com/python/langsmith/client/Client/update_examples_multipart)
- [`upload_examples_multipart()`](https://reference.langchain.com/python/langsmith/client/Client/upload_examples_multipart)
- [`upsert_examples_multipart()`](https://reference.langchain.com/python/langsmith/client/Client/upsert_examples_multipart)
- [`create_examples()`](https://reference.langchain.com/python/langsmith/client/Client/create_examples)
- [`create_example()`](https://reference.langchain.com/python/langsmith/client/Client/create_example)
- [`read_example()`](https://reference.langchain.com/python/langsmith/client/Client/read_example)
- [`list_examples()`](https://reference.langchain.com/python/langsmith/client/Client/list_examples)
- [`update_example()`](https://reference.langchain.com/python/langsmith/client/Client/update_example)
- [`update_examples()`](https://reference.langchain.com/python/langsmith/client/Client/update_examples)
- [`delete_example()`](https://reference.langchain.com/python/langsmith/client/Client/delete_example)
- [`delete_examples()`](https://reference.langchain.com/python/langsmith/client/Client/delete_examples)
- [`list_dataset_splits()`](https://reference.langchain.com/python/langsmith/client/Client/list_dataset_splits)
- [`update_dataset_splits()`](https://reference.langchain.com/python/langsmith/client/Client/update_dataset_splits)
- [`evaluate_run()`](https://reference.langchain.com/python/langsmith/client/Client/evaluate_run)
- [`aevaluate_run()`](https://reference.langchain.com/python/langsmith/client/Client/aevaluate_run)
- [`create_feedback()`](https://reference.langchain.com/python/langsmith/client/Client/create_feedback)
- [`update_feedback()`](https://reference.langchain.com/python/langsmith/client/Client/update_feedback)
- [`read_feedback()`](https://reference.langchain.com/python/langsmith/client/Client/read_feedback)
- [`list_feedback()`](https://reference.langchain.com/python/langsmith/client/Client/list_feedback)
- [`delete_feedback()`](https://reference.langchain.com/python/langsmith/client/Client/delete_feedback)
- [`create_feedback_from_token()`](https://reference.langchain.com/python/langsmith/client/Client/create_feedback_from_token)
- [`create_presigned_feedback_token()`](https://reference.langchain.com/python/langsmith/client/Client/create_presigned_feedback_token)
- [`create_presigned_feedback_tokens()`](https://reference.langchain.com/python/langsmith/client/Client/create_presigned_feedback_tokens)
- [`list_presigned_feedback_tokens()`](https://reference.langchain.com/python/langsmith/client/Client/list_presigned_feedback_tokens)
- [`list_feedback_formulas()`](https://reference.langchain.com/python/langsmith/client/Client/list_feedback_formulas)
- [`get_feedback_formula_by_id()`](https://reference.langchain.com/python/langsmith/client/Client/get_feedback_formula_by_id)
- [`create_feedback_formula()`](https://reference.langchain.com/python/langsmith/client/Client/create_feedback_formula)
- [`update_feedback_formula()`](https://reference.langchain.com/python/langsmith/client/Client/update_feedback_formula)
- [`delete_feedback_formula()`](https://reference.langchain.com/python/langsmith/client/Client/delete_feedback_formula)
- [`create_feedback_config()`](https://reference.langchain.com/python/langsmith/client/Client/create_feedback_config)
- [`list_feedback_configs()`](https://reference.langchain.com/python/langsmith/client/Client/list_feedback_configs)
- [`update_feedback_config()`](https://reference.langchain.com/python/langsmith/client/Client/update_feedback_config)
- [`delete_feedback_config()`](https://reference.langchain.com/python/langsmith/client/Client/delete_feedback_config)
- [`list_annotation_queues()`](https://reference.langchain.com/python/langsmith/client/Client/list_annotation_queues)
- [`create_annotation_queue()`](https://reference.langchain.com/python/langsmith/client/Client/create_annotation_queue)
- [`read_annotation_queue()`](https://reference.langchain.com/python/langsmith/client/Client/read_annotation_queue)
- [`update_annotation_queue()`](https://reference.langchain.com/python/langsmith/client/Client/update_annotation_queue)
- [`delete_annotation_queue()`](https://reference.langchain.com/python/langsmith/client/Client/delete_annotation_queue)
- [`add_runs_to_annotation_queue()`](https://reference.langchain.com/python/langsmith/client/Client/add_runs_to_annotation_queue)
- [`delete_run_from_annotation_queue()`](https://reference.langchain.com/python/langsmith/client/Client/delete_run_from_annotation_queue)
- [`get_run_from_annotation_queue()`](https://reference.langchain.com/python/langsmith/client/Client/get_run_from_annotation_queue)
- [`create_comparative_experiment()`](https://reference.langchain.com/python/langsmith/client/Client/create_comparative_experiment)
- [`like_prompt()`](https://reference.langchain.com/python/langsmith/client/Client/like_prompt)
- [`unlike_prompt()`](https://reference.langchain.com/python/langsmith/client/Client/unlike_prompt)
- [`list_prompts()`](https://reference.langchain.com/python/langsmith/client/Client/list_prompts)
- [`get_prompt()`](https://reference.langchain.com/python/langsmith/client/Client/get_prompt)
- [`create_prompt()`](https://reference.langchain.com/python/langsmith/client/Client/create_prompt)
- [`create_commit()`](https://reference.langchain.com/python/langsmith/client/Client/create_commit)
- [`update_prompt()`](https://reference.langchain.com/python/langsmith/client/Client/update_prompt)
- [`delete_prompt()`](https://reference.langchain.com/python/langsmith/client/Client/delete_prompt)
- [`pull_prompt_commit()`](https://reference.langchain.com/python/langsmith/client/Client/pull_prompt_commit)
- [`list_prompt_commits()`](https://reference.langchain.com/python/langsmith/client/Client/list_prompt_commits)
- [`pull_prompt()`](https://reference.langchain.com/python/langsmith/client/Client/pull_prompt)
- [`push_prompt()`](https://reference.langchain.com/python/langsmith/client/Client/push_prompt)
- [`cleanup()`](https://reference.langchain.com/python/langsmith/client/Client/cleanup)
- [`evaluate()`](https://reference.langchain.com/python/langsmith/client/Client/evaluate)
- [`aevaluate()`](https://reference.langchain.com/python/langsmith/client/Client/aevaluate)
- [`get_experiment_results()`](https://reference.langchain.com/python/langsmith/client/Client/get_experiment_results)
- [`generate_insights()`](https://reference.langchain.com/python/langsmith/client/Client/generate_insights)
- [`poll_insights()`](https://reference.langchain.com/python/langsmith/client/Client/poll_insights)
- [`get_insights_report()`](https://reference.langchain.com/python/langsmith/client/Client/get_insights_report)
- [`list_project_issues()`](https://reference.langchain.com/python/langsmith/client/Client/list_project_issues)

---

[View source on GitHub](https://github.com/langchain-ai/langsmith-sdk/blob/6a74bf5af9e542d8065af8edca54b2448f430916/python/langsmith/client.py#L683)