# configure_global_async_prompt_cache

> **Function** in `langsmith`

📖 [View in docs](https://reference.langchain.com/python/langsmith/prompt_cache/configure_global_async_prompt_cache)

Configure the global async prompt cache.

This should be called before any cache instances are created or used.

## Signature

```python
configure_global_async_prompt_cache(
    *,
    max_size: int = DEFAULT_PROMPT_CACHE_MAX_SIZE,
    ttl_seconds: Optional[float] = DEFAULT_PROMPT_CACHE_TTL_SECONDS,
    refresh_interval_seconds: float = DEFAULT_PROMPT_CACHE_REFRESH_INTERVAL_SECONDS,
) -> None
```

## Description

**Example:**

```python
>>> from langsmith import configure_global_async_prompt_cache
>>> configure_global_async_prompt_cache(max_size=200, ttl_seconds=7200)
```

## Parameters

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `max_size` | `int` | No | Maximum entries in cache (LRU eviction when exceeded). (default: `DEFAULT_PROMPT_CACHE_MAX_SIZE`) |
| `ttl_seconds` | `Optional[float]` | No | Time before entry is considered stale. (default: `DEFAULT_PROMPT_CACHE_TTL_SECONDS`) |
| `refresh_interval_seconds` | `float` | No | How often to check for stale entries. (default: `DEFAULT_PROMPT_CACHE_REFRESH_INTERVAL_SECONDS`) |
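To illustrate how `max_size` (LRU eviction) and `ttl_seconds` (staleness) interact, here is a minimal sketch of a cache with those semantics. This is not langsmith's implementation — `TTLLRUCache` is a hypothetical name used only to model the behavior the parameters above describe:

```python
import time
from collections import OrderedDict
from typing import Any, Optional


class TTLLRUCache:
    """Minimal cache with LRU eviction and TTL-based staleness (illustrative only)."""

    def __init__(self, max_size: int = 100, ttl_seconds: Optional[float] = 3600.0):
        self.max_size = max_size
        self.ttl_seconds = ttl_seconds
        # Maps key -> (value, insertion time); ordering tracks recency of use.
        self._data: "OrderedDict[str, tuple]" = OrderedDict()

    def put(self, key: str, value: Any) -> None:
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = (value, time.monotonic())
        if len(self._data) > self.max_size:
            # Evict the least recently used entry when max_size is exceeded.
            self._data.popitem(last=False)

    def get(self, key: str) -> Optional[Any]:
        entry = self._data.get(key)
        if entry is None:
            return None
        value, stored_at = entry
        if self.ttl_seconds is not None and time.monotonic() - stored_at > self.ttl_seconds:
            # Entry is older than ttl_seconds: treat as stale and drop it.
            del self._data[key]
            return None
        self._data.move_to_end(key)  # mark as recently used
        return value


cache = TTLLRUCache(max_size=2, ttl_seconds=7200)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")   # touch "a" so "b" becomes least recently used
cache.put("c", 3)  # exceeds max_size=2: evicts "b"
print(cache.get("b"))  # → None
print(cache.get("a"))  # → 1
```

In the real library, a background task presumably checks for stale entries every `refresh_interval_seconds` rather than evicting lazily on `get` as this sketch does.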

---

[View source on GitHub](https://github.com/langchain-ai/langsmith-sdk/blob/6a74bf5af9e542d8065af8edca54b2448f430916/python/langsmith/prompt_cache.py#L646)