# PromptCache

> **Class** in `langsmith`

📖 [View in docs](https://reference.langchain.com/python/langsmith/prompt_cache/PromptCache)

Thread-safe LRU cache with background thread refresh.

Intended for use with the synchronous `Client`.

Features:
- In-memory LRU cache with configurable max size
- Background thread for refreshing stale entries
- Stale-while-revalidate: returns stale data while refresh happens
- Thread-safe for concurrent access

## Signature

```python
PromptCache(
    self,
    *,
    max_size: int = DEFAULT_PROMPT_CACHE_MAX_SIZE,
    ttl_seconds: Optional[float] = DEFAULT_PROMPT_CACHE_TTL_SECONDS,
    refresh_interval_seconds: float = DEFAULT_PROMPT_CACHE_REFRESH_INTERVAL_SECONDS,
)
```

## Description

**Example:**

```python
def fetch_prompt(key: str) -> PromptCommit:
    return client._fetch_prompt_from_api(key)

cache = PromptCache(
    max_size=100,
    ttl_seconds=3600,
    fetch_func=fetch_prompt,
)
cache.set("my-prompt:latest", prompt_commit)
cached = cache.get("my-prompt:latest")
cache.shutdown()
```

## Parameters

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `max_size` | `int` | No | Maximum number of entries in the cache; least-recently-used entries are evicted when exceeded. (default: `DEFAULT_PROMPT_CACHE_MAX_SIZE`) |
| `ttl_seconds` | `Optional[float]` | No | Time in seconds before an entry is considered stale; 300 (5 minutes) by default. Set to `None` for an infinite TTL (offline mode: entries never expire). (default: `DEFAULT_PROMPT_CACHE_TTL_SECONDS`) |
| `refresh_interval_seconds` | `float` | No | How often to check for stale entries. (default: `DEFAULT_PROMPT_CACHE_REFRESH_INTERVAL_SECONDS`) |
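The `ttl_seconds` semantics in the table reduce to a simple staleness predicate. A hedged sketch (the helper name is my own, not part of the SDK) of how `None` yields the never-expire offline mode:

```python
import time
from typing import Optional

def is_stale(inserted_at: float,
             ttl_seconds: Optional[float],
             now: Optional[float] = None) -> bool:
    """An entry is stale once its age exceeds ttl_seconds; None means never."""
    if ttl_seconds is None:
        return False  # offline mode: entries never expire
    if now is None:
        now = time.monotonic()
    return now - inserted_at > ttl_seconds
```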

## Extends

- `_BasePromptCache`

## Constructors

```python
__init__(
    self,
    *,
    max_size: int = DEFAULT_PROMPT_CACHE_MAX_SIZE,
    ttl_seconds: Optional[float] = DEFAULT_PROMPT_CACHE_TTL_SECONDS,
    refresh_interval_seconds: float = DEFAULT_PROMPT_CACHE_REFRESH_INTERVAL_SECONDS,
) -> None
```

| Name | Type |
|------|------|
| `max_size` | `int` |
| `ttl_seconds` | `Optional[float]` |
| `refresh_interval_seconds` | `float` |


## Methods

- [`set()`](https://reference.langchain.com/python/langsmith/prompt_cache/PromptCache/set)
- [`stop()`](https://reference.langchain.com/python/langsmith/prompt_cache/PromptCache/stop)
- [`shutdown()`](https://reference.langchain.com/python/langsmith/prompt_cache/PromptCache/shutdown)
- [`configure()`](https://reference.langchain.com/python/langsmith/prompt_cache/PromptCache/configure)

---

[View source on GitHub](https://github.com/langchain-ai/langsmith-sdk/blob/19dc497a3d89638e4cc35db72ea1c29cad35cbbf/python/langsmith/prompt_cache.py#L305)