# AsyncPromptCache

> **Class** in `langsmith`

📖 [View in docs](https://reference.langchain.com/python/langsmith/prompt_cache/AsyncPromptCache)

Thread-safe LRU cache with a background asyncio task that refreshes stale entries.

For use with the asynchronous `AsyncClient`.

Features:
- In-memory LRU cache with configurable max size
- Asyncio task for refreshing stale entries
- Stale-while-revalidate: returns stale data while refresh happens
- Thread-safe for concurrent access
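The LRU, TTL, and stale-while-revalidate behaviors listed above can be illustrated with a minimal sketch. This is not the SDK's implementation (the real class also handles locking and the asyncio refresh task); `TTLLRUCache` and `is_stale` are hypothetical names used only for illustration:

```python
import time
from collections import OrderedDict
from typing import Any, Optional


class TTLLRUCache:
    """Minimal LRU cache with TTL-based staleness.

    Illustrates stale-while-revalidate: get() still returns an entry
    after its TTL has passed; staleness is only a flag that tells a
    background refresher to re-fetch the value.
    """

    def __init__(self, max_size: int = 100, ttl_seconds: Optional[float] = 3600.0):
        self._max_size = max_size
        self._ttl = ttl_seconds
        # key -> (value, inserted_at); insertion order doubles as LRU order
        self._entries: "OrderedDict[str, tuple[Any, float]]" = OrderedDict()

    def set(self, key: str, value: Any) -> None:
        if key in self._entries:
            self._entries.pop(key)
        elif len(self._entries) >= self._max_size:
            self._entries.popitem(last=False)  # evict least recently used
        self._entries[key] = (value, time.monotonic())

    def get(self, key: str) -> Optional[Any]:
        entry = self._entries.get(key)
        if entry is None:
            return None
        self._entries.move_to_end(key)  # mark as most recently used
        return entry[0]  # stale entries are still returned

    def is_stale(self, key: str) -> bool:
        entry = self._entries.get(key)
        if entry is None or self._ttl is None:  # ttl=None: never stale
            return False
        return (time.monotonic() - entry[1]) > self._ttl
```

A `get()` on a stale key still returns the cached value immediately; only `is_stale()` changes, which is what lets the refresh task re-fetch in the background without blocking readers.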

## Signature

```python
AsyncPromptCache(
    self,
    *,
    max_size: int = DEFAULT_PROMPT_CACHE_MAX_SIZE,
    ttl_seconds: Optional[float] = DEFAULT_PROMPT_CACHE_TTL_SECONDS,
    refresh_interval_seconds: float = DEFAULT_PROMPT_CACHE_REFRESH_INTERVAL_SECONDS,
)
```

## Description

**Example:**

```python
async def fetch_prompt(key: str) -> PromptCommit:
    return await client._afetch_prompt_from_api(key)

cache = AsyncPromptCache(
    max_size=100,
    ttl_seconds=3600,
    fetch_func=fetch_prompt,
)
await cache.start()
cache.set("my-prompt:latest", prompt_commit)
cached = cache.get("my-prompt:latest")
await cache.stop()
```

## Parameters

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `max_size` | `int` | No | Maximum entries in cache (LRU eviction when exceeded). (default: `DEFAULT_PROMPT_CACHE_MAX_SIZE`) |
| `ttl_seconds` | `Optional[float]` | No | Time before an entry is considered stale. Set to `None` for an infinite TTL (offline mode; entries never expire). (default: `DEFAULT_PROMPT_CACHE_TTL_SECONDS`) |
| `refresh_interval_seconds` | `float` | No | How often to check for stale entries. (default: `DEFAULT_PROMPT_CACHE_REFRESH_INTERVAL_SECONDS`) |
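How `ttl_seconds` and `refresh_interval_seconds` interact can be sketched with a self-contained demo. All names here are hypothetical (a plain dict stands in for the cache, and the tiny intervals are only for demonstration); it shows the loop waking every `refresh_interval_seconds` and re-fetching only entries older than `ttl_seconds`:

```python
import asyncio
import time


async def demo() -> str:
    # key -> (value, inserted_at)
    store: dict[str, tuple[str, float]] = {}
    ttl_seconds = 0.05               # entry is stale after 50 ms
    refresh_interval_seconds = 0.02  # check for stale entries every 20 ms

    async def fetch(key: str) -> str:
        return f"fresh-{key}"

    async def refresher() -> None:
        # Bounded loop for the demo; a real refresher runs until stopped.
        for _ in range(5):
            await asyncio.sleep(refresh_interval_seconds)
            for key, (_, inserted_at) in list(store.items()):
                if time.monotonic() - inserted_at > ttl_seconds:
                    store[key] = (await fetch(key), time.monotonic())

    # Seed an entry that is already past its TTL.
    store["p"] = ("stale-p", time.monotonic() - 1.0)
    await asyncio.create_task(refresher())
    return store["p"][0]


result = asyncio.run(demo())
```

The stale seed value is replaced on the refresher's first pass, while later passes leave the now-fresh entry alone; readers would have seen `"stale-p"` until the refresh completed rather than blocking.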

## Extends

- `_BasePromptCache`

## Constructors

```python
__init__(
    self,
    *,
    max_size: int = DEFAULT_PROMPT_CACHE_MAX_SIZE,
    ttl_seconds: Optional[float] = DEFAULT_PROMPT_CACHE_TTL_SECONDS,
    refresh_interval_seconds: float = DEFAULT_PROMPT_CACHE_REFRESH_INTERVAL_SECONDS,
) -> None
```

| Name | Type |
|------|------|
| `max_size` | `int` |
| `ttl_seconds` | `Optional[float]` |
| `refresh_interval_seconds` | `float` |


## Methods

- [`aset()`](https://reference.langchain.com/python/langsmith/prompt_cache/AsyncPromptCache/aset)
- [`start()`](https://reference.langchain.com/python/langsmith/prompt_cache/AsyncPromptCache/start)
- [`shutdown()`](https://reference.langchain.com/python/langsmith/prompt_cache/AsyncPromptCache/shutdown)
- [`stop()`](https://reference.langchain.com/python/langsmith/prompt_cache/AsyncPromptCache/stop)
- [`configure()`](https://reference.langchain.com/python/langsmith/prompt_cache/AsyncPromptCache/configure)

---

[View source on GitHub](https://github.com/langchain-ai/langsmith-sdk/blob/cf0366388873e33ef593235c1d0c7e561db79cfb/python/langsmith/prompt_cache.py#L454)