Thread-safe LRU cache with asyncio task refresh.
For use with the asynchronous AsyncClient.
Features:
- LRU eviction once more than max_size entries are stored
- TTL-based staleness, with optional infinite TTL (offline mode)
- Background asyncio task that periodically refreshes stale entries
```python
AsyncPromptCache(
    self,
    *,
    max_size: int = DEFAULT_PROMPT_CACHE_MAX_SIZE,
    ttl_seconds: Optional[float] = DEFAULT_PROMPT_CACHE_TTL_SECONDS,
    refresh_interval_seconds: float = DEFAULT_PROMPT_CACHE_REFRESH_INTERVAL_SECONDS
)
```

Bases: `_BasePromptCache`

Example:
```python
async def fetch_prompt(key: str) -> PromptCommit:
    return await client._afetch_prompt_from_api(key)

cache = AsyncPromptCache(
    max_size=100,
    ttl_seconds=3600,
    fetch_func=fetch_prompt,
)
await cache.start()

cache.set("my-prompt:latest", prompt_commit)
cached = cache.get("my-prompt:latest")

await cache.stop()
```
| Name | Type | Description |
|---|---|---|
| `max_size` | `int` | Default: `DEFAULT_PROMPT_CACHE_MAX_SIZE`. Maximum number of entries in the cache (LRU eviction when exceeded). |
| `ttl_seconds` | `Optional[float]` | Default: `DEFAULT_PROMPT_CACHE_TTL_SECONDS`. Time before an entry is considered stale. Set to `None` for infinite TTL (offline mode: entries never expire). |
| `refresh_interval_seconds` | `float` | Default: `DEFAULT_PROMPT_CACHE_REFRESH_INTERVAL_SECONDS`. How often to check for stale entries. |
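As a usage sketch (not taken from the reference itself), an offline-mode configuration would pass `ttl_seconds=None` so entries never go stale; `fetch_prompt` here is the coroutine from the example above:

```python
# Sketch: offline-mode cache; run inside an async function.
offline_cache = AsyncPromptCache(
    max_size=500,
    ttl_seconds=None,        # infinite TTL: entries never expire
    fetch_func=fetch_prompt,
)
await offline_cache.start()  # per start(), this creates no refresh task when ttl_seconds is None
```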
Set a value in the cache.
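For illustration, a minimal sketch of how `set` interacts with LRU eviction, using a small `max_size` and placeholder prompt values; the assumption that `get` returns `None` on a miss is mine, not from the reference:

```python
# Hypothetical LRU-eviction sketch; prompt_a/b/c are placeholder PromptCommit values.
cache = AsyncPromptCache(max_size=2, ttl_seconds=None, fetch_func=fetch_prompt)
cache.set("prompt-a:latest", prompt_a)
cache.set("prompt-b:latest", prompt_b)
cache.get("prompt-a:latest")            # touch "a"; "b" is now least recently used
cache.set("prompt-c:latest", prompt_c)  # exceeds max_size=2, so "b" is evicted
assert cache.get("prompt-b:latest") is None  # assumption: get() returns None on a miss
```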
Start async background refresh loop.
Must be called from an async context. Creates an asyncio task that periodically checks for stale entries and refreshes them. Does nothing if ttl_seconds is None (infinite TTL mode).
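The refresh loop itself is not shown in this reference; a minimal sketch of the pattern described here (a periodic staleness check on an asyncio task) might look like the following, where `_refresh_task`, `_refresh_stale_entries`, and the attribute names are assumptions, not the library's actual internals:

```python
import asyncio

# Illustrative sketch of the background-refresh pattern described above.
async def start_sketch(cache) -> None:
    if cache.ttl_seconds is None:
        return  # infinite TTL mode: no refresh task is created

    async def _refresh_loop() -> None:
        while True:
            await asyncio.sleep(cache.refresh_interval_seconds)
            await cache._refresh_stale_entries()  # re-fetch entries older than ttl_seconds

    cache._refresh_task = asyncio.create_task(_refresh_loop())
```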
Stop background refresh task.
Synchronous wrapper that cancels the refresh task. For proper cleanup in an async context, use stop() instead.
Stop async background refresh loop.
Cancels the refresh task and waits for it to complete.
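A minimal sketch of the cancel-and-wait pattern described here; `_refresh_task` is the assumed attribute carried over from the start() sketch above, not documented API:

```python
import asyncio

async def stop_sketch(cache) -> None:
    task = getattr(cache, "_refresh_task", None)
    if task is None:
        return
    task.cancel()
    try:
        await task  # wait until the task has actually finished
    except asyncio.CancelledError:
        pass        # expected: the cancelled task ends by raising CancelledError
```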
Reconfigure the cache parameters.
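The reference does not show reconfigure's signature; purely as an assumption, if it accepts the same keyword parameters as the constructor, usage might look like:

```python
# Hypothetical call: keyword names mirror the constructor, which is an
# assumption rather than documented behavior of reconfigure().
cache.reconfigure(
    max_size=200,
    ttl_seconds=600,
    refresh_interval_seconds=60,
)
```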