Thread-safe LRU cache with background thread refresh, for use with the synchronous Client.

Features: LRU eviction once `max_size` is exceeded, per-entry TTL expiration, and a background thread that periodically checks for and refreshes stale entries.
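To make the eviction and expiry rules concrete, here is a minimal sketch of the underlying idea: a lock-guarded `OrderedDict` with per-entry timestamps. It is an illustration only, not the `PromptCache` implementation; it omits the background refresh thread and the fetch function entirely, and the class name `TinyLRUCache` is made up for this sketch.

```python
import threading
import time
from collections import OrderedDict
from typing import Any, Optional


class TinyLRUCache:
    """Simplified illustration only; not the real PromptCache."""

    def __init__(self, max_size: int = 100, ttl_seconds: Optional[float] = 300.0):
        self._max_size = max_size
        self._ttl = ttl_seconds
        self._lock = threading.Lock()
        self._entries: OrderedDict = OrderedDict()  # key -> (value, stored_at)

    def set(self, key: str, value: Any) -> None:
        with self._lock:
            self._entries[key] = (value, time.monotonic())
            self._entries.move_to_end(key)            # mark as most recently used
            while len(self._entries) > self._max_size:
                self._entries.popitem(last=False)     # evict least recently used

    def get(self, key: str) -> Optional[Any]:
        with self._lock:
            item = self._entries.get(key)
            if item is None:
                return None
            value, stored_at = item
            # ttl_seconds=None means entries never expire (offline mode).
            if self._ttl is not None and time.monotonic() - stored_at > self._ttl:
                del self._entries[key]
                return None
            self._entries.move_to_end(key)            # reads also refresh recency
            return value
```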
```python
PromptCache(
    self,
    *,
    max_size: int = DEFAULT_PROMPT_CACHE_MAX_SIZE,
    ttl_seconds: Optional[float] = DEFAULT_PROMPT_CACHE_TTL_SECONDS,
    refresh_interval_seconds: float = DEFAULT_PROMPT_CACHE_REFRESH_INTERVAL_SECONDS,
)
```

Bases: `_BasePromptCache`

Example:
```python
def fetch_prompt(key: str) -> PromptCommit:
    return client._fetch_prompt_from_api(key)

cache = PromptCache(
    max_size=100,
    ttl_seconds=3600,
    fetch_func=fetch_prompt,
)
cache.set("my-prompt:latest", prompt_commit)
cached = cache.get("my-prompt:latest")
cache.shutdown()
```
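Because the cache owns a background refresh thread, it is worth ensuring `shutdown()` always runs. A minimal sketch, reusing `fetch_prompt` from the example above and assuming (not confirmed by this reference) that `get()` returns `None` on a cache miss:

```python
cache = PromptCache(
    max_size=100,
    ttl_seconds=3600,
    fetch_func=fetch_prompt,
)
try:
    commit = cache.get("my-prompt:latest")
    if commit is None:  # assumption: a miss is reported as None
        commit = fetch_prompt("my-prompt:latest")
        cache.set("my-prompt:latest", commit)
finally:
    cache.shutdown()  # stop the background refresh thread
```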
| Name | Type | Description |
|---|---|---|
| `max_size` | `int` | Default: `DEFAULT_PROMPT_CACHE_MAX_SIZE`. Maximum number of entries in the cache (LRU eviction when exceeded). |
| `ttl_seconds` | `Optional[float]` | Default: `DEFAULT_PROMPT_CACHE_TTL_SECONDS`, i.e. 300 seconds (5 minutes). Time before an entry is considered stale. Set to `None` for an infinite TTL (offline mode; entries never expire). |
| `refresh_interval_seconds` | `float` | Default: `DEFAULT_PROMPT_CACHE_REFRESH_INTERVAL_SECONDS`. How often the background thread checks for stale entries. |
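To illustrate the `ttl_seconds=None` offline mode described in the table, here is a short sketch; it reuses `fetch_prompt` and `prompt_commit` from the example above and assumes nothing about the constructor beyond the arguments already shown there.

```python
# Offline mode: with ttl_seconds=None entries are never considered stale,
# so cached prompts keep being served indefinitely.
offline_cache = PromptCache(
    max_size=50,
    ttl_seconds=None,
    fetch_func=fetch_prompt,  # reused from the example above
)
offline_cache.set("my-prompt:latest", prompt_commit)
cached = offline_cache.get("my-prompt:latest")  # never expires
offline_cache.shutdown()
```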