```python
AsyncPromptCache(
    self,
    *,
    max_size: int = DEFAULT_PROMPT_CACHE_MAX_SIZE,
    ttl_seconds: Optional[float] = DEFAULT_PROMPT_CACHE_TTL_SECONDS,
    refresh_interval_seconds: float = DEFAULT_PROMPT_CACHE_REFRESH_INTERVAL_SECONDS,
    fetch_func: Callable[[str], Awaitable[PromptCommit]],
)
```

Bases: `_BasePromptCache`

| Name | Type | Description |
|---|---|---|
| `max_size` | `int` | Default: `DEFAULT_PROMPT_CACHE_MAX_SIZE`. Maximum number of entries in the cache (LRU eviction when exceeded). |
| `ttl_seconds` | `Optional[float]` | Default: `DEFAULT_PROMPT_CACHE_TTL_SECONDS`. Time before an entry is considered stale. Set to `None` for an infinite TTL (offline mode; entries never expire). |
| `refresh_interval_seconds` | `float` | Default: `DEFAULT_PROMPT_CACHE_REFRESH_INTERVAL_SECONDS`. How often to check for stale entries. |
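The interaction of `max_size` (LRU eviction) and `ttl_seconds` (staleness, with `None` meaning never expire) can be illustrated with a minimal sketch. The class `TinyLruTtl` and its internals are hypothetical, for illustration only — not the real `AsyncPromptCache` implementation:

```python
import time
from collections import OrderedDict
from typing import Any, Optional


class TinyLruTtl:
    """Illustrative LRU + TTL store (not the real AsyncPromptCache)."""

    def __init__(self, max_size: int = 2, ttl_seconds: Optional[float] = None):
        self.max_size = max_size
        self.ttl_seconds = ttl_seconds
        # key -> (value, time stored); insertion order tracks recency
        self._data: "OrderedDict[str, tuple]" = OrderedDict()

    def set(self, key: str, value: Any) -> None:
        self._data[key] = (value, time.monotonic())
        self._data.move_to_end(key)
        while len(self._data) > self.max_size:
            self._data.popitem(last=False)  # evict least recently used

    def get(self, key: str) -> Optional[Any]:
        item = self._data.get(key)
        if item is None:
            return None
        value, stored_at = item
        # ttl_seconds=None means entries never expire (offline mode)
        if self.ttl_seconds is not None and time.monotonic() - stored_at > self.ttl_seconds:
            return None  # entry is stale
        self._data.move_to_end(key)  # mark as recently used
        return value
```

With `max_size=2`, inserting a third key evicts the least recently used one, while `ttl_seconds=None` keeps surviving entries valid indefinitely.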
stop(): Stops the async background refresh loop. Cancels the refresh task and waits for it to complete.
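A cancel-and-wait stop of this kind might look like the following sketch, assuming a plain `asyncio.Task` driving the loop. `RefreshLoop` and its method bodies are illustrative, not the real implementation:

```python
import asyncio
from typing import Optional


class RefreshLoop:
    """Sketch of a start/stop background refresh loop (hypothetical class)."""

    def __init__(self, interval: float):
        self.interval = interval
        self._task: Optional[asyncio.Task] = None

    async def start(self) -> None:
        # Schedule the refresh loop as a background task
        self._task = asyncio.create_task(self._run())

    async def _run(self) -> None:
        while True:
            await asyncio.sleep(self.interval)
            # ... check for stale entries and refresh them here ...

    async def stop(self) -> None:
        if self._task is None:
            return
        self._task.cancel()            # request cancellation
        try:
            await self._task           # wait for the task to finish
        except asyncio.CancelledError:
            pass                       # cancellation is the expected outcome
        self._task = None
```

Awaiting the cancelled task (and swallowing `CancelledError`) is what guarantees the loop has fully exited before `stop()` returns.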
Thread-safe LRU cache with an asyncio background-refresh task, for use with the asynchronous AsyncClient.

Features:
- LRU eviction once max_size is exceeded
- TTL-based staleness, with ttl_seconds=None for an infinite TTL (offline mode)
- Background asyncio task that periodically refreshes stale entries
- Thread-safe get/set access
Example:

```python
async def fetch_prompt(key: str) -> PromptCommit:
    return await client._afetch_prompt_from_api(key)

cache = AsyncPromptCache(
    max_size=100,
    ttl_seconds=3600,
    fetch_func=fetch_prompt,
)
await cache.start()
cache.set("my-prompt:latest", prompt_commit)
cached = cache.get("my-prompt:latest")
await cache.stop()
```
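One pass of such a TTL-driven refresh, re-fetching stale entries through the configured fetch_func, could be sketched as follows. `refresh_stale` and its entry layout are assumptions for illustration; the real cache's internals may differ:

```python
import asyncio
import time
from typing import Awaitable, Callable, Dict, Optional, Tuple


async def refresh_stale(
    entries: Dict[str, Tuple[str, float]],       # key -> (value, time stored)
    ttl_seconds: Optional[float],
    fetch_func: Callable[[str], Awaitable[str]],
) -> None:
    """One pass of the background loop: re-fetch entries past their TTL."""
    if ttl_seconds is None:
        return                                   # infinite TTL: nothing is ever stale
    now = time.monotonic()
    for key, (_, stored_at) in list(entries.items()):
        if now - stored_at > ttl_seconds:
            entries[key] = (await fetch_func(key), time.monotonic())
```

The background task would call a pass like this every refresh_interval_seconds, so stale prompts are replaced without blocking readers.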