Update cache based on prompt and llm_string.
update(self, prompt: str, llm_string: str, return_val: RETURN_VAL_TYPE, wait_until_ready: Optional[float] = None) -> None
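For context, here is a minimal in-memory sketch of a cache class exposing this signature. It assumes the LangChain-style convention where RETURN_VAL_TYPE is a sequence of Generation objects and treats wait_until_ready as a timeout hint (in seconds) for backends that must finish provisioning before accepting writes; the dict-backed store and class name are illustrative, not the documented implementation.

```python
# Illustrative sketch only. Assumptions: RETURN_VAL_TYPE = Sequence[Generation]
# (as in LangChain's caches module) and wait_until_ready is a timeout in
# seconds for backends that need to come online before accepting writes.
from typing import Dict, Optional, Sequence, Tuple

from langchain_core.outputs import Generation

RETURN_VAL_TYPE = Sequence[Generation]


class InMemoryCacheSketch:
    """Hypothetical cache keyed on (prompt, llm_string)."""

    def __init__(self) -> None:
        self._store: Dict[Tuple[str, str], RETURN_VAL_TYPE] = {}

    def update(
        self,
        prompt: str,
        llm_string: str,
        return_val: RETURN_VAL_TYPE,
        wait_until_ready: Optional[float] = None,
    ) -> None:
        """Update cache based on prompt and llm_string."""
        # A networked backend might block for up to `wait_until_ready`
        # seconds until the store is reachable; an in-memory dict needs
        # no waiting, so the parameter is accepted and ignored here.
        self._store[(prompt, llm_string)] = list(return_val)


# Usage: store an LLM generation under the (prompt, llm_string) key.
cache = InMemoryCacheSketch()
cache.update("2 + 2 = ?", "fake-llm-params", [Generation(text="4")])
```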