Look up LLM generations in the cache by prompt and the associated model and settings.
lookup( self, prompt: str, llm_string: str ) -> Optional[RETURN_VAL_TYPE]
prompt (str)
The prompt run through the language model.
llm_string (str)
The language model version and settings.
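A minimal sketch of the caching contract this method describes: generations are stored under a composite key of the prompt and an `llm_string` that encodes the model and its settings, so the same prompt under different settings is a cache miss. The `SimpleLLMCache` class and the plain-string generations below are hypothetical simplifications for illustration, not the library's actual types.

```python
from typing import Dict, List, Optional, Tuple

class SimpleLLMCache:
    """Hypothetical in-memory cache keyed by (prompt, llm_string)."""

    def __init__(self) -> None:
        self._store: Dict[Tuple[str, str], List[str]] = {}

    def update(self, prompt: str, llm_string: str, return_val: List[str]) -> None:
        # Store generations under the (prompt, llm_string) key.
        self._store[(prompt, llm_string)] = return_val

    def lookup(self, prompt: str, llm_string: str) -> Optional[List[str]]:
        # Return cached generations, or None on a cache miss.
        return self._store.get((prompt, llm_string))

cache = SimpleLLMCache()
cache.update("What is 2+2?", "model-x temperature=0", ["4"])

hit = cache.lookup("What is 2+2?", "model-x temperature=0")    # cached generations
miss = cache.lookup("What is 2+2?", "model-x temperature=1")   # None: settings differ
```

Because the settings are part of the key, changing even one sampling parameter bypasses the cached result, which is the intended behavior when different settings can produce different generations.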