alookup(self, prompt: str, llm_string: str) -> RETURN_VAL_TYPE | None

Async look up based on prompt and llm_string.

Parameters:
    prompt (str) – A string representation of the prompt. In the case of a chat model, the prompt is a non-trivial serialization of the prompt into the language model.
    llm_string (str) – A string representation of the LLM configuration.
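A minimal sketch of how a cache might implement this interface, using an in-memory dict keyed on the (prompt, llm_string) pair. `InMemoryCache`, `aupdate`, and the `RETURN_VAL_TYPE` alias here are illustrative stand-ins, not the library's actual implementation:

```python
import asyncio
from typing import Any, Optional, Sequence

# Hypothetical stand-in for the library's RETURN_VAL_TYPE
# (in practice, a sequence of generation objects).
RETURN_VAL_TYPE = Sequence[Any]


class InMemoryCache:
    """Illustrative async cache keyed on the (prompt, llm_string) pair."""

    def __init__(self) -> None:
        self._store: dict[tuple[str, str], RETURN_VAL_TYPE] = {}

    async def aupdate(
        self, prompt: str, llm_string: str, return_val: RETURN_VAL_TYPE
    ) -> None:
        # Store the value under a key derived from both arguments,
        # so the same prompt cached for different LLM configs stays distinct.
        self._store[(prompt, llm_string)] = return_val

    async def alookup(
        self, prompt: str, llm_string: str
    ) -> Optional[RETURN_VAL_TYPE]:
        # Return the cached value on a hit, None on a miss.
        return self._store.get((prompt, llm_string))


async def demo() -> None:
    cache = InMemoryCache()
    # Cache miss: nothing stored yet for this (prompt, llm_string) pair.
    assert await cache.alookup("Hi", "model-a") is None
    await cache.aupdate("Hi", "model-a", ["Hello!"])
    # Cache hit: same pair returns the stored value.
    assert await cache.alookup("Hi", "model-a") == ["Hello!"]


asyncio.run(demo())
```

Keying on both prompt and llm_string matters: the same prompt sent to two differently configured models should not share a cache entry.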