OpenLLM()

   Model name to use.
Get the namespace of the LangChain object.
Build extra kwargs from additional params that were passed in.
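Folding undeclared parameters into a catch-all ``model_kwargs`` dict is a common pattern for this kind of validator. A minimal sketch of the idea; the names ``build_extra``, ``known_fields``, and the field set are illustrative, not the library's actual implementation:

```python
def build_extra(values: dict) -> dict:
    """Fold params not declared on the model into ``model_kwargs``.

    Hypothetical sketch of the pattern; field names are illustrative.
    """
    known_fields = {"model_name", "temperature", "max_tokens"}
    extra = values.setdefault("model_kwargs", {})
    for name in list(values):
        if name not in known_fields and name != "model_kwargs":
            extra[name] = values.pop(name)
    return values

params = build_extra({"model_name": "gpt-3.5-turbo", "top_p": 0.9})
# top_p is not a declared field, so it lands in model_kwargs
```

This keeps the model's declared fields strict while still letting callers pass provider-specific options through.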
Validate that the API key and Python package exist in the environment.
Get the sub-prompts for the LLM call.
Create the LLMResult from the choices and prompts.
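When ``n`` completions are requested per prompt, the API returns the choices for all prompts in one flat list, and they must be regrouped per prompt. A simplified stand-in for that step (the real method builds an ``LLMResult``; here the result is reduced to nested lists):

```python
def group_choices(choices: list, num_prompts: int, n: int) -> list:
    # The API returns num_prompts * n choices in order;
    # slice out each prompt's n consecutive choices.
    return [choices[i * n:(i + 1) * n] for i in range(num_prompts)]

generations = group_choices(["a1", "a2", "b1", "b2"], num_prompts=2, n=2)
```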
Get the token IDs using the tiktoken package.
Calculate the maximum number of tokens possible to generate for a model.
Calculate the maximum number of tokens possible to generate for a prompt.
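The prompt-level calculation is the model's context window minus the prompt's token count. A minimal sketch, assuming a fixed context size and substituting a naive whitespace tokenizer for tiktoken's BPE encoding so the example is self-contained:

```python
def max_tokens_for_prompt(prompt: str, context_size: int = 4096) -> int:
    # Naive whitespace split as a stand-in for tiktoken's BPE encoding;
    # real token counts will differ.
    num_prompt_tokens = len(prompt.split())
    return context_size - num_prompt_tokens

budget = max_tokens_for_prompt("Tell me a joke", context_size=4096)
```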
OpenAI-compatible API client for an OpenLLM server.
.. versionchanged:: 0.2.11

   Supports OpenLLM 0.6; now behaves similarly to the OpenAI wrapper.