OctoAIEndpoint()
Get the namespace of the langchain object.
Build extra kwargs from additional params that were passed in.
Get the sub-prompts for the LLM call.
Create the LLMResult from the choices and prompts.
Get the tokens present in the text with the tiktoken package.
Calculate the maximum number of tokens possible to generate for a model.
Calculate the maximum number of tokens possible to generate for a prompt.
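The last two summaries describe a common pattern: the tokens available for generation are the model's context window minus the tokens consumed by the prompt. A minimal sketch of that arithmetic, assuming a fixed 4096-token context window and a naive whitespace token count standing in for tiktoken (both are illustrative assumptions, not the library's actual values):

.. code-block:: python

    def max_tokens_for_prompt(prompt: str, context_size: int = 4096) -> int:
        """Return how many tokens remain for generation after the prompt.

        context_size and the whitespace tokenizer below are placeholders;
        real code would use the model's context window and tiktoken.
        """
        num_prompt_tokens = len(prompt.split())  # naive token count
        return context_size - num_prompt_tokens

    # A 4-word prompt leaves 4096 - 4 tokens for the completion.
    remaining = max_tokens_for_prompt("Tell me a joke")

With a real tokenizer the prompt token count would differ, but the subtraction is the same.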
OctoAI LLM Endpoints - OpenAI compatible.
OctoAIEndpoint is a class for interacting with OctoAI Compute Service large language model endpoints.
To use, you should have the environment variable ``OCTOAI_API_TOKEN`` set
with your API token, or pass it as a named parameter to the constructor.
Example:
    .. code-block:: python

        from langchain_community.llms.octoai_endpoint import OctoAIEndpoint

        llm = OctoAIEndpoint(
            model="llama-2-13b-chat-fp16",
            max_tokens=200,
            presence_penalty=0,
            temperature=0.1,
            top_p=0.9,
        )
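Because the endpoint is OpenAI-compatible, the parameters above map onto the familiar chat-completions request shape. A hedged sketch of such a request body, using the standard OpenAI field names (the exact payload OctoAI sends may differ; this only illustrates the convention):

.. code-block:: python

    import json

    # Illustrative OpenAI-style chat-completions payload; field names follow
    # the OpenAI convention, model name taken from the example above.
    payload = {
        "model": "llama-2-13b-chat-fp16",
        "messages": [
            {"role": "user", "content": "Who was Leonardo da Vinci?"},
        ],
        "max_tokens": 200,
        "temperature": 0.1,
        "top_p": 0.9,
        "presence_penalty": 0,
    }

    body = json.dumps(payload)  # serialized request body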