Hugging Face Chat Wrapper.
Hugging Face Endpoint. Works with any model that supports the text-generation (i.e. text completion) task.
To use this class, you should have the huggingface_hub python package installed, and
the environment variable HUGGINGFACEHUB_API_TOKEN set with your API token,
or pass the token as a named parameter to the constructor.
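The token-resolution order described above (explicit constructor argument first, environment variable as fallback) can be sketched as follows. This is an illustrative helper, not the library's actual code; the function name is hypothetical.

```python
import os
from typing import Optional


def resolve_hf_token(token: Optional[str] = None) -> Optional[str]:
    # A token passed explicitly wins; otherwise fall back to the
    # HUGGINGFACEHUB_API_TOKEN environment variable (None if unset).
    return token or os.environ.get("HUGGINGFACEHUB_API_TOKEN")
```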
HuggingFace Pipeline API.
To use, you should have the transformers python package installed.
Only supports the text-generation, text2text-generation, image-text-to-text,
summarization, and translation tasks for now.
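The restriction to a fixed set of tasks can be sketched as a validation step like the one below. This is a hedged illustration, not the pipeline wrapper's actual code; the `translation_` prefix handling reflects that transformers exposes translation as variants such as `translation_en_to_fr`.

```python
# Supported tasks, per the docstring above.
SUPPORTED_TASKS = {
    "text-generation",
    "text2text-generation",
    "image-text-to-text",
    "summarization",
    "translation",
}


def check_task(task: str) -> str:
    # Accept exact matches and translation variants like "translation_en_to_fr".
    if task in SUPPORTED_TASKS or task.startswith("translation_"):
        return task
    raise ValueError(
        f"Got invalid task {task!r}, "
        f"currently only {sorted(SUPPORTED_TASKS)} are supported"
    )
```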
Response from the TextGenInference API.
Message to send to the TextGenInference API.
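The response and message objects above can be pictured as simple data containers, roughly like the sketch below. The field names here are illustrative assumptions, not the TextGenInference wire format.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Message:
    # Chat message sent to the TextGenInference API.
    role: str      # e.g. "user" or "assistant"
    content: str


@dataclass
class Response:
    # Response returned by the TextGenInference API.
    generated_text: str
    messages: List[Message] = field(default_factory=list)
```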
Hugging Face LLMs as ChatModels.
Works with HuggingFaceTextGenInference, HuggingFaceEndpoint,
HuggingFaceHub, and HuggingFacePipeline LLMs.
Upon instantiating this class, the model_id is resolved from the url provided to the LLM, and the appropriate tokenizer is loaded from the HuggingFace Hub.
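The model_id resolution described above might look roughly like the sketch below for a Hub-hosted inference URL of the form `https://api-inference.huggingface.co/models/<namespace>/<model>`. This is a hedged illustration only: dedicated Inference Endpoints use opaque URLs, so the actual resolution logic must handle more cases than this heuristic.

```python
from urllib.parse import urlparse


def resolve_model_id(url: str) -> str:
    # Extract "<namespace>/<model>" from a hosted-inference URL; the
    # tokenizer for that model_id would then be loaded from the Hub.
    path = urlparse(url).path
    prefix = "/models/"
    if prefix in path:
        return path.split(prefix, 1)[1]
    raise ValueError(f"Could not resolve model_id from url: {url!r}")
```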