Baseten model
This module allows using LLMs hosted on Baseten.
The LLM deployed on Baseten must satisfy the following contract (a minimal sketch of the expected shapes follows the list):
- Must accept its input as a dictionary with the key "prompt"
- May accept other keys in that dictionary, which are passed through as kwargs
- Must return the model output as a string
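For concreteness, here is a hedged sketch of the input and output shapes this contract implies. The prompt text and the extra key below are illustrative values, not part of this module:

```python
# Illustrative only: the shape of a request the deployed model must handle
# and the shape of its response, per the contract above.
model_input = {
    "prompt": "What is the capital of France?",  # required key
    "max_new_tokens": 128,                       # optional extras are passed through as kwargs
}

# The deployed model returns its output as a plain string, for example:
model_output = "The capital of France is Paris."
```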
To use this module, you must do the following; a sanity-check call using these values is sketched after the list:
- Export your Baseten API key as the environment variable BASETEN_API_KEY
- Get the model ID for your model from your Baseten dashboard
- Identify the model deployment ("production" for all model library models)
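As a quick sanity check, the three values above can be exercised with a direct REST call to Baseten, outside of this module. The endpoint pattern, header format, and placeholder model ID below are assumptions based on Baseten's public inference API and may need adjusting for your deployment:

```python
import os
import requests

model_id = "YOUR_MODEL_ID"   # placeholder: copy the real ID from your Baseten dashboard
deployment = "production"    # "production" for model library models

# Assumed Baseten inference endpoint pattern; verify it against your model's page.
url = f"https://model-{model_id}.api.baseten.co/{deployment}/predict"

response = requests.post(
    url,
    headers={"Authorization": f"Api-Key {os.environ['BASETEN_API_KEY']}"},
    json={"prompt": "What is the capital of France?"},
    timeout=60,
)
response.raise_for_status()
print(response.json())  # should be the deployed model's string output
```

If this call succeeds, the API key, model ID, and deployment name are all set up correctly for use with the module.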
These code samples use Mistral 7B Instruct from Baseten's model library.