Optional properties:
  maxTokens
  temperature
  topK
  topP
  trimWhitespaceSuffix

Protected properties:
  inputs
  _model
  _context
  _session
Static
initialize
  Initializes the llama_cpp model for usage in the chat models wrapper.
  Parameters: the inputs passed onto the model.
  Returns: a Promise that resolves to an instance of the ChatLlamaCpp class.
Static
lc_name
To use this model you need to have the node-llama-cpp module installed. It can be installed with npm install -S node-llama-cpp; the minimum supported version is 2.0.0. This also requires that you have a locally built version of Llama3 installed.

Example
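A minimal sketch of loading a locally built model through ChatLlamaCpp via the static initialize method described above. The model path is a placeholder you must point at your own GGUF file, and the import paths assume a recent @langchain/community release:

```typescript
import { ChatLlamaCpp } from "@langchain/community/chat_models/llama_cpp";
import { HumanMessage } from "@langchain/core/messages";

// Placeholder path: replace with the location of your locally built Llama 3 model.
const llamaPath = "/path/to/your/llama-model.gguf";

// initialize() loads the model and resolves to a ready-to-use ChatLlamaCpp instance.
const model = await ChatLlamaCpp.initialize({
  modelPath: llamaPath,
  temperature: 0.7,
  maxTokens: 256,
});

// Invoke the chat model with a list of messages.
const response = await model.invoke([
  new HumanMessage("Tell me a short joke about llamas."),
]);
console.log(response.content);
```

Sampling parameters such as temperature, topK, and topP can be passed in the same inputs object; they are optional and fall back to the defaults of node-llama-cpp when omitted.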