Static initialize

Initializes the llama_cpp model for usage in the chat models wrapper.
Parameters: inputs, the inputs passed onto the model.

Returns: A Promise that resolves to an instance of the ChatLlamaCpp class.
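For illustration, a minimal sketch of calling initialize. The model path is a placeholder; point it at your own locally built GGUF model file.

```typescript
import { ChatLlamaCpp } from "@langchain/community/chat_models/llama_cpp";

// Placeholder path: replace with the location of your local model file.
const llamaPath = "/path/to/your/model.gguf";

// initialize loads the underlying llama.cpp model and returns a
// ready-to-use chat model instance.
const model = await ChatLlamaCpp.initialize({
  modelPath: llamaPath,
  temperature: 0.5,
});
```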
To use this model you need to have the `node-llama-cpp` module installed. This can be installed using `npm install -S node-llama-cpp`; the minimum supported version is 2.0.0. This also requires that you have a locally built version of Llama3 installed.

Example
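A short end-to-end sketch, assuming the `@langchain/community` package is installed and the model path below is replaced with your own local model file:

```typescript
import { ChatLlamaCpp } from "@langchain/community/chat_models/llama_cpp";
import { HumanMessage } from "@langchain/core/messages";

// Placeholder path: substitute your locally built model file.
const model = await ChatLlamaCpp.initialize({
  modelPath: "/path/to/your/llama-model.gguf",
  temperature: 0.7,
});

// Invoke the chat model with a single human message and print the reply.
const response = await model.invoke([
  new HumanMessage("Tell me a short joke about llamas."),
]);
console.log(response.content);
```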