ChatLlamaCpp()

llama.cpp model.

To use, you should have the llama-cpp-python library installed, and provide the path to the Llama model as a named parameter to the constructor. Check out: https://github.com/abetlen/llama-cpp-python

logprobs: The number of logprobs to return. If None, no logprobs are returned.
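A minimal usage sketch, assuming llama-cpp-python and langchain-community are installed and that a local GGUF model file exists at the path shown (the path, prompt, and parameter values other than model_path are placeholders, not requirements):

    from langchain_community.chat_models import ChatLlamaCpp

    # Path to the local Llama model file is passed as a named constructor parameter.
    llm = ChatLlamaCpp(
        model_path="/path/to/model.gguf",  # replace with your own model file
        temperature=0.5,                   # illustrative sampling setting
        n_ctx=4096,                        # illustrative context window size
        logprobs=None,                     # None means no logprobs are returned
    )

    # Invoke the chat model with a single prompt and print the reply text.
    response = llm.invoke("Name three uses of llama.cpp.")
    print(response.content)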