# LlamaCpp

> **Class** in `langchain_community`

📖 [View in docs](https://reference.langchain.com/python/langchain-community/llms/llamacpp/LlamaCpp)

llama.cpp model.

To use this class, install the `llama-cpp-python` library and pass the
path to the Llama model file as the `model_path` parameter to the constructor.
See: https://github.com/abetlen/llama-cpp-python

## Signature

```python
LlamaCpp()
```

## Description

**Example:**

```python
from langchain_community.llms import LlamaCpp

llm = LlamaCpp(model_path="/path/to/llama/model")
```

## Extends

- `LLM`

## Properties

- `client`
- `model_path`
- `lora_base`
- `lora_path`
- `n_ctx`
- `n_parts`
- `seed`
- `f16_kv`
- `logits_all`
- `vocab_only`
- `use_mlock`
- `n_threads`
- `n_batch`
- `n_gpu_layers`
- `suffix`
- `max_tokens`
- `temperature`
- `top_p`
- `logprobs`
- `echo`
- `stop`
- `repeat_penalty`
- `top_k`
- `last_n_tokens_size`
- `use_mmap`
- `rope_freq_scale`
- `rope_freq_base`
- `model_kwargs`
- `streaming`
- `grammar_path`
- `grammar`
- `verbose`
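
Many of the properties above are sampling and model-loading parameters that can be set directly on the constructor. The sketch below groups a few of them into one call; the values shown are illustrative choices, not library defaults, and the model path is a placeholder:

```python
from langchain_community.llms import LlamaCpp

# A configuration sketch; values are examples, not defaults.
llm = LlamaCpp(
    model_path="/path/to/llama/model",  # required: path to a local model file
    n_ctx=2048,         # context window size, in tokens
    n_gpu_layers=-1,    # layers to offload to the GPU (-1 commonly means "all")
    temperature=0.7,    # sampling temperature
    max_tokens=256,     # maximum tokens to generate
    top_p=0.95,         # nucleus sampling cutoff
    streaming=True,     # stream tokens as they are generated
    verbose=False,
)
```

Once constructed, the model is used like any other LangChain LLM, e.g. `llm.invoke("Tell me a joke")`.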

## Methods

- [`validate_environment()`](https://reference.langchain.com/python/langchain-community/llms/llamacpp/LlamaCpp/validate_environment)
- [`build_model_kwargs()`](https://reference.langchain.com/python/langchain-community/llms/llamacpp/LlamaCpp/build_model_kwargs)
- [`get_num_tokens()`](https://reference.langchain.com/python/langchain-community/llms/llamacpp/LlamaCpp/get_num_tokens)

---

[View source on GitHub](https://github.com/langchain-ai/langchain-community/blob/a6a6079511ac8a5c1293337f88096b8641562e77/libs/community/langchain_community/llms/llamacpp.py#L17)