# SambaNovaCloud

> **Class** in `langchain_community`

📖 [View in docs](https://reference.langchain.com/python/langchain-community/llms/sambanova/SambaNovaCloud)

SambaNova Cloud large language models.

## Signature

```python
SambaNovaCloud(
    self,
    **kwargs: Any,
)
```

## Description

**Setup:**

To use, set the following environment variables:

- ``SAMBANOVA_URL``: the SambaNova Cloud URL; defaults to http://cloud.sambanova.ai/
- ``SAMBANOVA_API_KEY``: your SambaNova Cloud API key.

Example:

.. code-block:: python

    from langchain_community.llms.sambanova import SambaNovaCloud

    llm = SambaNovaCloud(
        sambanova_api_key="your-SambaNovaCloud-API-key",
        model="Meta-Llama-3-70B-Instruct-4096",  # model name
        max_tokens=256,     # max number of tokens to generate
        temperature=0.7,    # model temperature
        top_p=0.9,          # model top p
        top_k=50,           # model top k
    )
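As an alternative to passing the key in code, the variables can also be set from Python before instantiating the client; a minimal sketch with placeholder values:

```python
import os

# Set the credentials the client reads at init time (placeholder values).
os.environ["SAMBANOVA_URL"] = "http://cloud.sambanova.ai/"  # the documented default
os.environ["SAMBANOVA_API_KEY"] = "your-api-key"            # your real key here
```

A subsequent `SambaNovaCloud()` call would then pick both values up from the environment, with no explicit `sambanova_api_key` argument needed.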

Key init args — completion params:
    model: str
        The name of the model to use, e.g., Meta-Llama-3-70B-Instruct-4096
        (set for CoE endpoints).
    streaming: bool
        Whether to use the streaming handler when calling non-streaming methods.
    max_tokens: int
        Maximum number of tokens to generate.
    temperature: float
        Model temperature.
    top_p: float
        Model top-p (nucleus sampling) value.
    top_k: int
        Model top-k value.
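The two sampling parameters can be pictured on a toy distribution. The sketch below is purely illustrative and not part of `langchain_community`: `top_k` keeps only the k most probable next tokens, and `top_p` then keeps the smallest prefix of those whose cumulative probability reaches p, renormalizing what remains:

```python
def filter_next_token(probs: dict[str, float], top_k: int, top_p: float) -> dict[str, float]:
    """Illustrative top-k / top-p filtering of a next-token distribution."""
    # Keep only the top_k most probable tokens.
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)[:top_k]
    # Then keep the smallest prefix whose cumulative probability reaches top_p.
    kept, cumulative = [], 0.0
    for token, p in ranked:
        kept.append((token, p))
        cumulative += p
        if cumulative >= top_p:
            break
    # Renormalize the surviving tokens so probabilities sum to 1.
    total = sum(p for _, p in kept)
    return {token: p / total for token, p in kept}

probs = {"the": 0.5, "a": 0.3, "cat": 0.15, "dog": 0.05}
print(filter_next_token(probs, top_k=3, top_p=0.9))
```

Lower `top_k`/`top_p` values make sampling more conservative; higher values (together with higher `temperature`) make it more diverse.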

Key init args — client params:
    sambanova_url: str
        SambaNova Cloud URL; defaults to http://cloud.sambanova.ai/
    sambanova_api_key: str
        SambaNova Cloud API key.
Instantiate:
    .. code-block:: python

        from langchain_community.llms.sambanova import SambaNovaCloud

        llm = SambaNovaCloud(
            sambanova_api_key="your-SambaNovaCloud-API-key",
            model="Meta-Llama-3-70B-Instruct-4096",  # model name
            max_tokens=256,     # max number of tokens to generate
            temperature=0.7,    # model temperature
            top_p=0.9,          # model top p
            top_k=50,           # model top k
        )
Invoke:
    .. code-block:: python

        prompt = "tell me a joke"
        response = llm.invoke(prompt)
Stream:
    .. code-block:: python

        for chunk in llm.stream(prompt):
            print(chunk, end="", flush=True)
Async:
    .. code-block:: python

        response = await llm.ainvoke(prompt)
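`ainvoke` returns a coroutine, so it must be awaited inside an event loop. A minimal sketch of the pattern with `asyncio`, using a stand-in coroutine in place of the real client (which would require a live API key):

```python
import asyncio

async def ainvoke_stub(prompt: str) -> str:
    # Stand-in for llm.ainvoke(prompt); the real call performs an HTTP request.
    await asyncio.sleep(0)
    return f"response to: {prompt}"

async def main() -> str:
    # With the real client this line would be: response = await llm.ainvoke(prompt)
    return await ainvoke_stub("tell me a joke")

print(asyncio.run(main()))
```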

## Extends

- `LLM`

## Constructors

```python
__init__(
    self,
    **kwargs: Any,
) -> None
```


## Properties

- `sambanova_url`
- `sambanova_api_key`
- `model`
- `streaming`
- `max_tokens`
- `temperature`
- `top_p`
- `top_k`
- `stream_options`
- `lc_secrets`

## Methods

- [`is_lc_serializable()`](https://reference.langchain.com/python/langchain-community/llms/sambanova/SambaNovaCloud/is_lc_serializable)

---

[View source on GitHub](https://github.com/langchain-ai/langchain-community/blob/a6a6079511ac8a5c1293337f88096b8641562e77/libs/community/langchain_community/llms/sambanova.py#L545)