# ChatSnowflake

> **Class** in `langchain_snowflake`

📖 [View in docs](https://reference.langchain.com/python/langchain-snowflake/chat_models/base/ChatSnowflake)

Snowflake chat model integration using Cortex LLM functions.

This class provides access to Snowflake's Cortex Complete function with
models such as llama3.1-70b, mistral-large2, and claude-3-5-sonnet.

## Signature

```python
ChatSnowflake(
    self,
    model: str = 'llama3.1-70b',
    session: Any = None,
    temperature: float = 0.7,
    max_tokens: int = 4096,
    top_p: float = 1.0,
    warehouse: Optional[str] = None,
    database: Optional[str] = None,
    schema: Optional[str] = None,
    account: Optional[str] = None,
    user: Optional[str] = None,
    password: Optional[SecretStr] = None,
    token: Optional[str] = None,
    private_key_path: Optional[str] = None,
    private_key_passphrase: Optional[str] = None,
    request_timeout: int = 300,
    verify_ssl: bool = True,
    disable_parallel_tool_use: bool = False,
    group_tool_messages: bool = True,
    **kwargs: Any,
)
```

## Description

**Setup:**

Install ``langchain-snowflake`` and configure your Snowflake connection.

.. code-block:: bash

    pip install -U langchain-snowflake

Key init args — completion params:
    model: str
        Name of Snowflake Cortex model to use (e.g., 'llama3.1-70b', 'mistral-large2')
    temperature: float
        Sampling temperature (0.0 to 1.0)
    max_tokens: int
        Max number of tokens to generate (default: 4096)

Key init args — client params:
    session: Optional[Session]
        Active Snowflake session. If not provided, will create from connection params.
    account: Optional[str]
        Snowflake account identifier
    user: Optional[str]
        Snowflake username
    password: Optional[SecretStr]
        Snowflake password
    warehouse: Optional[str]
        Snowflake warehouse to use
    database: Optional[str]
        Snowflake database to use
    schema: Optional[str]
        Snowflake schema to use
    request_timeout: int
        Request timeout in seconds for API calls (default: 300)
    verify_ssl: bool
        Whether to verify SSL certificates (default: True)
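
The client params above can also be pulled from the environment rather than hard-coded. The helper below is a hypothetical sketch for doing that: the ``SNOWFLAKE_*`` variable names are a common convention, not something ``langchain-snowflake`` itself defines, and the function name is ours.

```python
import os

def snowflake_connection_params(env=None):
    """Collect ChatSnowflake connection kwargs from environment variables.

    Hypothetical helper: the SNOWFLAKE_* names are an assumed convention.
    Raises KeyError if a required variable is missing; optional ones are
    included only when set.
    """
    env = os.environ if env is None else env
    required = {"account": "SNOWFLAKE_ACCOUNT",
                "user": "SNOWFLAKE_USER",
                "password": "SNOWFLAKE_PASSWORD"}
    optional = {"warehouse": "SNOWFLAKE_WAREHOUSE",
                "database": "SNOWFLAKE_DATABASE",
                "schema": "SNOWFLAKE_SCHEMA"}
    params = {}
    for kwarg, var in required.items():
        if var not in env:
            raise KeyError(f"missing environment variable {var}")
        params[kwarg] = env[var]
    for kwarg, var in optional.items():
        if var in env:
            params[kwarg] = env[var]
    return params
```

With that in place, instantiation becomes ``ChatSnowflake(model="llama3.1-70b", **snowflake_connection_params())``.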

**Instantiate:**

.. code-block:: python

    from langchain_snowflake import ChatSnowflake

    # Using an existing Snowpark session
    llm = ChatSnowflake(
        model="llama3.1-70b",
        session=session,
        temperature=0.1,
        max_tokens=1000
    )

    # Using connection parameters with network configuration
    llm = ChatSnowflake(
        model="mistral-large2",
        account="your-account",
        user="your-user",
        password="your-password",
        warehouse="your-warehouse",
        temperature=0.0,
        request_timeout=600,  # 10 minutes for long-running operations
        verify_ssl=True       # Always verify SSL in production
    )

**Invoke:**

.. code-block:: python

    messages = [
        ("system", "You are a helpful assistant."),
        ("human", "What is the capital of France?"),
    ]
    response = llm.invoke(messages)
    print(response.content)

**Stream:**

.. code-block:: python

    for chunk in llm.stream(messages):
        print(chunk.content, end="", flush=True)

**Async:**

.. code-block:: python

    response = await llm.ainvoke(messages)
    async for chunk in llm.astream(messages):
        print(chunk.content, end="", flush=True)

**Tool calling:**

.. code-block:: python

    from langchain_core.tools import tool

    @tool
    def get_weather(city: str) -> str:
        '''Get weather for a city.'''
        return f"The weather in {city} is 72°F and sunny."

    llm_with_tools = llm.bind_tools([get_weather])
    messages = [("human", "What's the weather in Paris?")]
    response = llm_with_tools.invoke(messages)
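
When the model decides to call a tool, ``response.tool_calls`` contains dicts with ``name``, ``args``, and ``id`` keys (LangChain's standard tool-call shape); the results then go back to the model as tool messages. The dispatcher below is a minimal sketch of that loop over plain callables — the function name and the handler-mapping pattern are ours, not part of the library.

```python
def dispatch_tool_calls(tool_calls, handlers):
    """Execute each requested tool call and wrap results as tool messages.

    Hypothetical helper: `tool_calls` is assumed to follow LangChain's
    standard shape ({"name": ..., "args": ..., "id": ...}); `handlers`
    maps tool names to plain callables.
    """
    messages = []
    for call in tool_calls:
        result = handlers[call["name"]](**call["args"])
        messages.append({"role": "tool",
                         "content": str(result),
                         "tool_call_id": call["id"]})
    return messages
```

A follow-up turn would then look like ``llm_with_tools.invoke(messages + [response] + tool_messages)`` so the model can compose a final answer from the tool results.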

**Structured output:**

.. code-block:: python

    from typing import Literal
    from pydantic import BaseModel

    class Sentiment(BaseModel):
        sentiment: Literal["positive", "negative", "neutral"]
        confidence: float

    structured_llm = llm.with_structured_output(Sentiment)
    result = structured_llm.invoke("I love this product!")
    print(result.sentiment, result.confidence)

**Response metadata:**

.. code-block:: python

    response = llm.invoke(messages)
    print(response.response_metadata)
    # {'model': 'llama3.1-70b', 'usage': {'prompt_tokens': 10, 'completion_tokens': 5}}
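
Assuming the metadata shape shown above, a small helper can total token usage for cost tracking. This is our own sketch, not a library API; missing keys default to 0 so partial metadata is tolerated.

```python
def total_tokens(response_metadata):
    """Sum prompt and completion tokens from response metadata.

    Hypothetical helper assuming the {'usage': {'prompt_tokens': ...,
    'completion_tokens': ...}} shape shown above.
    """
    usage = response_metadata.get("usage", {})
    return usage.get("prompt_tokens", 0) + usage.get("completion_tokens", 0)
```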

## Extends

- `SnowflakeAuth`
- `SnowflakeStreaming`
- `SnowflakeTools`
- `SnowflakeStructuredOutput`
- `SnowflakeUtils`
- `BaseChatModel`

## Constructors

```python
__init__(
    self,
    model: str = 'llama3.1-70b',
    session: Any = None,
    temperature: float = 0.7,
    max_tokens: int = 4096,
    top_p: float = 1.0,
    warehouse: Optional[str] = None,
    database: Optional[str] = None,
    schema: Optional[str] = None,
    account: Optional[str] = None,
    user: Optional[str] = None,
    password: Optional[SecretStr] = None,
    token: Optional[str] = None,
    private_key_path: Optional[str] = None,
    private_key_passphrase: Optional[str] = None,
    request_timeout: int = 300,
    verify_ssl: bool = True,
    disable_parallel_tool_use: bool = False,
    group_tool_messages: bool = True,
    **kwargs: Any,
)
```

| Name | Type |
|------|------|
| `model` | `str` |
| `session` | `Any` |
| `temperature` | `float` |
| `max_tokens` | `int` |
| `top_p` | `float` |
| `warehouse` | `Optional[str]` |
| `database` | `Optional[str]` |
| `schema` | `Optional[str]` |
| `account` | `Optional[str]` |
| `user` | `Optional[str]` |
| `password` | `Optional[SecretStr]` |
| `token` | `Optional[str]` |
| `private_key_path` | `Optional[str]` |
| `private_key_passphrase` | `Optional[str]` |
| `request_timeout` | `int` |
| `verify_ssl` | `bool` |
| `disable_parallel_tool_use` | `bool` |
| `group_tool_messages` | `bool` |


## Properties

- `model`
- `temperature`
- `max_tokens`
- `top_p`
- `session`
- `warehouse`
- `database`
- `sf_schema`
- `account`
- `user`
- `password`
- `token`
- `private_key_path`
- `private_key_passphrase`
- `max_retries`
- `request_timeout`
- `verify_ssl`
- `ls_structured_output_format`
- `disable_parallel_tool_use`
- `group_tool_messages`

## Methods

- [`get_num_tokens()`](https://reference.langchain.com/python/langchain-snowflake/chat_models/base/ChatSnowflake/get_num_tokens)
- [`get_token_ids()`](https://reference.langchain.com/python/langchain-snowflake/chat_models/base/ChatSnowflake/get_token_ids)

---

[View source on GitHub](https://github.com/langchain-ai/langchain-snowflake/blob/fab0c716e9197e2afb7ee433491251a8ef12b9c4/libs/snowflake/langchain_snowflake/chat_models/base.py#L26)