# set_llm_cache

> **Function** in `langchain_core`

📖 [View in docs](https://reference.langchain.com/python/langchain-core/globals/set_llm_cache)

Set a new LLM cache, overwriting the previous value, if any.

## Signature

```python
set_llm_cache(
    value: Optional[BaseCache],
) -> None
```

## Parameters

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `value` | `Optional[BaseCache]` | Yes | The new LLM cache to use. If `None`, the LLM cache is disabled. |
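
## Example

A minimal usage sketch, assuming `InMemoryCache` from `langchain_core.caches` as the cache implementation:

```python
from langchain_core.caches import InMemoryCache
from langchain_core.globals import set_llm_cache

# Enable caching of LLM responses with a simple in-memory cache.
set_llm_cache(InMemoryCache())

# ... repeated identical LLM calls can now be served from the cache ...

# Passing None disables the global LLM cache again.
set_llm_cache(None)
```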

---

[View source on GitHub](https://github.com/langchain-ai/langchain/blob/f0c5a28fa05adcda89aebcb449d897245ab21fa4/libs/core/langchain_core/globals.py#L56)