# ConversationTokenBufferMemory

> **Class** in `langchain_classic`

📖 [View in docs](https://reference.langchain.com/python/langchain-classic/memory/token_buffer/ConversationTokenBufferMemory)

Conversation chat memory with token limit.

Keeps only the most recent messages in the conversation, pruning the oldest
messages so that the total token count of the buffer does not exceed `max_token_limit`.
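The pruning behavior can be sketched in plain Python. This is a minimal mock, not the real `langchain_classic` class: it counts tokens by whitespace splitting where the real memory delegates counting to the bound `llm`, and the class and helper names here are hypothetical.

```python
class TokenBufferSketch:
    """Illustrative mock of a token-limited conversation buffer."""

    def __init__(self, max_token_limit=2000, human_prefix="Human", ai_prefix="AI"):
        self.max_token_limit = max_token_limit
        self.human_prefix = human_prefix
        self.ai_prefix = ai_prefix
        self.buffer = []  # list of (prefix, text) pairs, oldest first

    def _count_tokens(self, text):
        # Stand-in tokenizer: whitespace word count. The real class asks
        # the LLM for its token count instead.
        return len(text.split())

    def _total_tokens(self):
        return sum(self._count_tokens(text) for _, text in self.buffer)

    def save_context(self, inputs, outputs):
        # Append the new human/AI turn, then drop messages from the
        # front until the buffer fits under max_token_limit.
        self.buffer.append((self.human_prefix, inputs["input"]))
        self.buffer.append((self.ai_prefix, outputs["output"]))
        while self.buffer and self._total_tokens() > self.max_token_limit:
            self.buffer.pop(0)  # oldest message goes first

    def load_memory_variables(self, inputs=None):
        # Render the surviving turns as one string, analogous to buffer_as_str.
        history = "\n".join(f"{prefix}: {text}" for prefix, text in self.buffer)
        return {"history": history}


mem = TokenBufferSketch(max_token_limit=8)
mem.save_context({"input": "hi there friend"}, {"output": "hello to you"})
mem.save_context({"input": "what is up"}, {"output": "not much here"})
# The first turn (6 tokens) is pruned to keep the total under 8 tokens.
print(mem.load_memory_variables()["history"])
```

Note that pruning happens on `save_context`, so a single oversized turn can still momentarily exceed the limit before older messages are evicted.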

## Signature

```python
ConversationTokenBufferMemory()
```

## Extends

- `BaseChatMemory`

## Properties

- `human_prefix`
- `ai_prefix`
- `llm`
- `memory_key`
- `max_token_limit`
- `buffer`
- `buffer_as_str`
- `buffer_as_messages`
- `memory_variables`

## Methods

- [`load_memory_variables()`](https://reference.langchain.com/python/langchain-classic/memory/token_buffer/ConversationTokenBufferMemory/load_memory_variables)
- [`save_context()`](https://reference.langchain.com/python/langchain-classic/memory/token_buffer/ConversationTokenBufferMemory/save_context)

## ⚠️ Deprecated

Deprecated since version 0.3.1.

---

[View source on GitHub](https://github.com/langchain-ai/langchain/blob/02991cb4cf2063d51a07268edafb05fe53de1826/libs/langchain/langchain_classic/memory/token_buffer.py#L11)