# AgentTokenBufferMemory

> **Class** in `langchain_classic`

📖 [View in docs](https://reference.langchain.com/python/langchain-classic/agents/openai_functions_agent/agent_token_buffer_memory/AgentTokenBufferMemory)

Memory used to save agent output AND intermediate steps.

## Signature

```python
AgentTokenBufferMemory()
```

## Parameters

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `human_prefix` | `str` | No | Prefix for human messages. |
| `ai_prefix` | `str` | No | Prefix for AI messages. |
| `llm` | `BaseLanguageModel` | Yes | Language model used to count tokens in the buffer. |
| `memory_key` | `str` | No | Key to save memory under. |
| `max_token_limit` | `int` | No | Maximum number of tokens to keep in the buffer. Once the buffer exceeds this limit, the oldest messages are pruned. |
| `return_messages` | `bool` | No | Whether to return the history as a list of messages rather than a single string. |
| `output_key` | `str` | No | Key to save output under. |
| `intermediate_steps_key` | `str` | No | Key to save intermediate steps under. |
| `format_as_tools` | `bool` | No | Whether to format intermediate steps as tool messages rather than function messages. |
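The pruning behavior described for `max_token_limit` can be sketched in plain Python. This is a minimal illustration, not the library's implementation: the real class counts tokens with the configured `llm`, while the `count_tokens` helper here is a hypothetical whitespace-based stand-in.

```python
def count_tokens(message: str) -> int:
    # Hypothetical stand-in for the LLM's token counter.
    return len(message.split())

def prune_buffer(buffer: list[str], max_token_limit: int) -> list[str]:
    """Drop the oldest messages until the buffer fits within the limit."""
    pruned = list(buffer)
    while pruned and sum(count_tokens(m) for m in pruned) > max_token_limit:
        pruned.pop(0)  # oldest message is removed first
    return pruned

history = ["hello there", "how are you today", "fine thanks"]
trimmed = prune_buffer(history, max_token_limit=5)
# Oldest messages are dropped until the total fits the limit.
```

Note that pruning removes whole messages from the front of the buffer; a single message is never truncated mid-way.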

## Extends

- `BaseChatMemory`

## Properties

- `human_prefix`
- `ai_prefix`
- `llm`
- `memory_key`
- `max_token_limit`
- `return_messages`
- `output_key`
- `intermediate_steps_key`
- `format_as_tools`
- `buffer`
- `memory_variables`

## Methods

- [`load_memory_variables()`](https://reference.langchain.com/python/langchain-classic/agents/openai_functions_agent/agent_token_buffer_memory/AgentTokenBufferMemory/load_memory_variables)
- [`save_context()`](https://reference.langchain.com/python/langchain-classic/agents/openai_functions_agent/agent_token_buffer_memory/AgentTokenBufferMemory/save_context)
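The contract between these two methods can be sketched with a hypothetical plain-Python stand-in: `save_context()` records the human input, the agent's intermediate steps, and its final output, and `load_memory_variables()` returns the stored history under `memory_key`. This is an illustration of the data flow only; the real methods live on the class, build proper message objects, and prune by token count via `llm`.

```python
class TokenBufferSketch:
    # Hypothetical stand-in for AgentTokenBufferMemory's save/load contract.
    def __init__(self, memory_key="history", output_key="output",
                 intermediate_steps_key="intermediate_steps"):
        self.memory_key = memory_key
        self.output_key = output_key
        self.intermediate_steps_key = intermediate_steps_key
        self.messages = []

    def save_context(self, inputs: dict, outputs: dict) -> None:
        # Record the human input, then the agent's work.
        self.messages.append(("human", inputs["input"]))
        # Intermediate steps are saved alongside the final output.
        for step in outputs.get(self.intermediate_steps_key, []):
            self.messages.append(("ai", f"step: {step}"))
        self.messages.append(("ai", outputs[self.output_key]))

    def load_memory_variables(self, inputs: dict) -> dict:
        # Return the accumulated history under the configured key.
        return {self.memory_key: self.messages}

mem = TokenBufferSketch()
mem.save_context({"input": "What is 2+2?"},
                 {"output": "4", "intermediate_steps": ["calculator(2+2)"]})
variables = mem.load_memory_variables({})
```

Saving intermediate steps is what distinguishes this memory from a plain chat buffer: on the next turn the agent sees not just its final answers but the tool calls that produced them.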

---

[View source on GitHub](https://github.com/langchain-ai/langchain/blob/02991cb4cf2063d51a07268edafb05fe53de1826/libs/langchain/langchain_classic/agents/openai_functions_agent/agent_token_buffer_memory.py#L16)