langchain-classic › agents › openai_functions_agent › agent_token_buffer_memory › AgentTokenBufferMemory

Class · Since v1.0

AgentTokenBufferMemory

AgentTokenBufferMemory()

Bases: BaseChatMemory

Memory used to save agent output AND intermediate steps.
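A minimal construction sketch, assuming a langchain-classic install whose import path mirrors the pre-1.0 langchain.agents layout and langchain-openai for the chat model (with OPENAI_API_KEY set); any chat model works, since the llm is only used to count tokens when the buffer is pruned. The constructor parameters are documented below.

    from langchain_classic.agents.openai_functions_agent.agent_token_buffer_memory import (
        AgentTokenBufferMemory,
    )
    from langchain_openai import ChatOpenAI

    llm = ChatOpenAI(model="gpt-4o-mini")  # illustrative model choice

    memory = AgentTokenBufferMemory(
        llm=llm,                  # used to count tokens when pruning the buffer
        memory_key="history",     # key under which the buffer is exposed to the prompt
        max_token_limit=2000,     # prune oldest messages once the buffer exceeds this
        return_messages=True,     # return BaseMessage objects rather than one string
    )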

Attributes

human_prefix · ai_prefix · llm · memory_key · max_token_limit · return_messages · output_key · intermediate_steps_key · format_as_tools · buffer · memory_variables

Methods

load_memory_variables · save_context

Inherited from BaseChatMemory

Attributes

chat_memory: BaseChatMessageHistory

input_key: str | None
  The key from the model Run's inputs to use as the eval input.

Methods

asave_context
  Async save the context of this chain run to memory.

clear
  Clear memory contents.

aclear
  Async clear memory contents.

Inherited from BaseMemory

Attributes

model_config

Methods

aload_memory_variables
  Async return key-value pairs given the text input to the chain.

asave_context
  Async save the context of this chain run to memory.

clear
  Clear memory contents.

aclear
  Async clear memory contents.
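The async variants can be awaited directly on the memory object; a small sketch, continuing from the memory constructed above:

    import asyncio

    async def reset_memory() -> None:
        # aclear() empties the underlying chat_memory, like clear() does synchronously
        await memory.aclear()

    asyncio.run(reset_memory())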

Inherited from Serializable (langchain_core)

Attributes

lc_secrets · lc_attributes · model_config

Methods

is_lc_serializable · get_lc_namespace · lc_id · to_json · to_json_not_implemented

Parameters

human_prefix
  Prefix for human messages.

ai_prefix
  Prefix for AI messages.

llm
  Language model.

memory_key
  Key to save memory under.

max_token_limit
  Maximum number of tokens to keep in the buffer. Once the buffer exceeds this many tokens, the oldest messages will be pruned.

return_messages
  Whether to return messages.

output_key
  Key to save output under.

intermediate_steps_key
  Key to save intermediate steps under.

format_as_tools
  Whether to format as tools.
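For orientation, this is roughly how the memory is attached to an OpenAI functions agent so that save_context receives both the final output and the intermediate steps. It is a hedged sketch reusing the llm and memory from the construction example above; the langchain_classic.agents import path and helper names are assumptions about a langchain-classic install, not something stated on this page.

    from langchain_classic.agents import AgentExecutor, create_openai_functions_agent
    from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
    from langchain_core.tools import tool

    @tool
    def get_weather(city: str) -> str:
        """Return a canned weather report (toy tool for this sketch)."""
        return f"It is sunny in {city}."

    prompt = ChatPromptTemplate.from_messages(
        [
            ("system", "You are a helpful assistant."),
            MessagesPlaceholder("history"),           # filled from memory_key="history"
            ("human", "{input}"),
            MessagesPlaceholder("agent_scratchpad"),  # required by the agent constructor
        ]
    )

    agent = create_openai_functions_agent(llm, [get_weather], prompt)
    executor = AgentExecutor(
        agent=agent,
        tools=[get_weather],
        memory=memory,
        return_intermediate_steps=True,  # so save_context also receives intermediate_steps
    )
    executor.invoke({"input": "What's the weather in Paris?"})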
attribute human_prefix: str

attribute ai_prefix: str

attribute llm: BaseLanguageModel

attribute memory_key: str

attribute max_token_limit: int
  The max number of tokens to keep in the buffer. Once the buffer exceeds this many tokens, the oldest messages will be pruned.

attribute return_messages: bool

attribute output_key: str

attribute intermediate_steps_key: str

attribute format_as_tools: bool

attribute buffer: list[BaseMessage]
  String buffer of memory.

attribute memory_variables: list[str]
  Always return list of memory variables.
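A quick sketch of reading these attributes on the memory object constructed earlier:

    print(memory.memory_variables)  # ["history"], i.e. [memory_key]
    print(len(memory.buffer))       # number of messages currently held
    print(memory.buffer[-2:])       # the most recent BaseMessage objects, if any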

method load_memory_variables
  Return history buffer.

method save_context
  Save context from this conversation to the buffer; the oldest messages are pruned once the buffer exceeds max_token_limit.
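A sketch of calling the two methods directly (outside an agent executor), again reusing the memory from the construction example; the AgentAction values are illustrative, and the dict keys assume the default output_key ("output") and intermediate_steps_key ("intermediate_steps"):

    from langchain_core.agents import AgentAction

    memory.save_context(
        {"input": "What's the weather in Paris?"},
        {
            "output": "It is sunny in Paris.",
            "intermediate_steps": [
                # (AgentAction, observation) pairs produced during the agent run
                (AgentAction(tool="get_weather", tool_input="Paris", log=""), "sunny"),
            ],
        },
    )

    history = memory.load_memory_variables({})["history"]
    # history is a list[BaseMessage] because return_messages=True; once the buffer's
    # token count exceeds max_token_limit, the oldest messages are pruned on save.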
