LangChain Reference
Python • langchain-classic • memory • token_buffer • ConversationTokenBufferMemory
Class • Since v1.0 • Deprecated

    ConversationTokenBufferMemory

    ConversationTokenBufferMemory()

Bases: BaseChatMemory


Inherited from BaseChatMemory

Attributes

• chat_memory: BaseChatMessageHistory
• output_key: str
• input_key: str | None
  The key from the model Run's inputs to use as the eval input.
• return_messages: bool

Methods

• asave_context
  Async save the context of this chain run to memory.
• clear
  Clear memory contents.
• aclear
  Async clear memory contents.

Inherited from BaseMemory

Attributes

• model_config

Methods

• aload_memory_variables
  Async return key-value pairs given the text input to the chain.
• asave_context
  Async save the context of this chain run to memory.
• clear
  Clear memory contents.
• aclear
  Async clear memory contents.

Inherited from Serializable (langchain_core)

Attributes

• lc_secrets
• lc_attributes
• model_config

Methods

• is_lc_serializable
• get_lc_namespace
• lc_id
• to_json
• to_json_not_implemented
Attributes

• human_prefix: str
• ai_prefix: str
• llm: BaseLanguageModel
• memory_key: str
• max_token_limit: int
• buffer: Any
  String buffer of memory.
• buffer_as_str: str
  Exposes the buffer as a string when return_messages is False.
• buffer_as_messages: list[BaseMessage]
  Exposes the buffer as a list of messages when return_messages is True.
• memory_variables: list[str]
  Always returns the list of memory variables.
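The return_messages flag above switches which view of the buffer is exposed: a single formatted string or a list of message objects. A minimal sketch of that switch, using toy classes rather than the real langchain_core types (Message and BufferView are illustrative names, not part of the library):

```python
from dataclasses import dataclass, field


@dataclass
class Message:
    """Toy stand-in for a langchain_core BaseMessage."""
    role: str
    content: str


@dataclass
class BufferView:
    """Illustrates buffer_as_str vs buffer_as_messages (sketch only)."""
    return_messages: bool
    messages: list[Message] = field(default_factory=list)

    @property
    def buffer(self):
        # The exposed buffer switches form based on return_messages.
        return self.buffer_as_messages if self.return_messages else self.buffer_as_str

    @property
    def buffer_as_str(self) -> str:
        # One "Role: content" line per message, joined with newlines.
        return "\n".join(f"{m.role}: {m.content}" for m in self.messages)

    @property
    def buffer_as_messages(self) -> list[Message]:
        return list(self.messages)


view = BufferView(return_messages=False,
                  messages=[Message("Human", "hi"), Message("AI", "hello")])
print(view.buffer)  # "Human: hi\nAI: hello"
```

Consumers that feed the history back into a prompt template typically want the string form; chat-model pipelines set return_messages=True to get structured messages instead.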

Methods

• load_memory_variables
  Return the history buffer.
• save_context
  Save context from this conversation to the buffer, pruning the oldest messages if the token count exceeds max_token_limit.

Conversation chat memory with a token limit.

Keeps only the most recent messages in the conversation, under the constraint that the total number of tokens in the conversation does not exceed max_token_limit.
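The pruning behavior described above can be sketched in plain Python. This is a toy re-implementation for illustration only: the real class delegates token counting to the configured llm, whereas count_tokens here is a hypothetical stand-in that counts whitespace-separated words.

```python
from collections import deque


def count_tokens(message: str) -> int:
    # Hypothetical stand-in for the model's tokenizer:
    # approximate one token per whitespace-separated word.
    return len(message.split())


class TokenBufferSketch:
    """Toy token-limited conversation buffer (illustration only)."""

    def __init__(self, max_token_limit: int) -> None:
        self.max_token_limit = max_token_limit
        self.buffer: deque[str] = deque()

    def save_context(self, human_input: str, ai_output: str) -> None:
        # Append the new turn, then drop the oldest messages until the
        # total token count fits under max_token_limit.
        self.buffer.append(f"Human: {human_input}")
        self.buffer.append(f"AI: {ai_output}")
        total = sum(count_tokens(m) for m in self.buffer)
        while total > self.max_token_limit and self.buffer:
            total -= count_tokens(self.buffer.popleft())


memory = TokenBufferSketch(max_token_limit=8)
memory.save_context("hi there", "hello")           # fits: both kept
memory.save_context("how are you", "fine thanks")  # oldest turn pruned
```

Note that pruning happens on write (in save_context), not on read, which is why the class description says only the most recent messages are ever kept in the buffer.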