langchain.js

    Class ConversationTokenBufferMemory

    Class that represents a conversation chat memory with a token buffer. It extends the BaseChatMemory class and implements the ConversationTokenBufferMemoryInput interface.

    const memory = new ConversationTokenBufferMemory({
    llm: new ChatOpenAI({ model: "gpt-4o-mini" }),
    maxTokenLimit: 10,
    });

    // Save conversation context
    await memory.saveContext({ input: "hi" }, { output: "whats up" });
    await memory.saveContext({ input: "not much you" }, { output: "not much" });

    // Load memory variables
    const result = await memory.loadMemoryVariables({});
    console.log(result);

    Properties

    aiPrefix: string = "AI"
    chatHistory: BaseChatMessageHistory
    humanPrefix: string = "Human"
    inputKey?: string
    llm: BaseLanguageModelInterface
    maxTokenLimit: number = 2000
    memoryKey: string = "history"
    outputKey?: string
    returnMessages: boolean = false

    Accessors

    • get memoryKeys(): string[]

      Returns string[]

    Methods

    • clear()

      Clears the chat history.

      Returns Promise<void>

      Promise that resolves when the chat history has been cleared.

    • loadMemoryVariables(_values)

      Loads the memory variables for the current conversation.

      Parameters

      • _values: InputValues

        InputValues object.

      Returns Promise<MemoryVariables>

      A Promise that resolves with a MemoryVariables object.
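      When returnMessages is false (the default), the loaded history is serialized into a single buffer string using humanPrefix and aiPrefix, keyed by memoryKey. A minimal, self-contained sketch of that formatting, using a simplified message shape rather than LangChain's own message classes:

      ```typescript
      // Simplified message shape; the real class uses BaseMessage subclasses.
      type SimpleMessage = { role: "human" | "ai"; text: string };

      // Join messages into one buffer string, mirroring the default
      // humanPrefix ("Human") and aiPrefix ("AI") behavior.
      function getBufferString(
        messages: SimpleMessage[],
        humanPrefix = "Human",
        aiPrefix = "AI",
      ): string {
        return messages
          .map((m) => `${m.role === "human" ? humanPrefix : aiPrefix}: ${m.text}`)
          .join("\n");
      }

      const history: SimpleMessage[] = [
        { role: "human", text: "hi" },
        { role: "ai", text: "whats up" },
      ];

      // The loaded memory variable is keyed by memoryKey ("history" by default).
      const result = { history: getBufferString(history) };
      console.log(result); // { history: "Human: hi\nAI: whats up" }
      ```

      Setting returnMessages to true instead yields the message objects themselves, which is the usual choice when the downstream prompt is a chat prompt rather than a plain string.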

    • saveContext(inputValues, outputValues)

      Saves the context from this conversation to the buffer. If the number of tokens required to store the buffer exceeds maxTokenLimit, the oldest messages are pruned.

      Parameters

      • inputValues: InputValues
      • outputValues: OutputValues

      Returns Promise<void>
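      The pruning behavior can be illustrated with a self-contained sketch: after each save, the oldest messages are dropped first-in-first-out until the buffer's token count falls back under maxTokenLimit. The countTokens function below is a hypothetical stand-in (a whitespace word count); the real class asks its llm to count tokens in the serialized buffer.

      ```typescript
      type Message = { text: string };

      // Hypothetical stand-in for the LLM's token counter; the real memory
      // delegates token counting to the configured language model.
      const countTokens = (msgs: Message[]): number =>
        msgs.reduce((n, m) => n + m.text.split(/\s+/).length, 0);

      // Drop the oldest messages until the buffer fits within maxTokenLimit.
      function prune(buffer: Message[], maxTokenLimit: number): Message[] {
        const pruned = [...buffer];
        while (pruned.length > 0 && countTokens(pruned) > maxTokenLimit) {
          pruned.shift(); // FIFO: discard the earliest message first
        }
        return pruned;
      }

      const buffer: Message[] = [
        { text: "hi" },
        { text: "whats up" },
        { text: "not much you" },
        { text: "not much" },
      ];

      // With a limit of 5 "tokens", the two oldest messages are pruned.
      const kept = prune(buffer, 5);
      console.log(kept.map((m) => m.text)); // ["not much you", "not much"]
      ```

      This is why a small maxTokenLimit (such as 10 in the example above) keeps only the tail of the conversation: older turns are discarded as new context is saved.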