    Class ConversationSummaryBufferMemory

    Class that extends BaseConversationSummaryMemory and implements ConversationSummaryBufferMemoryInput. It manages the conversation history in a LangChain application by keeping a buffer of recent chat messages; once the buffer grows past maxTokenLimit, the oldest messages are condensed into a running summary (movingSummaryBuffer). It provides methods to load, save, prune, and clear the memory.

    import { ChatOpenAI } from "@langchain/openai";
    import {
      ChatPromptTemplate,
      HumanMessagePromptTemplate,
      MessagesPlaceholder,
      SystemMessagePromptTemplate,
    } from "@langchain/core/prompts";
    import { ConversationChain } from "langchain/chains";
    import { ConversationSummaryBufferMemory } from "langchain/memory";

    // Initialize the memory with a chat model and a deliberately small token
    // limit, so that older turns are summarized almost immediately
    const memory = new ConversationSummaryBufferMemory({
      llm: new ChatOpenAI({ model: "gpt-4o-mini", temperature: 0 }),
      maxTokenLimit: 10,
    });

    // Save conversation context to memory
    await memory.saveContext({ input: "hi" }, { output: "whats up" });
    await memory.saveContext({ input: "not much you" }, { output: "not much" });

    // Load the conversation history from memory
    const history = await memory.loadMemoryVariables({});
    console.log({ history });

    // Create a chat prompt using the conversation history
    const chatPrompt = ChatPromptTemplate.fromMessages([
      SystemMessagePromptTemplate.fromTemplate(
        "The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.",
      ),
      new MessagesPlaceholder("history"),
      HumanMessagePromptTemplate.fromTemplate("{input}"),
    ]);

    // Initialize the conversation chain with the model, memory, and prompt
    const chain = new ConversationChain({
      llm: new ChatOpenAI({ model: "gpt-4o-mini", temperature: 0.9, verbose: true }),
      memory,
      prompt: chatPrompt,
    });
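
    The example stops at construction. As a minimal usage sketch (assuming OPENAI_API_KEY is set in the environment), the chain can then be invoked with the prompt's input variable; ConversationChain writes its reply under the default response key and saves the turn back into memory:

    const res = await chain.invoke({ input: "What did I just say?" });
    console.log(res.response);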

    Hierarchy

    • BaseConversationSummaryMemory
      • ConversationSummaryBufferMemory

    Implements

    • ConversationSummaryBufferMemoryInput

    Constructors

    • new ConversationSummaryBufferMemory(fields: ConversationSummaryBufferMemoryInput): ConversationSummaryBufferMemory

    Properties

    aiPrefix: string = "AI"
    chatHistory: BaseChatMessageHistory
    humanPrefix: string = "Human"
    inputKey?: string
    llm: BaseLanguageModelInterface
    maxTokenLimit: number = 2000
    memoryKey: string = "history"
    movingSummaryBuffer: string = ""
    outputKey?: string
    prompt: BasePromptTemplate = SUMMARY_PROMPT
    returnMessages: boolean = false
    summaryChatMessageClass: new (content: string) => BaseMessage = SystemMessage
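
    Most of these properties can be set at construction time through ConversationSummaryBufferMemoryInput. A hedged sketch of a chat-oriented configuration, reusing the imports from the example above (returnMessages: true is the usual choice when the memory feeds a MessagesPlaceholder, and summaryChatMessageClass controls which message type carries the running summary):

    import { AIMessage } from "@langchain/core/messages";

    // Return BaseMessage objects instead of a formatted string, and emit the
    // running summary as an AIMessage rather than the default SystemMessage
    const chatMemory = new ConversationSummaryBufferMemory({
      llm: new ChatOpenAI({ model: "gpt-4o-mini", temperature: 0 }),
      maxTokenLimit: 2000,
      memoryKey: "history",
      returnMessages: true,
      summaryChatMessageClass: AIMessage,
    });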

    Accessors

    • get memoryKeys(): string[]

      Returns string[]
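
      As a usage note, the getter simply exposes the configured memoryKey:

        console.log(memory.memoryKeys); // ["history"] with the default memoryKey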

    Methods

    • clear(): Promise<void>

      Method that clears the memory and resets the movingSummaryBuffer.

      Returns Promise<void>

      Promise that resolves when the memory is cleared.
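
      For example, resetting the memory between independent sessions (a sketch reusing the memory instance from above; with the defaults, the history then loads back empty):

        await memory.clear();
        console.log(await memory.loadMemoryVariables({})); // { history: "" }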

    • loadMemoryVariables(_?: any): Promise<MemoryVariables>

      Method that loads the chat messages from the memory and returns them either as a single formatted string or as a list of messages, depending on the returnMessages property.

      Parameters

      • Optional _: any

        InputValues object, not used in this method.

      Returns Promise<MemoryVariables>

      Promise that resolves with a MemoryVariables object containing the loaded chat messages.
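
      A sketch of the two output shapes (the key follows memoryKey; the exact string formatting shown is illustrative):

        const vars = await memory.loadMemoryVariables({});
        // returnMessages: false (default) -> vars.history is a single string, e.g.
        //   "Human: not much you\nAI: not much"
        // returnMessages: true -> vars.history is a BaseMessage[] instead, led by a
        // summary message (summaryChatMessageClass) when movingSummaryBuffer is non-empty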

    • predictNewSummary(messages: BaseMessage[], existingSummary: string): Promise<string>

      Predicts a new summary for the conversation given the existing messages and summary.

      Parameters

      • messages: BaseMessage[]

        Existing messages in the conversation.

      • existingSummary: string

        Current summary of the conversation.

      Returns Promise<string>

      A promise that resolves to a new summary string.
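
      This is the step that prune relies on to fold evicted messages into the summary, but it can also be called directly. A sketch (message classes from @langchain/core/messages; the existing summary may be the empty string):

        import { AIMessage, HumanMessage } from "@langchain/core/messages";

        const newSummary = await memory.predictNewSummary(
          [new HumanMessage("hi"), new AIMessage("whats up")],
          memory.movingSummaryBuffer, // "" when no summary exists yet
        );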

    • prune(): Promise<void>

      Method that prunes the memory if the total number of tokens in the buffer exceeds maxTokenLimit. It removes messages from the beginning of the buffer, summarizing them into the movingSummaryBuffer, until the total token count is within the limit.

      Returns Promise<void>

      Promise that resolves when the memory is pruned.
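
      A sketch of the effect with the tiny maxTokenLimit from the example above, where a single exchange already overflows the buffer:

        // saveContext already calls prune(), but it can also be invoked directly
        await memory.prune();
        console.log(memory.movingSummaryBuffer); // summary of the evicted messages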

    • saveContext(inputValues: InputValues, outputValues: OutputValues): Promise<void>

      Method that saves the context of the conversation, including the input and output values, and prunes the memory if it exceeds the maximum token limit.

      Parameters

      • inputValues: InputValues

        InputValues object containing the input values of the conversation.

      • outputValues: OutputValues

        OutputValues object containing the output values of the conversation.

      Returns Promise<void>

      Promise that resolves when the context is saved and the memory is pruned.
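
      A usage sketch (when inputKey/outputKey are unset, the single key of each object is used):

        await memory.saveContext(
          { input: "What is LangChain?" },
          { output: "A framework for building LLM applications." },
        );
        // The turn is appended to chatHistory; if the buffer now exceeds
        // maxTokenLimit, prune() folds the oldest messages into the summary.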