langchain.js

    Class OpenAIAgentTokenBufferMemory

    Memory used to save agent output and intermediate steps.



    Properties

    aiPrefix: string = "AI"
    chatHistory: BaseChatMessageHistory
    humanPrefix: string = "Human"
    inputKey?: string
    intermediateStepsKey: string = "intermediateSteps"
    llm: ChatOpenAI
    maxTokenLimit: number = 12000
    memoryKey: string = "history"
    outputKey: string = "output"
    returnMessages: boolean = true
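The humanPrefix and aiPrefix defaults govern how conversation turns are labeled when the history is rendered as a single transcript string (i.e. when returnMessages is false). A minimal self-contained sketch of that labeling, using hypothetical stand-in types rather than the library's own message classes:

```typescript
// Illustrative stand-in for a chat message; the real library uses
// HumanMessage/AIMessage classes rather than this plain shape.
type Message = { role: "human" | "ai"; content: string };

// Render a message list as a prefixed transcript, mirroring the role
// of the humanPrefix/aiPrefix properties documented above.
function formatBuffer(
  messages: Message[],
  humanPrefix = "Human",
  aiPrefix = "AI",
): string {
  return messages
    .map((m) => `${m.role === "human" ? humanPrefix : aiPrefix}: ${m.content}`)
    .join("\n");
}
```

Changing the prefixes only affects this rendered-string form; with returnMessages left at its default of true, message objects are passed through unformatted.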

    Accessors

    • get memoryKeys(): string[]

      Returns string[]

    Methods

    • clear(): Promise<void>

      Method to clear the chat history.

      Returns Promise<void>

      Promise that resolves when the chat history has been cleared.

    • getMessages(): Promise<any>

      Retrieves the messages from the chat history.

      Returns Promise<any>

      Promise that resolves with the messages from the chat history.

    • loadMemoryVariables(_values): Promise<MemoryVariables>

      Loads memory variables from the input values.

      Parameters

      • _values: InputValues

        Input values.

      Returns Promise<MemoryVariables>

      Promise that resolves with the loaded memory variables.
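Because returnMessages defaults to true, the resolved MemoryVariables map the memoryKey ("history") to an array of message objects rather than a pre-formatted string. An illustrative stand-in (hypothetical types, not the library's own) showing how those two settings decide the shape of the resolved value:

```typescript
// Hypothetical stand-ins for the library's message and variable types.
type Message = { role: "human" | "ai"; content: string };
type MemoryVariables = Record<string, Message[] | string>;

// Mimics how memoryKey/returnMessages shape the loadMemoryVariables result.
function toMemoryVariables(
  history: Message[],
  memoryKey = "history",
  returnMessages = true,
): MemoryVariables {
  if (returnMessages) {
    return { [memoryKey]: history }; // raw message objects, for chat models
  }
  // flattened transcript string, for plain-text prompts
  return {
    [memoryKey]: history
      .map((m) => `${m.role === "human" ? "Human" : "AI"}: ${m.content}`)
      .join("\n"),
  };
}
```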

    • saveContext(inputValues, outputValues): Promise<void>

      Saves the context of the chat, including user input, AI output, and intermediate steps. Prunes the chat history if the total token count exceeds the maximum limit.

      Parameters

      • inputValues: InputValues

        Input values.

      • outputValues: OutputValues

        Output values.

      Returns Promise<void>

      Promise that resolves when the context has been saved.
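The pruning step described above can be sketched as a loop that drops the oldest messages until the history fits under maxTokenLimit. This is an illustrative stand-in, not the library's implementation: the real class counts tokens with the configured LLM, so a caller-supplied token counter is assumed here.

```typescript
// Hypothetical stand-in for a chat message.
type Message = { role: string; content: string };

// Drop oldest messages first until the total token count is within budget,
// mirroring the pruning behavior saveContext applies against maxTokenLimit.
function pruneHistory(
  history: Message[],
  maxTokens: number,
  countTokens: (m: Message) => number, // assumed: supplied by the caller
): Message[] {
  const pruned = [...history];
  let total = pruned.reduce((n, m) => n + countTokens(m), 0);
  while (total > maxTokens && pruned.length > 0) {
    total -= countTokens(pruned.shift()!); // evict the oldest message
  }
  return pruned;
}
```

Evicting from the front keeps the most recent turns, which is what a token-buffer memory needs: the tail of the conversation is what the agent's next prompt depends on.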