langchain.js

    Class ZepMemory

    Class used to manage the memory of a chat session, including loading and saving the chat history and clearing the memory when needed. It uses the ZepClient to interact with the Zep service that stores the session's memory.

    The class provides options for handling different LLM requirements:

    • Use separateMessages=true (default) for models that fully support system messages
    • Use separateMessages=false for models like Claude that have limitations with system messages

    Example

    // Imports assumed for this example; entry points may differ across langchain versions
    import { randomUUID } from "node:crypto";
    import { ZepMemory } from "@langchain/community/memory/zep";
    import { ChatOpenAI } from "@langchain/openai";
    import { ConversationChain } from "langchain/chains";

    const sessionId = randomUUID();
    const zepURL = "http://your-zep-url";

    // Initialize ZepMemory with session ID, base URL, and API key
    const memory = new ZepMemory({
      sessionId,
      baseURL: zepURL,
      apiKey: "change_this_key",
      // Set to false for models like Claude that have limitations with system messages
      // Defaults to true for backward compatibility
      separateMessages: false,
    });

    // Create a ChatOpenAI model instance with specific parameters
    const model = new ChatOpenAI({
      model: "gpt-3.5-turbo",
      temperature: 0,
    });

    // Create a ConversationChain with the model and memory
    const chain = new ConversationChain({ llm: model, memory });

    // Example of calling the chain with an input
    const res1 = await chain.call({ input: "Hi! I'm Jim." });
    console.log({ res1 });

    // Follow-up call to the chain to demonstrate memory usage
    const res2 = await chain.call({ input: "What did I just say my name was?" });
    console.log({ res2 });

    // Output the session ID and the current state of memory
    console.log("Session ID: ", sessionId);
    console.log("Memory: ", await memory.loadMemoryVariables({}));

    Hierarchy

    • BaseChatMemory
      • ZepMemory

    Properties

    aiPrefix: string = "AI"
    baseURL: string
    humanPrefix: string = "Human"
    memoryKey: string = "history"
    separateMessages: boolean

    Whether to return separate messages for chat history with a SystemMessage containing facts and summary, or return a single HumanMessage with the entire memory context. Defaults to true (preserving message types) for backward compatibility.

    Keep as true for models that fully support system messages. Set to false for models like Claude that have limitations with system messages.

    sessionId: string
    zepClientPromise: Promise<ZepClient>
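
    The prefix and key properties control how the stored history is surfaced to a chain: memoryKey is the variable name under which the history is injected into prompts, while humanPrefix and aiPrefix label the speakers when the history is rendered as text. A minimal sketch of overriding them (the import path and values below are illustrative and may vary by setup):

    import { ZepMemory } from "@langchain/community/memory/zep";

    // Hypothetical configuration overriding the defaults listed above.
    const customMemory = new ZepMemory({
      sessionId: "my-session-id", // any stable identifier for the conversation
      baseURL: "http://your-zep-url",
      apiKey: "change_this_key",
      memoryKey: "chat_history", // prompt variable name instead of the default "history"
      humanPrefix: "User",
      aiPrefix: "Assistant",
    });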

    Accessors

    • get memoryKeys(): string[]

      Returns string[]
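
      Typically this is just the configured memory key; a sketch of reading it (assuming the default memoryKey):

      console.log(memory.memoryKeys); // e.g. ["history"] with the default memoryKey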

    Methods

    • clear

      Method that deletes the chat history from the Zep service.

      Returns Promise<void>

      Promise that resolves when the chat history has been deleted.
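
      For example, a session's stored history can be removed once a conversation ends. A minimal sketch, reusing the memory instance from the example above:

      // Delete all messages for this session from the Zep server.
      await memory.clear();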

    • loadMemoryVariables

      Method that retrieves the chat history from the Zep service and formats it into a list of messages.

      Parameters

      • values: InputValues

        Input values for the method.

      Returns Promise<MemoryVariables>

      Promise that resolves with the chat history formatted into a list of messages.
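
      This is the method a chain calls to fill its prompt variables before each turn. A sketch of calling it directly; the exact shape of the result depends on separateMessages and the configured memoryKey (the default "history" is assumed here):

      const vars = await memory.loadMemoryVariables({});
      // With the default memoryKey the history is available under "history":
      // a list of messages, or a single formatted string, depending on configuration.
      console.log(vars.history);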

    • saveContext

      Method that saves the input and output messages to the Zep service.

      Parameters

      • inputValues: InputValues

        Input messages to be saved.

      • outputValues: OutputValues

        Output messages to be saved.

      Returns Promise<void>

      Promise that resolves when the messages have been saved.
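
      Chains call this automatically after each turn, but it can also be invoked directly. A minimal sketch with hypothetical input and output keys:

      // Persist one human/AI exchange to the Zep session.
      await memory.saveContext(
        { input: "Hi! I'm Jim." },
        { output: "Nice to meet you, Jim." }
      );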