    langchain.js: FileSystemChatMessageHistory

    Store chat message history using a local JSON file. For demo and development purposes only.

    import { ChatOpenAI } from "@langchain/openai";
    import { ChatPromptTemplate } from "@langchain/core/prompts";
    import { StringOutputParser } from "@langchain/core/output_parsers";
    import { RunnableWithMessageHistory } from "@langchain/core/runnables";
    import { FileSystemChatMessageHistory } from "@langchain/community/stores/message/file_system";

    const model = new ChatOpenAI({
      model: "gpt-3.5-turbo",
      temperature: 0,
    });

    const prompt = ChatPromptTemplate.fromMessages([
      [
        "system",
        "You are a helpful assistant. Answer all questions to the best of your ability.",
      ],
      ["placeholder", "{chat_history}"],
      ["human", "{input}"],
    ]);

    const chain = prompt.pipe(model).pipe(new StringOutputParser());

    const chainWithHistory = new RunnableWithMessageHistory({
      runnable: chain,
      inputMessagesKey: "input",
      historyMessagesKey: "chat_history",
      // Return a JSON-file-backed history for each session id.
      getMessageHistory: async (sessionId) => {
        const chatHistory = new FileSystemChatMessageHistory({
          sessionId,
          userId: "userId", // Optional
        });
        return chatHistory;
      },
    });

    await chainWithHistory.invoke(
      { input: "What did I just say my name was?" },
      { configurable: { sessionId: "session-id" } }
    );
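
    Because the history is an ordinary object backed by a JSON file, it can also be used directly, outside of RunnableWithMessageHistory, to inspect or reset what a session has stored. A minimal sketch, assuming the same @langchain/community import path as above and a working directory the process can write to:

    import { HumanMessage, AIMessage } from "@langchain/core/messages";
    import { FileSystemChatMessageHistory } from "@langchain/community/stores/message/file_system";

    // Open (or create) the history for a given session.
    const history = new FileSystemChatMessageHistory({ sessionId: "session-id" });

    // Append messages manually and read them back.
    await history.addMessage(new HumanMessage("My name is Alice."));
    await history.addMessage(new AIMessage("Nice to meet you, Alice!"));
    const messages = await history.getMessages();
    console.log(messages.length); // 2

    // Remove the stored messages for this session only.
    await history.clear();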


    Properties

    lc_namespace: string[] = ...

    Methods

    addMessage(message: BaseMessage): Promise<void>

    clear(): Promise<void>

    clearAllSessions(): Promise<void>

    getAllSessions(): Promise<FileChatSession[]>

    getContext(): Promise<Record<string, unknown>>

    getMessages(): Promise<BaseMessage[]>

    loadStore(): Promise<FileChatStore>

    saveStore(): Promise<void>

    setContext(context: Record<string, unknown>): Promise<void>
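
    Taken together, the session-level methods let an application attach metadata to a conversation and enumerate or delete everything stored in the file. A short sketch, assuming the setContext / getContext / getAllSessions / clearAllSessions signatures listed above and the sessionId / userId constructor options from the example; the exact shape of the stored context is an assumption here:

    import { FileSystemChatMessageHistory } from "@langchain/community/stores/message/file_system";

    const history = new FileSystemChatMessageHistory({
      sessionId: "session-id",
      userId: "userId", // Optional
    });

    // Attach arbitrary metadata to this session, e.g. a display title.
    await history.setContext({ title: "Demo conversation" });
    console.log(await history.getContext()); // includes { title: "Demo conversation" }

    // Enumerate the stored sessions, then delete them all.
    const sessions = await history.getAllSessions();
    console.log(sessions.length);
    await history.clearAllSessions();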