langchain.js

    To use this model you need to have the node-llama-cpp module installed. It can be installed with npm install -S node-llama-cpp; the minimum supported version is 2.0.0. This also requires that you have a locally downloaded Llama3 model in GGUF format.

    import { ChatLlamaCpp } from "@langchain/community/chat_models/llama_cpp";
    import { HumanMessage } from "@langchain/core/messages";

    // Initialize the ChatLlamaCpp model with the path to the model binary file.
    const model = await ChatLlamaCpp.initialize({
      modelPath: "/Replace/with/path/to/your/model/gguf-llama3-Q4_0.bin",
      temperature: 0.5,
    });

    // Call the model with a message and await the response.
    const response = await model.invoke([
      new HumanMessage({ content: "My name is John." }),
    ]);

    // Log the response to the console.
    console.log({ response });


    Constructors

    • Parameters

      Returns ChatLlamaCpp

    Properties

    _context: LlamaContext
    _model: LlamaModel
    _session: null | LlamaChatSession
    lc_serializable: boolean = true
    maxTokens?: number
    temperature?: number
    topK?: number
    topP?: number
    trimWhitespaceSuffix?: boolean

    Methods

    • Parameters

      • input: BaseMessage[]

      Returns string

    • Parameters

      • messages: BaseMessage[]

      Returns string

    • Parameters

      • messages: BaseMessage[]

      Returns ChatHistoryItem[]

    • Returns string

    • Parameters

      • input: BaseMessage[]
      • _options: unknown
      • Optional runManager: any

      Returns AsyncGenerator<ChatGenerationChunk>
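      The method above streams output as an async generator of chunks. As a rough sketch of how a caller consumes such a generator (the generator here is a stand-in that fakes two chunks, since a real run requires a locally loaded model):

      ```typescript
      // Hedged sketch: consuming an AsyncGenerator of chunks the way a caller
      // consumes the streaming method above. fakeChunks is a hypothetical
      // stand-in; real chunks would come from the model.
      async function* fakeChunks(): AsyncGenerator<{ content: string }> {
        yield { content: "Hello" };
        yield { content: ", world" };
      }

      async function collectText(): Promise<string> {
        let text = "";
        for await (const chunk of fakeChunks()) {
          text += chunk.content; // append each streamed piece as it arrives
        }
        return text;
      }

      collectText().then((t) => console.log(t)); // prints "Hello, world"
      ```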

    • Returns {
          maxTokens: undefined | number;
          temperature: undefined | number;
          topK: undefined | number;
          topP: undefined | number;
          trimWhitespaceSuffix: undefined | boolean;
      }
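      The object above is a snapshot of the model's invocation parameters. As a plain-TypeScript sketch of that shape (the interface name and the values are illustrative, not taken from the library):

      ```typescript
      // Sketch of the return shape listed above; each field is undefined
      // when it was not set on the model. Values here are illustrative only.
      interface LlamaCppInvocationParams {
        maxTokens: undefined | number;
        temperature: undefined | number;
        topK: undefined | number;
        topP: undefined | number;
        trimWhitespaceSuffix: undefined | boolean;
      }

      const params: LlamaCppInvocationParams = {
        maxTokens: undefined,
        temperature: 0.5, // matches the example at the top of the page
        topK: undefined,
        topP: undefined,
        trimWhitespaceSuffix: undefined,
      };

      console.log(params.temperature); // prints 0.5
      ```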

    • Initializes the llama_cpp model for usage in the chat models wrapper.

      Parameters

      • inputs: LlamaBaseCppInputs

        The inputs passed to the model.

      Returns Promise<ChatLlamaCpp>

      A Promise that resolves to an instance of ChatLlamaCpp.

    • Returns string