langchain.js

    To use this model you need to have the node-llama-cpp module installed. It can be installed with npm install -S node-llama-cpp; the minimum supported version is 2.0.0. This also requires that you have a locally built version of Llama3 installed.
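
    A minimal usage sketch (the import path and the placeholder model path below are assumptions; adjust them to your installation):

      import { LlamaCpp } from "@langchain/community/llms/llama_cpp";

      // Path to a locally built GGUF model file (placeholder; replace with your own).
      const llamaPath = "/path/to/your/llama3-model.gguf";

      const model = new LlamaCpp({ modelPath: llamaPath });

      // invoke() returns the generated completion as a string.
      const response = await model.invoke("Tell me a short story about a happy Llama.");
      console.log(response);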


    Constructors

    • Parameters

      Returns LlamaCpp

    Properties

    _context: LlamaContext
    _gbnf: undefined | LlamaGrammar
    _jsonSchema: undefined | LlamaJsonSchemaGrammar<GbnfJsonSchema>
    _model: LlamaModel
    _session: LlamaChatSession
    lc_serializable: boolean = true
    maxTokens?: number
    temperature?: number
    topK?: number
    topP?: number
    trimWhitespaceSuffix?: boolean
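
    The optional sampling properties above are typically set through the constructor inputs; a sketch, assuming the option names match the property names listed here:

      import { LlamaCpp } from "@langchain/community/llms/llama_cpp";

      const model = new LlamaCpp({
        modelPath: "/path/to/your/llama3-model.gguf", // placeholder path
        temperature: 0.7,           // higher values give more varied output
        topK: 40,                   // sample only from the 40 most likely tokens
        topP: 0.9,                  // nucleus sampling threshold
        maxTokens: 256,             // cap on the number of generated tokens
        trimWhitespaceSuffix: true, // strip trailing whitespace from completions
      });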

    Methods

    • Returns string

    • Parameters

      • prompt: string
      • _options: unknown
      • Optional runManager: any

      Returns AsyncGenerator<GenerationChunk>
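
      Because this method yields an AsyncGenerator of GenerationChunks, output can be consumed incrementally through the standard stream() interface; a sketch, assuming the same setup as above:

        import { LlamaCpp } from "@langchain/community/llms/llama_cpp";

        const model = new LlamaCpp({
          modelPath: "/path/to/your/llama3-model.gguf", // placeholder path
        });

        // Iterate over chunks as they are produced instead of waiting for the full completion.
        const stream = await model.stream("Write a haiku about llamas.");
        for await (const chunk of stream) {
          process.stdout.write(chunk);
        }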

    • Initializes the llama_cpp model for usage.

      Parameters

      Returns Promise<LlamaCpp>

      A Promise that resolves to an initialized instance of the LlamaCpp class.
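
      A sketch of this asynchronous initialization path (apart from modelPath, the option names are assumptions):

        import { LlamaCpp } from "@langchain/community/llms/llama_cpp";

        // initialize() sets up the underlying LlamaModel, LlamaContext and
        // LlamaChatSession before returning the ready-to-use instance.
        const model = await LlamaCpp.initialize({
          modelPath: "/path/to/your/llama3-model.gguf", // placeholder path
          temperature: 0.5,
        });

        const answer = await model.invoke("Where do llamas come from?");
        console.log(answer);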

    • Returns string