langchain.js

    To use this model you need to have the @mlc-ai/web-llm module installed. It can be installed with npm install -S @mlc-ai/web-llm.

    You can see a list of available model records here: https://github.com/mlc-ai/web-llm/blob/main/src/config.ts

    // Import the chat model and the message class used below.
    import { ChatWebLLM } from "@langchain/community/chat_models/webllm";
    import { HumanMessage } from "@langchain/core/messages";

    // Initialize the ChatWebLLM model with the model record.
    const model = new ChatWebLLM({
      model: "Phi-3-mini-4k-instruct-q4f16_1-MLC",
      chatOptions: {
        temperature: 0.5,
      },
    });

    // Call the model with a message and await the response.
    const response = await model.invoke([
      new HumanMessage({ content: "My name is John." }),
    ]);
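Because ChatWebLLM implements the standard LangChain chat-model interface, streaming also works through the usual .stream() method (backed by the _streamResponseChunks method documented below). A minimal sketch, assuming a WebGPU-capable browser since WebLLM runs the model client-side:

```typescript
import { ChatWebLLM } from "@langchain/community/chat_models/webllm";

const streamingModel = new ChatWebLLM({
  model: "Phi-3-mini-4k-instruct-q4f16_1-MLC",
});

// Iterate over the response as it is generated, chunk by chunk.
const stream = await streamingModel.stream("Tell me a short joke.");
for await (const chunk of stream) {
  console.log(chunk.content);
}
```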


    Constructors

    • new ChatWebLLM(inputs: WebLLMInputs)

      Parameters

      • inputs: WebLLMInputs

      Returns ChatWebLLM

    Properties

    appConfig?: AppConfig
    chatOptions?: ChatOptions
    engine: MLCEngine
    model: string
    temperature?: number
    inputs: WebLLMInputs

    Methods

    • _call(messages, options, runManager?)

      Parameters

      • messages: BaseMessage[]
      • options: unknown
      • Optional runManager: any

      Returns Promise<string>

    • _llmType()

      Returns string

    • _streamResponseChunks(messages, options, runManager?)

      Parameters

      • messages: BaseMessage[]
      • options: unknown
      • Optional runManager: any

      Returns AsyncGenerator<ChatGenerationChunk>

    • initialize(progressCallback?)

      Parameters

      • Optional progressCallback: InitProgressCallback

      Returns Promise<void>

    • reload(modelId, newChatOpts?)

      Parameters

      • modelId: string
      • Optional newChatOpts: ChatOptions

      Returns Promise<void>

    • lc_name() (static)

      Returns string
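The initialize and reload methods above control model loading: initialize downloads and compiles the weights (optionally reporting progress via @mlc-ai/web-llm's InitProgressCallback), while reload swaps in a different model record on the same instance. A hedged sketch, assuming a WebGPU-capable browser; the second model ID is illustrative and should be replaced with one from the model records list linked above:

```typescript
import { ChatWebLLM } from "@langchain/community/chat_models/webllm";

const model = new ChatWebLLM({
  model: "Phi-3-mini-4k-instruct-q4f16_1-MLC",
});

// Download and compile the model weights, logging load progress.
await model.initialize((report) => {
  console.log(`Loading: ${report.text}`);
});

// Later, switch to a different model record (hypothetical ID) with new options.
await model.reload("Llama-3-8B-Instruct-q4f16_1-MLC", { temperature: 0.2 });
```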