langchain.js

    ChatOllama

    Deprecated in favor of the @langchain/ollama package. Import from @langchain/ollama instead.
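
    For new code, the replacement class can be imported from the standalone package. A minimal migration sketch, assuming the constructor fields used in the example below carry over unchanged:

    import { ChatOllama } from "@langchain/ollama";

    const model = new ChatOllama({
      baseUrl: "http://localhost:11434",
      model: "llama2",
    });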

    A class that enables calls to the Ollama API to access large language models in a chat-like fashion. It extends the SimpleChatModel class and implements the OllamaInput interface.

    import { ChatPromptTemplate } from "@langchain/core/prompts";
    import { ChatOllama } from "@langchain/community/chat_models/ollama";

    const prompt = ChatPromptTemplate.fromMessages([
      [
        "system",
        `You are an expert translator. Format all responses as JSON objects with two keys: "original" and "translated".`,
      ],
      ["human", `Translate "{input}" into {language}.`],
    ]);

    const model = new ChatOllama({
      baseUrl: "http://api.example.com",
      model: "llama2",
      format: "json",
    });

    const chain = prompt.pipe(model);

    const result = await chain.invoke({
      input: "I love programming",
      language: "German",
    });
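
    Because format: "json" constrains Ollama to return valid JSON, the message content can be parsed directly. A minimal sketch continuing the example above; the "original" and "translated" keys are whatever the system prompt instructed the model to produce:

    // result is an AIMessage; with format: "json" its content is a JSON string.
    const parsed = JSON.parse(result.content as string);
    console.log(parsed.original, parsed.translated);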

    Hierarchy

      SimpleChatModel
        ChatOllama

    Implements

      OllamaInput

    Constructors

    • new ChatOllama(fields: any): ChatOllama

      Parameters

      • fields: any

      Returns ChatOllama

    Properties

    baseUrl: string = "http://localhost:11434"
    embeddingOnly?: boolean
    f16KV?: boolean
    format?: any
    frequencyPenalty?: number
    headers?: Record<string, string>
    keepAlive: string = "5m"
    lc_serializable: boolean = true
    logitsAll?: boolean
    lowVram?: boolean
    mainGpu?: number
    mirostat?: number
    mirostatEta?: number
    mirostatTau?: number
    model: string = "llama2"
    numBatch?: number
    numCtx?: number
    numGpu?: number
    numGqa?: number
    numKeep?: number
    numPredict?: number
    numThread?: number
    penalizeNewline?: boolean
    presencePenalty?: number
    repeatLastN?: number
    repeatPenalty?: number
    ropeFrequencyBase?: number
    ropeFrequencyScale?: number
    stop?: string[]
    temperature?: number
    tfsZ?: number
    topK?: number
    topP?: number
    typicalP?: number
    useMLock?: boolean
    useMMap?: boolean
    vocabOnly?: boolean
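
    For illustration, a minimal sketch passing a few of these properties at construction time; the values are arbitrary, and any field with a default listed above can be omitted:

    import { ChatOllama } from "@langchain/community/chat_models/ollama";

    const tunedModel = new ChatOllama({
      baseUrl: "http://localhost:11434", // default value listed above
      model: "llama2",                   // default value listed above
      temperature: 0.2,
      topP: 0.9,
      numCtx: 4096,
      keepAlive: "5m",
      stop: ["\n\n"],
      headers: { Authorization: "Bearer <token>" }, // hypothetical header value
    });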

    Methods

    • Returns {}

    • Parameters

      • messages: BaseMessage[]

      Returns OllamaMessage[]

    • Parameters

      • messages: BaseMessage[]

      Returns string

    • Returns string

    • Parameters

      • input: BaseMessage[]
      • options: unknown
      • Optional runManager: any

      Returns AsyncGenerator<ChatGenerationChunk>

    • Parameters

      • input: BaseMessage[]
      • options: unknown
      • Optional runManager: any

      Returns AsyncGenerator<ChatGenerationChunk>
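
    These generators back the model's public .stream() method, which yields partial message chunks as they arrive. A minimal usage sketch, given a ChatOllama instance like the model constructed above; note that .stream() surfaces message chunks rather than raw ChatGenerationChunk objects:

    const stream = await model.stream("Why is the sky blue?");

    let output = "";
    for await (const chunk of stream) {
      // Accumulate streamed content; guard the type since content can be non-string.
      output += typeof chunk.content === "string" ? chunk.content : "";
    }
    console.log(output);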

    • Parameters

      • options: unknown

      Returns LangSmithParams

    • A method that returns the parameters for an Ollama API call, including the model name, response format, keep-alive setting, and an options object assembled from the model's properties.

      Parameters

      • Optional options: unknown

        Optional parsed call options.

      Returns {
          format: any;
          keep_alive: string;
          model: string;
          options: {
              embedding_only: undefined | boolean;
              f16_kv: undefined | boolean;
              frequency_penalty: undefined | number;
              logits_all: undefined | boolean;
              low_vram: undefined | boolean;
              main_gpu: undefined | number;
              mirostat: undefined | number;
              mirostat_eta: undefined | number;
              mirostat_tau: undefined | number;
              num_batch: undefined | number;
              num_ctx: undefined | number;
              num_gpu: undefined | number;
              num_gqa: undefined | number;
              num_keep: undefined | number;
              num_predict: undefined | number;
              num_thread: undefined | number;
              penalize_newline: undefined | boolean;
              presence_penalty: undefined | number;
              repeat_last_n: undefined | number;
              repeat_penalty: undefined | number;
              rope_frequency_base: undefined | number;
              rope_frequency_scale: undefined | number;
              stop: any;
              temperature: undefined | number;
              tfs_z: undefined | number;
              top_k: undefined | number;
              top_p: undefined | number;
              typical_p: undefined | number;
              use_mlock: undefined | boolean;
              use_mmap: undefined | boolean;
              vocab_only: undefined | boolean;
          };
      }

      An object containing the parameters for an Ollama API call.
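
      A minimal sketch of inspecting these parameters. The method name is not visible in this extract, so invocationParams is assumed here; the mapping comments reflect the return type shown above, where camelCase class properties become snake_case option keys:

      // Assumed method name: invocationParams (not shown in this extract).
      const params = model.invocationParams();

      // camelCase class properties map to snake_case keys under `options`,
      // e.g. frequencyPenalty -> options.frequency_penalty, numCtx -> options.num_ctx.
      console.log(params.model);               // "llama2"
      console.log(params.options.temperature); // undefined unless set at construction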

    • Returns string