langchain.js

    Groq chat model integration.

    The Groq API is compatible with the OpenAI API, with some limitations. See the Groq API reference for full details.
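
    A minimal usage sketch, assuming `@langchain/groq` is installed and a `GROQ_API_KEY` environment variable is set; the model name here is illustrative:

    ```typescript
    import { ChatGroq } from "@langchain/groq";

    // Ask the model a question and return the text of its reply.
    // Assumes GROQ_API_KEY is set in the environment.
    export async function ask(question: string): Promise<string> {
      const model = new ChatGroq({
        model: "llama-3.1-8b-instant", // illustrative model name
        temperature: 0.7,
      });
      const response = await model.invoke(question); // resolves to an AIMessage
      return String(response.content);
    }

    // Usage (requires a valid API key):
    // console.log(await ask("Why is the sky blue?"));
    ```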

    Hierarchy


    Constructors

    Properties

    apiKey?: string
    client: Groq
    frequencyPenalty: undefined | null | number
    lc_namespace: string[] = ...
    lc_serializable: boolean = true
    logitBias: undefined | null | Record<string, number>
    logprobs: undefined | null | boolean
    maxTokens?: number
    model: string
    n: undefined | null | number
    presencePenalty: undefined | null | number
    reasoningFormat: undefined | null | "hidden" | "raw" | "parsed"
    serviceTier: undefined | null | "auto" | "on_demand" | "flex"
    stop?: string[]
    stopSequences?: string[]
    streaming: boolean = false
    streamUsage: boolean = true
    temperature: number = 0.7
    topLogprobs: undefined | null | number
    topP: undefined | null | number
    user: undefined | null | string
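
    Most of the properties above can be set at construction time. A hedged configuration sketch (the model name and values are illustrative):

    ```typescript
    import { ChatGroq } from "@langchain/groq";

    // Configure generation behavior via constructor fields; apiKey may also
    // be supplied through the GROQ_API_KEY environment variable.
    const model = new ChatGroq({
      apiKey: process.env.GROQ_API_KEY ?? "YOUR_API_KEY",
      model: "llama-3.1-8b-instant", // illustrative model name
      temperature: 0.2,              // default is 0.7
      maxTokens: 256,                // cap on tokens generated per call
      streaming: false,              // default; set true to stream chunks
    });
    ```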

    Accessors

    • get callKeys(): any[]

      Returns any[]

    • get lc_secrets(): undefined | { [key: string]: string }

      Returns undefined | { [key: string]: string }

    • get lc_serialized_keys(): string[]

      Returns string[]

    • get profile(): ModelProfile

      Return profiling information for the model.

      Provides information about the model's capabilities and constraints, including token limits, multimodal support, and advanced features like tool calling and structured output.

      Returns ModelProfile

      An object describing the model's capabilities and constraints

      const model = new ChatGroq({ model: "llama-3.1-8b-instant" });
      const profile = model.profile;
      console.log(profile.maxInputTokens); // 128000
      console.log(profile.imageInputs); // true

    Methods

    • _generate

      Parameters

      • messages: BaseMessage[]
      • options: unknown
      • Optional runManager: any

      Returns Promise<ChatResult>

    • _generateNonStreaming

      Parameters

      • messages: BaseMessage[]
      • options: unknown
      • Optional _runManager: any

      Returns Promise<ChatResult>

    • _llmType

      Returns string

    • _streamResponseChunks

      Parameters

      • messages: BaseMessage[]
      • options: unknown
      • Optional runManager: any

      Returns AsyncGenerator<ChatGenerationChunk>
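
    This generator backs the public `.stream()` method, which is the usual entry point. A hedged streaming sketch (model name illustrative; running it requires a valid API key):

    ```typescript
    import { ChatGroq } from "@langchain/groq";

    // Stream a reply chunk-by-chunk and concatenate the text.
    // Assumes GROQ_API_KEY is set in the environment.
    export async function streamReply(prompt: string): Promise<string> {
      const model = new ChatGroq({ model: "llama-3.1-8b-instant" });
      let text = "";
      for await (const chunk of await model.stream(prompt)) {
        text += String(chunk.content); // each chunk is an AIMessageChunk
      }
      return text;
    }

    // Usage: console.log(await streamReply("Tell me a short joke."));
    ```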

    • bindTools

      Parameters

      • tools: any[]
      • Optional kwargs: any

      Returns Runnable<BaseLanguageModelInput, AIMessageChunk, any>
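
    A tool-binding sketch; `get_weather` is a hypothetical tool defined here only for illustration:

    ```typescript
    import { ChatGroq } from "@langchain/groq";

    // Bind a tool definition so the model can emit tool calls.
    // Assumes GROQ_API_KEY is set in the environment.
    export async function askWithTools(question: string) {
      const model = new ChatGroq({ model: "llama-3.1-8b-instant" });
      const modelWithTools = model.bindTools([
        {
          type: "function",
          function: {
            name: "get_weather", // hypothetical tool, for illustration
            description: "Get the current weather for a city",
            parameters: {
              type: "object",
              properties: { city: { type: "string" } },
              required: ["city"],
            },
          },
        },
      ]);
      const result = await modelWithTools.invoke(question);
      return result.tool_calls; // populated when the model decides to call a tool
    }
    ```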

    • completionWithRetry

      Parameters

      • request: ChatCompletionCreateParamsStreaming
      • Optional options: RequestOptions

      Returns Promise<AsyncIterable<ChatCompletionChunk, any, any>>

    • completionWithRetry

      Parameters

      • request: ChatCompletionCreateParamsNonStreaming
      • Optional options: RequestOptions

      Returns Promise<ChatCompletion>

    • getLsParams

      Parameters

      • options: unknown

      Returns LangSmithParams

    • invocationParams

      Parameters

      • options: unknown
      • Optional extra: { streaming?: boolean }

      Returns Omit<ChatCompletionCreateParams, "messages">

    • withStructuredOutput

      Type Parameters

      • RunOutput extends Record<string, any> = Record<string, any>

      Parameters

      • outputSchema: any
      • Optional config: any

      Returns Runnable<BaseLanguageModelInput, RunOutput>

    • withStructuredOutput

      Type Parameters

      • RunOutput extends Record<string, any> = Record<string, any>

      Parameters

      • outputSchema: any
      • Optional config: any

      Returns Runnable<BaseLanguageModelInput, { parsed: RunOutput; raw: BaseMessage }>
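
    A structured-output sketch using a zod schema, assuming `zod` is installed alongside `@langchain/groq` (model name illustrative):

    ```typescript
    import { ChatGroq } from "@langchain/groq";
    import { z } from "zod";

    // Describe the desired output shape with zod; the returned runnable
    // yields parsed objects instead of raw messages.
    export const Joke = z.object({
      setup: z.string().describe("The setup of the joke"),
      punchline: z.string().describe("The punchline"),
    });

    // Assumes GROQ_API_KEY is set in the environment.
    export async function tellJoke(topic: string) {
      const model = new ChatGroq({ model: "llama-3.1-8b-instant" });
      const structured = model.withStructuredOutput(Joke);
      // Resolves to an object matching the schema: { setup, punchline }
      return structured.invoke(`Tell me a joke about ${topic}.`);
    }
    ```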

    • Static lc_name

      Returns string