langchain.js

    Interface FakeListChatModelCallOptions

    Represents the call options for the FakeListChatModel, extending the call options for a base chat model.

    interface FakeListChatModelCallOptions {
        callbacks?: Callbacks;
        configurable?: Record<string, any>;
        ls_structured_output_format?: {
            kwargs: { method: string };
            schema?: JsonSchema7Type;
        };
        maxConcurrency?: number;
        metadata?: Record<string, unknown>;
        recursionLimit?: number;
        runId?: string;
        runName?: string;
        signal?: AbortSignal;
        stop?: string[];
        tags?: string[];
        thrownErrorString?: string;
        timeout?: number;
        tool_choice?: ToolChoice;
    }
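
    The sketch below shows these options being passed to a FakeListChatModel; it assumes the class is exported from "@langchain/core/utils/testing" (as in recent langchain.js versions) and that it returns its canned responses in order.

    import { FakeListChatModel } from "@langchain/core/utils/testing";

    // A fake chat model that replies with the given responses, in order.
    const model = new FakeListChatModel({
        responses: ["first canned reply", "second canned reply"],
    });

    // Call options from this interface go in the second argument to invoke().
    const result = await model.invoke("Hello", {
        stop: ["\n"],
        tags: ["unit-test"],
        metadata: { suite: "fake-model" },
        timeout: 5000,
    });
    console.log(result.content); // "first canned reply"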

    Properties

    callbacks?: Callbacks

    Callbacks for this call and any sub-calls (e.g. a Chain calling an LLM). Tags are passed to all callbacks; metadata is passed to handle*Start callbacks.
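
    As a sketch of the above (reusing the model from the earlier example), an inline handler object can log when the call starts; passing a plain handler object in the callbacks array is an assumption based on the Callbacks type accepting handler methods.

    // Sketch: an inline callback handler; tags and metadata from this call are forwarded to it.
    await model.invoke("Hello", {
        callbacks: [
            {
                handleLLMStart: async (_llm, prompts) => {
                    console.log("Model call starting with prompts:", prompts);
                },
            },
        ],
        tags: ["traced"],
        metadata: { caller: "docs-example" },
    });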

    configurable?: Record<string, any>

    Runtime values for attributes previously made configurable on this Runnable, or sub-Runnables.

    ls_structured_output_format?: {
        kwargs: { method: string };
        schema?: JsonSchema7Type;
    }

    Describes the format of structured outputs. This should be provided if the output is considered to be structured.

    Type Declaration

    • kwargs: { method: string }

      An object containing the method used for structured output (e.g., "jsonMode").

    • Optional schema?: JsonSchema7Type

      The JSON schema describing the expected output structure.
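
    A hypothetical example of this option (reusing the model from the first example); the fake model does not act on it, but the shape follows the type declaration above.

    // Hypothetical: declare that the output is expected to be JSON matching a small schema.
    await model.invoke("Reply with JSON", {
        ls_structured_output_format: {
            kwargs: { method: "jsonMode" },
            schema: {
                type: "object",
                properties: { answer: { type: "string" } },
                required: ["answer"],
            },
        },
    });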

    maxConcurrency?: number

    Maximum number of parallel calls to make.

    metadata?: Record<string, unknown>

    Metadata for this call and any sub-calls (e.g. a Chain calling an LLM). Keys should be strings; values should be JSON-serializable.

    recursionLimit?: number

    Maximum number of times a call can recurse. If not provided, defaults to 25.

    runId?: string

    Unique identifier for the tracer run for this call. If not provided, a new UUID will be generated.

    runName?: string

    Name for the tracer run for this call. Defaults to the name of the class.

    signal?: AbortSignal

    Abort signal for this call. If provided, the call will be aborted when the signal is aborted.
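
    A sketch of wiring an AbortController to this option (reusing the model from the first example); whether an in-flight call is actually interrupted depends on how far it has progressed.

    // Sketch: pass an AbortController's signal so the call can be cancelled from outside.
    const controller = new AbortController();
    const pending = model.invoke("Hello", { signal: controller.signal });

    // Elsewhere (for example on a timeout or user action) the call can be aborted:
    setTimeout(() => controller.abort(), 1000);

    await pending; // rejects with an abort error if the signal fired before completion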

    stop?: string[]

    Stop tokens to use for this call. If not provided, the default stop tokens for the model will be used.

    tags?: string[]

    Tags for this call and any sub-calls (e.g. a Chain calling an LLM). You can use these to filter calls.

    thrownErrorString?: string

    If provided, the fake model throws an error with this message instead of returning a response, which makes it easy to exercise error-handling paths in tests.
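
    A sketch of using this option to simulate a failure (assuming the throwing behaviour described above; reuses the model from the first example).

    // Sketch: force the fake model to fail so error-handling code can be tested.
    try {
        await model.invoke("Hello", {
            thrownErrorString: "Simulated provider outage",
        });
    } catch (err) {
        console.error(err); // Error: Simulated provider outage
    }
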
    timeout?: number

    Timeout for this call in milliseconds.

    tool_choice?: ToolChoice

    Specifies how the chat model should use tools.

    Possible values:
    - "auto": The model may choose to use any of the provided tools, or none.
    - "any": The model must use one of the provided tools.
    - "none": The model must not use any tools.
    - A string (not "auto", "any", or "none"): The name of a specific tool the model must use.
    - An object: A custom schema specifying tool choice parameters. Specific to the provider.

    Note: Not all providers support tool_choice. An error will be thrown
    if used with an unsupported model.
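
    Illustrative values for tool_choice (a sketch; "get_weather" is a hypothetical tool name, and whether a given model accepts these values depends on the provider; the fake model accepts them by type but does not call tools).

    // Illustrative tool_choice values; comments restate the behaviours listed above.
    await model.invoke("What's the weather?", { tool_choice: "auto" });        // may use any provided tool, or none
    await model.invoke("What's the weather?", { tool_choice: "any" });         // must use one of the provided tools
    await model.invoke("What's the weather?", { tool_choice: "none" });        // must not use any tools
    await model.invoke("What's the weather?", { tool_choice: "get_weather" }); // must use the named tool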