langchain.js

    Interface ChatXAIInput

    interface ChatXAIInput {
        apiKey?: string;
        maxTokens?: number;
        model?: string;
        stop?: string[];
        stopSequences?: string[];
        streaming?: boolean;
        temperature?: number;
    }
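    The options above are passed to the ChatXAI constructor. A minimal sketch follows, assuming the ChatXAI class exported by the @langchain/xai package; the concrete values are illustrative.

    import { ChatXAI } from "@langchain/xai";

    // Construct a chat model from ChatXAIInput options.
    // apiKey may be omitted when XAI_API_KEY is set in the environment.
    const model = new ChatXAI({
        model: "grok-beta",
        temperature: 0.7,
        maxTokens: 1024,
        // apiKey: "xai-...",
    });

    const response = await model.invoke("Say hello in one sentence.");
    console.log(response.content);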

    Properties

    apiKey?: string

    The xAI API key to use for requests.

    Default: process.env.XAI_API_KEY
    
    maxTokens?: number

    The maximum number of tokens that the model can process in a single response. This limit ensures computational efficiency and resource management.

    model?: string

    The name of the model to use.

    "grok-beta"
    
    stop?: string[]

    Up to 4 sequences where the API will stop generating further tokens. The returned text will not contain the stop sequence. Alias for stopSequences.

    stopSequences?: string[]

    Up to 4 sequences where the API will stop generating further tokens. The returned text will not contain the stop sequence.
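    For illustration, a hedged sketch of halting generation at a custom sequence; since stop is an alias for stopSequences, either field name can be used (the values here are illustrative).

    import { ChatXAI } from "@langchain/xai";

    // Generation stops as soon as any listed sequence would be produced;
    // the stop sequence itself is not included in the returned text.
    const stoppedModel = new ChatXAI({
        model: "grok-beta",
        stopSequences: ["\nObservation:"],
    });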

    streaming?: boolean

    Whether or not to stream responses.
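    A short sketch of consuming a streamed response, assuming the standard LangChain .stream() method on ChatXAI:

    import { ChatXAI } from "@langchain/xai";

    // Stream the response chunk by chunk instead of waiting for the full message.
    const streamingModel = new ChatXAI({ model: "grok-beta", streaming: true });
    const stream = await streamingModel.stream("Write a haiku about the sun.");
    for await (const chunk of stream) {
        process.stdout.write(String(chunk.content));
    }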

    temperature?: number

    The temperature to use for sampling.

    Default: 0.7