ChatTencentHunyuan

    Wrapper around Tencent Hunyuan large language models that use the Chat endpoint.

    To use, you should have the TENCENT_SECRET_ID and TENCENT_SECRET_KEY environment variables set.

    import { ChatTencentHunyuan } from "@langchain/community/chat_models/tencent_hunyuan";
    import { HumanMessage } from "@langchain/core/messages";

    const messages = [new HumanMessage("Hello")];

    const hunyuanLite = new ChatTencentHunyuan({
      model: "hunyuan-lite",
      tencentSecretId: "YOUR-SECRET-ID",
      tencentSecretKey: "YOUR-SECRET-KEY",
    });

    let res = await hunyuanLite.call(messages);

    const hunyuanPro = new ChatTencentHunyuan({
      model: "hunyuan-pro",
      temperature: 1,
      tencentSecretId: "YOUR-SECRET-ID",
      tencentSecretKey: "YOUR-SECRET-KEY",
    });

    res = await hunyuanPro.call(messages);
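
    If the TENCENT_SECRET_ID and TENCENT_SECRET_KEY environment variables are set, the credential fields can be omitted. A minimal sketch (the variable names and prompt are illustrative):

    // Assumes TENCENT_SECRET_ID and TENCENT_SECRET_KEY are set in the environment,
    // so no explicit credentials are passed to the constructor.
    const hunyuanFromEnv = new ChatTencentHunyuan({
      model: "hunyuan-lite",
    });

    const envRes = await hunyuanFromEnv.call([new HumanMessage("Hello")]);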

    Hierarchy

    • ChatTencentHunyuan
      • ChatTencentHunyuan
    Index

    Constructors

    • Parameters

      • Optional fields: any

      Returns ChatTencentHunyuan

    Properties

    host: string = "hunyuan.tencentcloudapi.com"

    Tencent Cloud API Host.

    Default: "hunyuan.tencentcloudapi.com"
    
    lc_serializable: boolean = true

    model: string = "hunyuan-pro"

    Model name to use.

    Default: "hunyuan-pro"
    
    sign: sign

    Tencent Cloud API v3 sign method.

    streaming: boolean = false

    Whether to stream the results or not. Defaults to false.

    Default: false
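
    When streaming is enabled, tokens can be consumed as they arrive through a callback handler. A minimal sketch, assuming the same illustrative credentials as above (handleLLMNewToken is the standard LangChain.js streaming callback):

    const streamingModel = new ChatTencentHunyuan({
      model: "hunyuan-lite",
      streaming: true,
      tencentSecretId: "YOUR-SECRET-ID",
      tencentSecretKey: "YOUR-SECRET-KEY",
    });

    // Each generated token is passed to handleLLMNewToken as it is produced.
    await streamingModel.call([new HumanMessage("Hello")], {
      callbacks: [
        {
          handleLLMNewToken(token: string) {
            console.log(token);
          },
        },
      ],
    });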
    
    temperature?: number

    Amount of randomness injected into the response. Ranges from 0.0 to 2.0. Use a temperature closer to 0 for analytical / multiple-choice tasks, and closer to 1 for creative and generative tasks. Defaults to 1.0.

    tencentSecretId?: string

    Secret ID to use when making requests. Can be obtained from https://console.cloud.tencent.com/cam/capi. Defaults to the value of the TENCENT_SECRET_ID environment variable.

    tencentSecretKey?: string

    Secret key to use when making requests. Can be obtained from https://console.cloud.tencent.com/cam/capi. Defaults to the value of the TENCENT_SECRET_KEY environment variable.

    topP?: number

    Total probability mass of tokens to consider at each step. Ranges from 0 to 1.0. Defaults to 1.0.
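
    Temperature and topP can be combined to tune sampling per instance. A minimal sketch (the values shown are illustrative assumptions, not recommendations):

    // Illustrative configuration: a low temperature for analytical tasks,
    // with topP left at the default total probability mass.
    const analyticalModel = new ChatTencentHunyuan({
      model: "hunyuan-pro",
      temperature: 0.1,
      topP: 1.0,
      tencentSecretId: "YOUR-SECRET-ID",
      tencentSecretKey: "YOUR-SECRET-KEY",
    });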

    Accessors

    • get callKeys(): string[]

      Returns string[]

    • get lc_aliases(): undefined | { [key: string]: string }

      Returns undefined | { [key: string]: string }

    • get lc_secrets(): undefined | { [key: string]: string }

      Returns undefined | { [key: string]: string }

    Methods

    • Returns string

    • Parameters

      • messages: BaseMessage[]
      • options: unknown
      • Optional runManager: any

      Returns AsyncGenerator<ChatGenerationChunk>

    • Get the HTTP headers used to invoke the model

      Parameters

      • request: object
      • timestamp: number

      Returns HeadersInit

    • Get the parameters used to invoke the model

      Returns Omit<ChatCompletionRequest, "Messages">

    • Returns string