alibabaApiKey (Optional): API key to use when making requests. Defaults to the value of the ALIBABA_API_KEY environment variable.
enableSearch (Optional)
maxTokens (Optional)
model: Model name to use. Available options are: qwen-turbo, qwen-plus, qwen-max, or other compatible models.
modelName: Alias for model.
prefixMessages (Optional): Messages to pass as a prefix to the prompt.
repetitionPenalty (Optional): Penalizes repeated tokens according to frequency. Range from 1.0 to 2.0. Defaults to 1.0.
seed (Optional)
streaming (Optional): Whether to stream the results or not. Defaults to false.
temperature (Optional): Amount of randomness injected into the response. Ranges from 0 to 1 (0 is not included). Use a temperature closer to 0 for analytical / multiple-choice tasks, and closer to 1 for creative and generative tasks. Defaults to 0.95.
topK (Optional)
topP (Optional): Total probability mass of tokens to consider at each step. Range from 0 to 1.0. Defaults to 0.8.
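The sketch below illustrates how these generation parameters are typically passed together when constructing the model. The import path assumes a recent @langchain/community release, and the values shown are examples only, not recommended defaults.

```typescript
import { ChatAlibabaTongyi } from "@langchain/community/chat_models/alibaba_tongyi";

// Illustrative configuration combining the generation fields listed above.
const qwen = new ChatAlibabaTongyi({
  model: "qwen-plus",
  temperature: 0.7,        // 0 < temperature <= 1; defaults to 0.95
  topP: 0.8,               // nucleus-sampling mass; defaults to 0.8
  repetitionPenalty: 1.1,  // 1.0 to 2.0; defaults to 1.0
  maxTokens: 512,
  streaming: false,
});
```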
identifyingParams(): Get the identifying parameters for the model.
invocationParams(): Get the parameters used to invoke the model.
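As a quick way to inspect what will be sent with each request, the following hedged sketch reuses the instance from the previous example; the exact return shape is an assumption and may vary between versions.

```typescript
// Assuming `qwen` is the ChatAlibabaTongyi instance from the previous sketch,
// invocationParams() exposes the request parameters derived from the fields above.
const params = qwen.invocationParams();
console.log(params);
```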
Static lc_
Wrapper around Ali Tongyi large language models that use the Chat endpoint.
To use, you should have the ALIBABA_API_KEY environment variable set.

Example:
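A minimal usage sketch, assuming the ALIBABA_API_KEY environment variable is set and that the class is imported from @langchain/community/chat_models/alibaba_tongyi (the exact import path may vary by version):

```typescript
import { ChatAlibabaTongyi } from "@langchain/community/chat_models/alibaba_tongyi";
import { HumanMessage } from "@langchain/core/messages";

// The API key is read from ALIBABA_API_KEY unless alibabaApiKey is passed explicitly.
const chat = new ChatAlibabaTongyi({
  model: "qwen-turbo",
  temperature: 0.7,
});

const response = await chat.invoke([
  new HumanMessage("Translate 'hello' into French."),
]);
console.log(response.content);
```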