ChatMistralAI
Bases: BaseChatModel
top_p: Decode using nucleus sampling: consider the smallest set of tokens whose probability sum is at least top_p. Must be in the closed interval [0.0, 1.0].
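The selection rule itself is easy to illustrate in isolation. The helper and toy distribution below are purely illustrative; the actual sampling happens on Mistral's servers, and top_p is simply forwarded as a request parameter (for example, ChatMistralAI(model="mistral-large-latest", top_p=0.9), where the model name is an assumption for the example).

```python
# Toy illustration of the nucleus-sampling token-set selection described above.
# The distribution is made up; real sampling happens server-side.
def nucleus(probs: dict[str, float], top_p: float) -> list[str]:
    """Return the smallest set of tokens whose probability sum is >= top_p."""
    kept, total = [], 0.0
    for token, p in sorted(probs.items(), key=lambda kv: kv[1], reverse=True):
        kept.append(token)
        total += p
        if total >= top_p:
            break
    return kept

probs = {"the": 0.45, "a": 0.25, "this": 0.15, "that": 0.10, "those": 0.05}
print(nucleus(probs, top_p=0.8))  # ['the', 'a', 'this'] -- then renormalized and sampled from
```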
A catch-all attribute holds any invocation parameters not explicitly specified.
Build extra kwargs from additional params that were passed in.
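As a rough sketch of that pattern (not the library's actual code), keyword arguments that are not declared fields can be split off into a catch-all dict; the field set and the "model_kwargs" name below are assumptions made for illustration.

```python
# Hypothetical sketch of gathering unrecognized keywords into a catch-all dict.
KNOWN_FIELDS = {"model", "temperature", "top_p", "max_tokens"}  # illustrative subset

def build_extra(params: dict) -> dict:
    """Split params into declared fields plus a catch-all 'model_kwargs' dict."""
    declared = {k: v for k, v in params.items() if k in KNOWN_FIELDS}
    extra = {k: v for k, v in params.items() if k not in KNOWN_FIELDS}
    return {**declared, "model_kwargs": extra}

print(build_extra({"model": "mistral-large-latest", "some_vendor_option": 1}))
# {'model': 'mistral-large-latest', 'model_kwargs': {'some_vendor_option': 1}}
```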
Use tenacity to retry the completion call.
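A hedged sketch of what such a retry wrapper can look like with tenacity; the attempt count, wait policy, retried exception type, and the completion callable are placeholders rather than the library's actual configuration.

```python
import httpx
from tenacity import retry, retry_if_exception_type, stop_after_attempt, wait_exponential

@retry(
    retry=retry_if_exception_type(httpx.HTTPError),  # placeholder exception type
    stop=stop_after_attempt(3),                      # placeholder attempt count
    wait=wait_exponential(multiplier=1, min=1, max=10),
    reraise=True,
)
def completion_with_retry(call, **kwargs):
    """Invoke a completion callable, retrying transient HTTP failures."""
    return call(**kwargs)
```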
Validate that the API key is set, the required Python package is installed, and that temperature and top_p fall within their allowed ranges.
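The checks below are an illustrative stand-in for that validation, assuming a MISTRAL_API_KEY environment variable and the mistralai SDK; the exact names and bounds the library enforces may differ.

```python
import importlib.util
import os

def validate_environment(temperature: float, top_p: float) -> None:
    """Illustrative validation only; not the library's actual implementation."""
    if not os.environ.get("MISTRAL_API_KEY"):
        raise ValueError("MISTRAL_API_KEY is not set.")
    if importlib.util.find_spec("mistralai") is None:
        raise ImportError("The 'mistralai' package is required: pip install mistralai")
    if not 0.0 <= top_p <= 1.0:
        raise ValueError("top_p must be in the closed interval [0.0, 1.0].")
    if temperature < 0.0:
        raise ValueError("temperature must be non-negative.")
```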
Bind tool-like objects to this chat model.
Assumes the model is compatible with the OpenAI tool-calling API.
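A minimal sketch of tool binding, assuming the langchain-mistralai and langchain-core packages are installed and MISTRAL_API_KEY is set; the tool and model name are made up for the example.

```python
from langchain_core.tools import tool
from langchain_mistralai import ChatMistralAI

@tool
def get_weather(city: str) -> str:
    """Return the current weather for a city."""
    return f"It is sunny in {city}."

llm = ChatMistralAI(model="mistral-large-latest")  # illustrative model name
llm_with_tools = llm.bind_tools([get_weather])

msg = llm_with_tools.invoke("What is the weather in Paris?")
print(msg.tool_calls)  # tool calls requested by the model, if any
```

Besides decorated tools, this kind of binding typically also accepts plain functions, Pydantic classes, or raw schema dicts, since everything is converted to the OpenAI tool-calling format.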
Model wrapper that returns outputs formatted to match the given schema.
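A minimal sketch with a Pydantic schema, under the same assumptions as above; the schema and model name are illustrative.

```python
from pydantic import BaseModel, Field
from langchain_mistralai import ChatMistralAI

class Joke(BaseModel):
    """A joke with its setup and punchline."""
    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline of the joke")

llm = ChatMistralAI(model="mistral-large-latest")  # illustrative model name
structured_llm = llm.with_structured_output(Joke)

joke = structured_llm.invoke("Tell me a joke about cats")
print(joke.setup)      # the result is a Joke instance, not a raw message
print(joke.punchline)
```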
Return whether this model can be serialized by LangChain.
Get the namespace of the LangChain object.
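Both are classmethods inherited from LangChain's serialization machinery; a small hedged example follows (the exact namespace value is not asserted here).

```python
from langchain_mistralai import ChatMistralAI

print(ChatMistralAI.is_lc_serializable())  # whether the class opts into LangChain serialization
print(ChatMistralAI.get_lc_namespace())    # list of path segments identifying the object
```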
In short, ChatMistralAI is a chat model that uses the Mistral AI API.
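A minimal end-to-end sketch, assuming langchain-mistralai is installed and MISTRAL_API_KEY is set in the environment; the model name and prompt are illustrative.

```python
from langchain_core.messages import HumanMessage, SystemMessage
from langchain_mistralai import ChatMistralAI

llm = ChatMistralAI(
    model="mistral-large-latest",  # illustrative model name
    temperature=0.2,
    top_p=0.9,
)

messages = [
    SystemMessage(content="You are a concise assistant."),
    HumanMessage(content="Explain nucleus sampling in one sentence."),
]
response = llm.invoke(messages)
print(response.content)
```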