A container for results of an LLM call.
Both chat models and LLMs generate an LLMResult object. This object contains the
generated outputs and any additional information that the model provider wants to
return.
LLMResult()

generations: Generated outputs.
The first dimension of the list represents completions for different input prompts.
The second dimension of the list represents different candidate generations for a given prompt.
When returned from an LLM, the type is list[list[Generation]]; when returned from a chat model, it is list[list[ChatGeneration]]. ChatGeneration is a subclass of Generation that has a field for a structured chat message.
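To make the indexing concrete, here is a minimal sketch that builds an LLMResult by hand; the prompt and candidate texts are purely illustrative.

```python
from langchain_core.outputs import Generation, LLMResult

# Outer list: one entry per input prompt.
# Inner list: candidate generations for that prompt.
result = LLMResult(
    generations=[
        [Generation(text="Paris"), Generation(text="The capital of France is Paris.")],  # prompt 0
        [Generation(text="4")],  # prompt 1
    ]
)

print(result.generations[0][1].text)  # second candidate for the first prompt
print(len(result.generations))        # number of input prompts -> 2
```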
llm_output: For arbitrary, LLM-provider-specific output.
This is a free-form dictionary that can contain any information the provider wants to return. It is not standardized and varies by provider.
Users should generally avoid relying on this field and should instead access the relevant information through the standardized fields on AIMessage.
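A short sketch of reading llm_output follows; the keys shown ("token_usage", "model_name") are only examples of what a provider might return and are not guaranteed by any provider.

```python
from langchain_core.outputs import Generation, LLMResult

# The keys below are illustrative only; llm_output is free-form and
# provider-specific, so the real keys depend on the provider.
result = LLMResult(
    generations=[[Generation(text="Hello!")]],
    llm_output={"token_usage": {"total_tokens": 12}, "model_name": "example-model"},
)

usage = (result.llm_output or {}).get("token_usage", {})
print(usage.get("total_tokens"))  # -> 12
```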
run: List of metadata info for the model call, one entry per input.
See langchain_core.outputs.run_info.RunInfo for details.
type: Used exclusively for serialization purposes.
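The sketch below attaches one RunInfo per input and serializes the result to show the type discriminator; it assumes a pydantic v2 based langchain_core, where model_dump() is available.

```python
from uuid import uuid4

from langchain_core.outputs import Generation, LLMResult, RunInfo

result = LLMResult(
    generations=[[Generation(text="Hi")]],
    run=[RunInfo(run_id=uuid4())],  # one RunInfo per input prompt
)

print(result.run[0].run_id)         # UUID identifying the traced run
print(result.model_dump()["type"])  # -> "LLMResult", the serialization discriminator
```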