A single text generation output.
Generation represents the response from an "old-fashioned" LLM (string-in, string-out) that generates regular text (not chat messages).
This model is used internally by chat models and will eventually be mapped to the more
general LLMResult object, and then projected into an AIMessage object.
LangChain users working with chat models will usually access information via
AIMessage (returned from runnable interfaces) or LLMResult (available via
callbacks). Please refer to AIMessage and LLMResult for more information.
Generated text output.
Raw response from the provider.
May include things like the reason for finishing or token log probabilities.
The type field is used exclusively for serialization purposes and is set to 'Generation' for this class.
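The fields described above can be sketched as a plain dataclass. This is only an illustration of the shape: the real `Generation` class in `langchain_core.outputs` is a pydantic model with additional machinery, and the example values below are invented.

```python
from dataclasses import dataclass
from typing import Any, Optional

# Illustrative sketch of the Generation shape described above.
# The real class is a pydantic model in langchain_core.outputs,
# not a dataclass; field names follow the attribute docs.
@dataclass
class Generation:
    # Generated text output.
    text: str
    # Raw response from the provider, e.g. finish reason or
    # token log probabilities. Optional and provider-specific.
    generation_info: Optional[dict[str, Any]] = None
    # Used exclusively for serialization purposes.
    type: str = "Generation"


# Hypothetical usage: wrap a completion plus provider metadata.
gen = Generation(
    text="Paris is the capital of France.",
    generation_info={"finish_reason": "stop"},
)
```

In the real library, objects like this are collected into an `LLMResult` (one `Generation` per candidate completion) before being surfaced to callbacks.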