Callback Handler that tracks AIMessage.usage_metadata.
Base callback handler that can be used to handle callbacks from LangChain.
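As an illustration of how such a handler fits together, below is a minimal sketch of a custom handler built on BaseCallbackHandler that accumulates token counts from AIMessage.usage_metadata inside on_llm_end. The class name SimpleUsageTracker is made up for this example; recent versions of langchain_core also ship a ready-made UsageMetadataCallbackHandler for this purpose.

```python
from langchain_core.callbacks import BaseCallbackHandler
from langchain_core.messages import AIMessage
from langchain_core.outputs import ChatGeneration, LLMResult


class SimpleUsageTracker(BaseCallbackHandler):
    """Accumulate token counts from AIMessage.usage_metadata."""

    def __init__(self) -> None:
        self.input_tokens = 0
        self.output_tokens = 0

    def on_llm_end(self, response: LLMResult, **kwargs) -> None:
        # response.generations is a list of lists: one inner list per input prompt.
        for prompt_generations in response.generations:
            for generation in prompt_generations:
                # Only chat models produce ChatGeneration objects with a message.
                if not isinstance(generation, ChatGeneration):
                    continue
                message = generation.message
                if isinstance(message, AIMessage) and message.usage_metadata:
                    self.input_tokens += message.usage_metadata["input_tokens"]
                    self.output_tokens += message.usage_metadata["output_tokens"]
```

The tracker would be attached per call, e.g. `model.invoke(prompt, config={"callbacks": [tracker]})`, after which its counters hold the totals for that invocation.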
Message from an AI.
An AIMessage is returned from a chat model as a response to a prompt.
This message represents the output of the model and consists of both the raw output as returned by the model and standardized fields (e.g., tool calls, usage metadata) added by the LangChain framework.
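For example, the standardized fields can be read directly off the returned message. The provider class and model name below are assumptions for illustration, not something stated above; substitute your own integration.

```python
# Sketch of inspecting an AIMessage returned by a chat model.
from langchain_openai import ChatOpenAI  # assumed provider integration

model = ChatOpenAI(model="gpt-4o-mini")
ai_message = model.invoke("What is the capital of France?")

print(ai_message.content)         # raw text output as returned by the model
print(ai_message.tool_calls)      # standardized tool calls, if any ([] here)
print(ai_message.usage_metadata)  # standardized token counts, if reported
```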
Usage metadata for a message, such as token counts.
This is a standard representation of token usage that is consistent across models.
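As a sketch of that representation (assuming the current langchain_core definition), UsageMetadata is a TypedDict whose core fields are input_tokens, output_tokens, and total_tokens; providers may additionally report optional token detail breakdowns.

```python
# UsageMetadata is a TypedDict, so it behaves like a plain dict.
from langchain_core.messages.ai import UsageMetadata

usage: UsageMetadata = {
    "input_tokens": 10,
    "output_tokens": 25,
    "total_tokens": 35,
}
print(usage["total_tokens"])  # 35
```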
A single chat generation output.
A subclass of Generation that represents the response from a chat model, which
produces chat messages rather than plain strings.
The message attribute is a structured representation of the chat message; most of
the time, the message will be of type AIMessage.
Users working with chat models will usually access information via either
AIMessage (returned from runnable interfaces) or LLMResult (available via
callbacks).
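A small sketch of the message attribute in isolation, constructing a ChatGeneration directly rather than obtaining one from a model:

```python
# The message attribute of a ChatGeneration is the chat message itself,
# typically an AIMessage.
from langchain_core.messages import AIMessage
from langchain_core.outputs import ChatGeneration

generation = ChatGeneration(message=AIMessage(content="Hello!"))
print(generation.message.content)  # "Hello!"
print(generation.text)             # also "Hello!", derived from the message
```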
A container for results of an LLM call.
Both chat models and LLMs generate an LLMResult object. This object contains the
generated outputs and any additional information that the model provider wants to
return.
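As a sketch, an LLM's generate() call returns an LLMResult directly, and callbacks receive the same object via on_llm_end. The provider class and model name below are assumptions for illustration.

```python
from langchain_openai import OpenAI  # assumed provider integration

llm = OpenAI(model="gpt-3.5-turbo-instruct")
result = llm.generate(["Tell me a joke", "Tell me a fact"])

# generations is a list of lists: one inner list of Generation objects
# per input prompt.
for prompt_generations in result.generations:
    for generation in prompt_generations:
        print(generation.text)

# Provider-specific extras (e.g. aggregate token usage); may be None.
print(result.llm_output)
```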