Message from an AI.
An AIMessage is returned from a chat model as a response to a prompt. It represents the model's output and consists of both the raw output as returned by the model and standardized fields (e.g., tool calls, usage metadata) added by the LangChain framework.
AIMessage(
    content: str | list[str | dict] | None = None,
    content_blocks: list[types.ContentBlock] | None = None,
    **kwargs: Any,
)
tool_calls: If present, tool calls associated with the message.
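For illustration, each entry follows `langchain_core`'s ToolCall shape (a TypedDict, so an ordinary dict at runtime); the tool name and arguments below are made up:

```python
# A single parsed tool call as standardized by LangChain. At runtime a
# ToolCall is a plain dict with these keys.
tool_call = {
    "name": "get_weather",       # name of the tool the model wants to call
    "args": {"city": "Paris"},   # arguments, already parsed from the raw output
    "id": "call_abc123",         # provider-assigned identifier for this call
    "type": "tool_call",         # discriminator used in serialization
}

assert tool_call["name"] == "get_weather"
```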
invalid_tool_calls: If present, tool calls with parsing errors associated with the message.
usage_metadata: If present, usage metadata for the message, such as token counts. This is a standard representation of token usage that is consistent across models.
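As a sketch of that standard representation (field names follow `langchain_core`'s UsageMetadata TypedDict; the counts are invented):

```python
# Usage metadata as a plain dict; the same keys apply regardless of
# which model provider produced the message.
usage_metadata = {
    "input_tokens": 11,    # tokens consumed by the prompt
    "output_tokens": 7,    # tokens generated by the model
    "total_tokens": 18,    # input + output
}

# The invariant the standard representation maintains:
assert usage_metadata["total_tokens"] == (
    usage_metadata["input_tokens"] + usage_metadata["output_tokens"]
)
```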
type: The type of the message (used for deserialization).
Attributes to be serialized, including all attributes, even if they are derived from other initialization arguments.
content_blocks: Return standard, typed ContentBlock dicts from the message. If the message has a known model provider, the provider-specific translator is used first before falling back to best-effort parsing. For details, see the corresponding property on BaseMessage.
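For illustration, standardized content blocks are plain dicts discriminated by a "type" key; the text below is invented, and only the simple text-block shape is shown:

```python
# Two standardized text content blocks, as a model might stream them.
blocks = [
    {"type": "text", "text": "The capital of France "},
    {"type": "text", "text": "is Paris."},
]

# A common consumption pattern: join all text blocks into one string,
# ignoring any non-text block types.
text = "".join(b["text"] for b in blocks if b["type"] == "text")
assert text == "The capital of France is Paris."
```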
content: The contents of the message.
Currently inherited from BaseMessage, but not used.
is_lc_serializable(): Return True, as this class is serializable.
get_lc_namespace(): Get the namespace of the LangChain object.
lc_id(): Return a unique identifier for this class, for serialization purposes.
to_json(): Serialize the message to a JSON-serializable format.
to_json_not_implemented(): Serialize a "not implemented" object.