Python › langchain-core › agents › AgentActionMessageLog › message_log
Attribute · Since v0.1

message_log

Similar to log, this can be used to pass along extra information about exactly which messages were predicted by the LLM before parsing out the (tool, tool_input).

This is again useful if (tool, tool_input) cannot be used to fully recreate the LLM prediction, and you need that LLM prediction for future agent iterations.

Compared to log, this is useful when the underlying LLM is a chat model (and therefore returns messages rather than a string).

    message_log: Sequence[BaseMessage]
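For illustration, here is a minimal sketch of building an AgentActionMessageLog so that message_log preserves the exact chat messages behind the parsed action. The tool name, input, and message content are hypothetical; in practice an agent's output parser populates these fields from the model response.

    # Minimal sketch: constructing an AgentActionMessageLog by hand.
    # The tool name, input, and message content are made up for illustration.
    from langchain_core.agents import AgentActionMessageLog
    from langchain_core.messages import AIMessage

    # Hypothetical chat-model output that encodes a tool call.
    ai_message = AIMessage(
        content='Thought: I should look this up.\n'
                'Action: search\n'
                'Action Input: {"query": "weather in SF"}'
    )

    action = AgentActionMessageLog(
        tool="search",
        tool_input={"query": "weather in SF"},
        log=ai_message.content,    # string form, as on a plain-LLM agent
        message_log=[ai_message],  # the original chat message(s), kept verbatim
    )

    # A later agent iteration can replay the exact prediction instead of
    # reconstructing it from (tool, tool_input):
    for message in action.message_log:
        print(type(message).__name__, repr(message.content))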