    langchain_core.tracers.base

    Module · Since v0.1

    Base interfaces for tracing runs.

    Attributes

    attribute
    Run: RunTree

    attribute
    logger

    Classes

    class
    AsyncCallbackHandler

    Base async callback handler.

    class
    BaseCallbackHandler

    Base callback handler.
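
    These handlers expose hooks such as on_llm_start and on_llm_end that fire as a run progresses. As a rough illustration, the sketch below subclasses BaseCallbackHandler and overrides two of those hooks; the class name and the printed output are invented for the example.

        from typing import Any

        from langchain_core.callbacks import BaseCallbackHandler
        from langchain_core.outputs import LLMResult

        class LoggingHandler(BaseCallbackHandler):
            """Hypothetical handler that logs the start and end of each LLM call."""

            def on_llm_start(
                self, serialized: dict[str, Any], prompts: list[str], **kwargs: Any
            ) -> None:
                # Fires before the model is invoked; prompts holds the raw prompt strings.
                print(f"LLM starting with {len(prompts)} prompt(s)")

            def on_llm_end(self, response: LLMResult, **kwargs: Any) -> None:
                # Fires when the model returns; response is an LLMResult (see below).
                print(f"LLM finished with {len(response.generations)} generation(s)")

    A handler like this is passed alongside a call, for example via the callbacks entry of a run config; AsyncCallbackHandler works the same way with async def hooks.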

    class
    TracerException

    Base class for exceptions in tracers module.

    class
    Document

    Class for storing a piece of text and associated metadata.

    Note

    Document is for retrieval workflows, not chat I/O. For sending text to an LLM in a conversation, use message types from langchain.messages.
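
    For instance, a retrieved chunk could be represented like this (the content and metadata values are illustrative):

        from langchain_core.documents import Document

        doc = Document(
            page_content="LangChain exposes tracing hooks for observability.",
            metadata={"source": "docs/tracing.md", "page": 1},
        )
        print(doc.page_content)
        print(doc.metadata["source"])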

    class
    BaseMessage

    Base abstract message class.

    Messages are the inputs and outputs of a chat model.

    Examples include HumanMessage, AIMessage, and SystemMessage.
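
    A minimal chat exchange built from those concrete message types might look like this (imported here from langchain_core.messages; the content strings are made up):

        from langchain_core.messages import AIMessage, HumanMessage, SystemMessage

        messages = [
            SystemMessage(content="You are a terse assistant."),
            HumanMessage(content="What does a tracer do?"),
            AIMessage(content="It records the runs a chain or model produces."),
        ]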

    class
    ChatGenerationChunk

    ChatGeneration chunk.

    ChatGeneration chunks can be concatenated with other ChatGeneration chunks.

    class
    GenerationChunk

    Generation chunk, which can be concatenated with other Generation chunks.
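
    A sketch of that concatenation behavior for both chunk types (the strings are arbitrary):

        from langchain_core.messages import AIMessageChunk
        from langchain_core.outputs import ChatGenerationChunk, GenerationChunk

        # Plain text generations accumulate by concatenating their text.
        text_chunk = GenerationChunk(text="Hel") + GenerationChunk(text="lo")
        assert text_chunk.text == "Hello"

        # Chat generations accumulate by merging their message chunks.
        chat_chunk = ChatGenerationChunk(
            message=AIMessageChunk(content="Hel")
        ) + ChatGenerationChunk(message=AIMessageChunk(content="lo"))
        assert chat_chunk.message.content == "Hello"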

    class
    LLMResult

    A container for results of an LLM call.

    Both chat models and LLMs generate an LLMResult object. This object contains the generated outputs and any additional information that the model provider wants to return.
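
    Reading such a result looks roughly like this (the generated text and token counts are invented):

        from langchain_core.outputs import Generation, LLMResult

        result = LLMResult(
            generations=[[Generation(text="Tracing captures each run.")]],
            llm_output={"token_usage": {"total_tokens": 12}},
        )

        # One inner list per input prompt; each holds that prompt's candidate generations.
        print(result.generations[0][0].text)
        print(result.llm_output["token_usage"]["total_tokens"])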

    class
    BaseTracer

    Base interface for tracers.

    class
    AsyncBaseTracer

    Async base interface for tracers.
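
    Concrete tracers subclass one of these and persist runs as they complete. The sketch below is a minimal synchronous example that overrides the required _persist_run hook; the class name and printed output are invented, and an async tracer would subclass AsyncBaseTracer analogously.

        from langchain_core.tracers.base import BaseTracer
        from langchain_core.tracers.schemas import Run

        class PrintTracer(BaseTracer):
            """Hypothetical tracer that prints a line for every completed root run."""

            def _persist_run(self, run: Run) -> None:
                # Called once a root run (and its child runs) has finished.
                print(f"{run.name} ({run.run_type}) finished")

    An instance can then be supplied like any other callback handler when invoking a runnable.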
