    langchain_core.tracers.langchain
    Class · Since v0.1

    LangChainTracer

    Implementation of the SharedTracer that POSTs runs to the LangSmith endpoint.

    LangChainTracer(
      example_id: UUID | str | None = None,
      project_name: str | None = None,
      client: Client | None = None,
      tags: list[str] | None = None,
      **kwargs: Any,
    )

    Bases

    BaseTracer

    Used in Docs

    • LangSmith Observability

    Parameters

    example_id: UUID | str | None
      Default: None
      The example ID.

    project_name: str | None
      Default: None
      The project name. Defaults to the tracer project.

    client: Client | None
      Default: None
      The client. Defaults to the global client.

    tags: list[str] | None
      Default: None
      The tags. Defaults to an empty list.

    **kwargs: Any
      Additional keyword arguments.
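
    Example

    A minimal usage sketch (not part of the generated reference): construct the
    tracer, optionally with an explicit LangSmith client, project name, and tags,
    and pass it as a callback on any runnable. The project name and tags below are
    placeholders, and LangSmith credentials (e.g. LANGSMITH_API_KEY) are assumed
    to be set in the environment.

    from langsmith import Client
    from langchain_core.runnables import RunnableLambda
    from langchain_core.tracers.langchain import LangChainTracer

    # An explicit client is optional; by default the global client is used.
    client = Client()

    tracer = LangChainTracer(
        project_name="my-project",  # placeholder; defaults to the tracer project
        client=client,
        tags=["experiment"],        # placeholder tags recorded on the traced runs
    )

    # Any runnable works here; the run is traced because the tracer is passed
    # as a callback in the invocation config.
    chain = RunnableLambda(lambda question: f"echo: {question}")
    chain.invoke("What does LangChainTracer do?", config={"callbacks": [tracer]})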

    Constructors

    __init__
      example_id: UUID | str | None
      project_name: str | None
      client: Client | None
      tags: list[str] | None

    Attributes

    run_inline: bool
    example_id
    project_name
    client
    tags
    latest_run: Run | None
    run_has_token_event_map: dict[str, bool]

    Methods

    on_chat_model_start
      Start a trace for an LLM run.

    get_run_url
      Get the LangSmith root run URL.

    wait_for_futures
      Wait for the given futures to complete.
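
    A small sketch of get_run_url and wait_for_futures (plus the latest_run
    attribute listed above), under the same assumptions as the example in the
    Parameters section: LangSmith credentials are configured in the environment
    and the project name is a placeholder.

    from langchain_core.runnables import RunnableLambda
    from langchain_core.tracers.langchain import LangChainTracer

    tracer = LangChainTracer(project_name="my-project")

    # Produce at least one traced run so there is something to inspect.
    RunnableLambda(lambda x: x.upper()).invoke("hello", config={"callbacks": [tracer]})

    # Block until queued trace submissions have been handed off to LangSmith.
    tracer.wait_for_futures()

    # URL of the most recent root run in LangSmith.
    print(tracer.get_run_url())

    # latest_run holds the most recent root Run object, if any.
    if tracer.latest_run is not None:
        print(tracer.latest_run.id)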

    Inherited from BaseTracer

    Methods

    on_llm_start
      Start a trace for a (non-chat model) LLM run.

    on_llm_new_token
      Run on new output token.

    on_retry
      Run on retry.

    on_llm_end
      End a trace for a model run.

    on_llm_error
      Handle an error for an LLM run.

    on_chain_start
      Start a trace for a chain run.

    on_chain_end
      End a trace for a chain run.

    on_chain_error
      Handle an error for a chain run.

    on_tool_start
      Start a trace for a tool run.

    on_tool_end
      End a trace for a tool run.

    on_tool_error
      Run when tool errors.

    on_retriever_start
      Run when Retriever starts running.

    on_retriever_error
      Run when Retriever errors.

    on_retriever_end
      Run when Retriever ends running.

    Inherited from BaseCallbackHandler

    Attributes

    raise_error: bool
      Whether to raise an error if an exception occurs.

    ignore_llm: bool
      Whether to ignore LLM callbacks.

    ignore_retry: bool
      Whether to ignore retry callbacks.

    ignore_chain: bool
      Whether to ignore chain callbacks.

    ignore_agent: bool
      Whether to ignore agent callbacks.

    ignore_retriever: bool
      Whether to ignore retriever callbacks.

    ignore_chat_model: bool
      Whether to ignore chat model callbacks.

    ignore_custom_event: bool
      Whether to ignore custom event callbacks.

    Inherited from LLMManagerMixin

    Methods

    on_llm_new_token
      Run on new output token.

    on_llm_end
      End a trace for a model run.

    on_llm_error
      Handle an error for an LLM run.

    Inherited from ChainManagerMixin

    Methods

    on_chain_end
      End a trace for a chain run.

    on_chain_error
      Handle an error for a chain run.

    on_agent_action
      Run on agent action.

    on_agent_finish
      Run on agent finish.

    Inherited from ToolManagerMixin

    Methods

    on_tool_end
      End a trace for a tool run.

    on_tool_error
      Run when tool errors.

    Inherited from RetrieverManagerMixin

    Methods

    on_retriever_error
      Run when Retriever errors.

    on_retriever_end
      Run when Retriever ends running.

    Inherited from CallbackManagerMixin

    Methods

    on_llm_start
      Start a trace for a (non-chat model) LLM run.

    on_retriever_start
      Run when Retriever starts running.

    on_chain_start
      Start a trace for a chain run.

    on_tool_start
      Start a trace for a tool run.

    Inherited from RunManagerMixin

    Methods

    on_text
      Run on arbitrary text.

    on_retry
      Run on retry.

    on_custom_event
      Handle a custom event.
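
    As a hedged illustration of on_custom_event (not from the reference itself):
    events emitted inside a run with langchain_core's dispatch_custom_event helper
    are delivered to every handler attached to that run, including this tracer,
    and so appear on the traced run in LangSmith. The event name, payload, and
    project name below are arbitrary placeholders, and a langchain-core version
    that provides dispatch_custom_event is assumed.

    from langchain_core.callbacks.manager import dispatch_custom_event
    from langchain_core.runnables import RunnableLambda
    from langchain_core.tracers.langchain import LangChainTracer

    def summarize(text: str) -> str:
        # Delivered to on_custom_event of every handler attached to this run.
        dispatch_custom_event("input_stats", {"length": len(text)})
        return text[:20]

    tracer = LangChainTracer(project_name="my-project")  # placeholder project name
    RunnableLambda(summarize).invoke(
        "A long document to summarize...",
        config={"callbacks": [tracer]},
    )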
