JavaScript · @langchain/core › callbacks › base

Class · Since v1.0

BaseCallbackHandler

Abstract base class for creating callback handlers in the LangChain framework. It provides a set of optional methods that can be overridden in derived classes to handle various events during the execution of a LangChain application.

class BaseCallbackHandler

Bases

BaseCallbackHandlerMethodsClass
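A minimal sketch of the subclassing pattern this class enables: override only the events you care about and ignore the rest. A local stand-in base class is defined here so the snippet is self-contained; in a real project you would instead extend `BaseCallbackHandler` from `@langchain/core/callbacks/base`.

```typescript
// Stand-in for the real abstract base: every handler method is optional,
// mirroring how BaseCallbackHandler lets subclasses override only what they need.
abstract class CallbackHandlerSketch {
  abstract name: string;
  handleLLMStart?(prompts: string[], runId?: string): void | Promise<void>;
  handleLLMNewToken?(token: string, runId?: string): void | Promise<void>;
  handleLLMEnd?(output: unknown, runId?: string): void | Promise<void>;
}

// A concrete handler that only cares about streamed tokens.
class TokenLogger extends CallbackHandlerSketch {
  name = "token-logger";
  tokens: string[] = [];
  handleLLMNewToken(token: string): void {
    this.tokens.push(token);
  }
}

const logger = new TokenLogger();
for (const t of ["Hel", "lo"]) logger.handleLLMNewToken(t);
console.log(logger.tokens.join("")); // "Hello"
```

Because unimplemented events are simply absent, the framework can probe for each method before calling it, and a handler stays small even as the event surface grows.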

Constructors

  • constructor

Properties

  • awaitHandlers: boolean
  • ignoreAgent: boolean
  • ignoreChain: boolean
  • ignoreCustomEvent: boolean
  • ignoreLLM: boolean
  • ignoreRetriever: boolean
  • lc_kwargs: SerializedFields
  • lc_serializable: boolean
  • name: string
  • raiseError: boolean
  • lc_aliases: __type | undefined
  • lc_attributes: __type | undefined
  • lc_id: string[]
  • lc_namespace: ["langchain_core", "callbacks", string]

    A path to the module that contains the class, e.g. ["langchain", "llms"]. Usually the same as the entrypoint the class is exported from.

  • lc_secrets: __type | undefined
  • lc_serializable_keys: string[] | undefined
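The `ignore*` flags let a handler opt out of whole event categories, and `raiseError` controls whether a handler's own exceptions propagate to the caller. A hypothetical dispatcher illustrating how such flags are typically consulted (the interface and function names here are illustrative, not the library's internal code):

```typescript
// Illustrative handler shape carrying the flags described above.
interface FlaggedHandler {
  name: string;
  ignoreLLM: boolean;
  raiseError: boolean;
  handleLLMStart(): void;
}

// Hypothetical dispatcher: skip handlers that set ignoreLLM, and rethrow a
// handler's error only when that handler set raiseError.
function dispatchLLMStart(handlers: FlaggedHandler[]): string[] {
  const invoked: string[] = [];
  for (const h of handlers) {
    if (h.ignoreLLM) continue; // handler opted out of LLM events
    try {
      h.handleLLMStart();
      invoked.push(h.name);
    } catch (err) {
      if (h.raiseError) throw err; // otherwise swallow, so one bad handler
      //                              does not break the run
    }
  }
  return invoked;
}

const ran = dispatchLLMStart([
  { name: "a", ignoreLLM: false, raiseError: false, handleLLMStart: () => {} },
  { name: "b", ignoreLLM: true, raiseError: false, handleLLMStart: () => {} },
  {
    name: "c",
    ignoreLLM: false,
    raiseError: false,
    handleLLMStart: () => {
      throw new Error("boom");
    },
  },
]);
console.log(ran); // ["a"] — "b" opted out, "c" threw and was swallowed
```

The default of swallowing handler errors keeps observability code (logging, tracing) from interfering with the application; `raiseError` is the escape hatch for handlers whose failure should abort the run.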

Methods

  • copy → BaseCallbackHandler
  • handleAgentAction → void | Promise<void>

    Called when an agent is about to execute an action, with the action and the run ID.

  • handleAgentEnd → void | Promise<void>

    Called when an agent finishes execution, with the final output and the run ID.

  • handleChainEnd → any

    Called at the end of a Chain run, with the outputs and the run ID.

  • handleChainError → any

    Called if a Chain run encounters an error.

  • handleChainStart → any

    Called at the start of a Chain run, with the chain name, inputs, and the run ID.

  • handleChatModelStart → any

    Called at the start of a Chat Model run, with the prompt(s) and the run ID.

  • handleCustomEvent → any
  • handleLLMEnd → any

    Called at the end of an LLM/ChatModel run, with the output and the run ID.

  • handleLLMError → any

    Called if an LLM/ChatModel run encounters an error.

  • handleLLMNewToken → any

    Called when an LLM/ChatModel in streaming mode produces a new token.

  • handleLLMStart → any

    Called at the start of an LLM or Chat Model run, with the prompt(s) and the run ID.

  • handleRetrieverEnd → any
  • handleRetrieverError → any
  • handleRetrieverStart → any
  • handleText → void | Promise<void>
  • handleToolEnd → any

    Called at the end of a Tool run, with the tool output and the run ID.

  • handleToolError → any

    Called if a Tool run encounters an error.

  • handleToolStart → any

    Called at the start of a Tool run, with the tool name, input, and the run ID.

  • toJSON → Serialized
  • toJSONNotImplemented → SerializedNotImplemented
  • fromMethods → Handler
  • lc_name → string

    The name of the serializable. Override to provide an alias or to preserve the serialized module name in minified environments.

    Implemented as a static method to support loading logic.
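`fromMethods` builds a handler from a plain object of callback functions, avoiding a subclass declaration for one-off handlers. A self-contained sketch of that factory pattern (`fromMethodsSketch` is an illustrative stand-in, not the real `BaseCallbackHandler.fromMethods`):

```typescript
// Subset of the callback surface used in this sketch.
interface HandlerMethods {
  handleText?: (text: string) => void;
  handleToolEnd?: (output: string) => void;
}

// Stand-in factory: attach a name and return the methods object as a handler,
// so only the callbacks actually provided are defined on the result.
function fromMethodsSketch(methods: HandlerMethods): HandlerMethods & { name: string } {
  return { name: "anonymous-handler", ...methods };
}

const seen: string[] = [];
const handler = fromMethodsSketch({ handleText: (t) => seen.push(t) });

handler.handleText?.("intermediate step"); // provided, so it runs
handler.handleToolEnd?.("ignored");        // not provided; optional call is a no-op
console.log(seen); // ["intermediate step"]
```

The optional-call operator (`?.`) mirrors how dispatch works against such a handler: events with no registered callback fall through silently.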
