Python · langgraph · pregel_messages · StreamMessagesHandler

Class · Since v0.6

StreamMessagesHandler

A callback handler that implements stream_mode=messages.

Collects messages from (1) chat model stream events and (2) node outputs.

StreamMessagesHandler(
    self,
    stream: Callable[[StreamChunk], None],
    subgraphs: bool,
    *,
    parent_ns: tuple[str, ...] | None = None,
)

Bases

BaseCallbackHandler, _StreamingCallbackHandler

Inherited from BaseCallbackHandler (langchain_core)

Attributes: raise_error, ignore_llm, ignore_retry, ignore_chain, ignore_agent, ignore_retriever, ignore_chat_model, ignore_custom_event

Inherited from ChainManagerMixin (langchain_core)

Methods: on_agent_action, on_agent_finish

Inherited from ToolManagerMixin (langchain_core)

Methods: on_tool_end, on_tool_error

Inherited from RetrieverManagerMixin (langchain_core)

Methods: on_retriever_error, on_retriever_end

Inherited from CallbackManagerMixin (langchain_core)

Methods: on_llm_start, on_retriever_start, on_tool_start

Inherited from RunManagerMixin (langchain_core)

Methods: on_text, on_retry, on_custom_event

Parameters (parameters marked * are required)

stream* (Callable[[StreamChunk], None]): A callable that takes a StreamChunk and emits it.

subgraphs* (bool): Whether to emit messages from subgraphs.

parent_ns (tuple[str, ...] | None, default None): The namespace where the handler was created. We keep track of this namespace to allow calls to subgraphs that were explicitly requested as a stream with messages mode configured.

Constructor

__init__

stream: Callable[[StreamChunk], None]
subgraphs: bool
parent_ns: tuple[str, ...] | None
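
The handler is normally constructed internally by Pregel when a run requests stream_mode=messages, but the constructor can also be exercised directly. Below is a minimal sketch that hands it a plain callable and collects whatever chunks it emits; the import path and the treatment of StreamChunk as an opaque value are assumptions, not part of this reference.

# Minimal sketch: the import path is an assumption, and StreamChunk is treated
# as an opaque value that is simply collected into a list.
from langgraph.pregel._messages import StreamMessagesHandler  # assumed module path

chunks = []  # will receive each StreamChunk the handler emits

handler = StreamMessagesHandler(
    stream=chunks.append,  # any Callable[[StreamChunk], None] works
    subgraphs=False,       # do not surface messages from subgraph runs
    parent_ns=None,        # default: no parent namespace
)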
Attributes

run_inline: bool
    We want this callback to run in the main thread to avoid order/locking issues.

stream: Callable[[StreamChunk], None]

subgraphs: bool

metadata: dict[UUID, Meta]

seen: set[int | str]

parent_ns: tuple[str, ...] | None
Methods

tap_output_aiter
tap_output_iter
on_chat_model_start
on_llm_new_token
on_llm_end
on_llm_error
on_chain_start
on_chain_end
on_chain_error

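For context, here is a hedged sketch of the user-facing behavior this handler powers: streaming a compiled graph with stream_mode=messages yields (message, metadata) pairs. The toy graph below is illustrative only; it relies on the documented fact that messages found in node outputs are emitted, so no chat model is required.

from typing import Annotated, TypedDict

from langchain_core.messages import AIMessage, AnyMessage
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages


class State(TypedDict):
    messages: Annotated[list[AnyMessage], add_messages]


def respond(state: State) -> dict:
    # A message returned from a node is one of the two sources the handler collects.
    return {"messages": [AIMessage(content="hello from the node")]}


builder = StateGraph(State)
builder.add_node("respond", respond)
builder.add_edge(START, "respond")
builder.add_edge("respond", END)
graph = builder.compile()

# Each item is a (message, metadata) pair; metadata includes the emitting node
# under the "langgraph_node" key.
for message, metadata in graph.stream({"messages": []}, stream_mode="messages"):
    print(metadata.get("langgraph_node"), getattr(message, "content", message))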