# @langchain/react

React SDK for building AI-powered applications with LangChain and LangGraph. Provides a `useStream` hook that manages streaming, state, branching, and interrupts out of the box.

## Installation

```bash
npm install @langchain/react @langchain/core
```

Peer dependencies: `react` (^18 || ^19), `react-dom` (^18 || ^19), `@langchain/core` (^1.0.1).
```tsx
import { useStream } from "@langchain/react";

function Chat() {
  const { messages, submit, isLoading } = useStream({
    assistantId: "agent",
    apiUrl: "http://localhost:2024",
  });

  return (
    <div>
      {messages.map((msg, i) => (
        <div key={msg.id ?? i}>{msg.content}</div>
      ))}
      <button
        disabled={isLoading}
        onClick={() =>
          void submit({
            messages: [{ type: "human", content: "Hello!" }],
          })
        }
      >
        Send
      </button>
    </div>
  );
}
```
## useStream Options

| Option | Type | Description |
|---|---|---|
| `assistantId` | `string` | **Required.** The assistant/graph ID to stream from. |
| `apiUrl` | `string` | Base URL of the LangGraph API. |
| `client` | `Client` | Pre-configured `Client` instance (alternative to `apiUrl`). |
| `messagesKey` | `string` | State key containing messages. Defaults to `"messages"`. |
| `initialValues` | `StateType` | Initial state values before any stream data arrives. |
| `fetchStateHistory` | `boolean \| { limit: number }` | Fetch thread history on stream completion. Enables branching. |
| `throttle` | `boolean \| number` | Throttle state updates for performance. |
| `onFinish` | `(state, error?) => void` | Called when the stream completes. |
| `onError` | `(error, state?) => void` | Called on stream errors. |
| `onThreadId` | `(threadId) => void` | Called when a new thread is created. |
| `onUpdateEvent` | `(event) => void` | Receive update events from the stream. |
| `onCustomEvent` | `(event) => void` | Receive custom events from the stream. |
| `onStop` | `() => void` | Called when the stream is stopped by the user. |
## Return Value

| Property | Type | Description |
|---|---|---|
| `values` | `StateType` | Current graph state. |
| `messages` | `Message[]` | Messages from the current state. |
| `isLoading` | `boolean` | Whether a stream is currently active. |
| `error` | `unknown` | The most recent error, if any. |
| `interrupt` | `Interrupt \| undefined` | Current interrupt requiring user input. |
| `branch` | `string` | Active branch identifier. |
| `submit(values, options?)` | `function` | Submit new input to the graph. When called while a stream is active, the run is created on the server with `multitaskStrategy: "enqueue"` and queued automatically. |
| `stop()` | `function` | Cancel the active stream. |
| `setBranch(branch)` | `function` | Switch to a different conversation branch. |
| `getMessagesMetadata(msg, index?)` | `function` | Get branching and checkpoint metadata for a message. |
| `switchThread(id)` | `(id: string \| null) => void` | Switch to a different thread. Pass `null` to start a new thread on next submit. |
| `queue.entries` | `ReadonlyArray<QueueEntry>` | Pending server-side runs. Each entry has `id` (server run ID), `values`, `options`, and `createdAt`. |
| `queue.size` | `number` | Number of pending runs on the server. |
| `queue.cancel(id)` | `(id: string) => Promise<boolean>` | Cancel a pending run on the server by its run ID. |
| `queue.clear()` | `() => Promise<void>` | Cancel all pending runs on the server. |
## createAgent

When using `createAgent`, pass `typeof agent` to automatically infer tool call types:

```tsx
import type { agent } from "./agent";

function Chat() {
  const stream = useStream<typeof agent>({
    assistantId: "agent",
    apiUrl: "http://localhost:2024",
  });
  // stream.messages, tool calls, etc. are fully typed
}
```
## StateGraph

For custom graphs, provide your state type directly:

```tsx
import type { BaseMessage } from "langchain";

interface MyState {
  messages: BaseMessage[];
  context?: string;
}

function Chat() {
  const { messages, submit } = useStream<MyState>({
    assistantId: "my-graph",
    apiUrl: "http://localhost:2024",
  });
}
```
Pass interrupt types via the second generic parameter:

```tsx
const { interrupt, submit } = useStream<
  MyState,
  { InterruptType: { question: string } }
>({
  assistantId: "my-graph",
  apiUrl: "http://localhost:2024",
});

if (interrupt) {
  // interrupt.value is typed as { question: string }
}
```
## Interrupts

Interrupts let you pause graph execution and wait for user input:

```tsx
function Chat() {
  const { messages, interrupt, submit } = useStream<
    { messages: BaseMessage[] },
    { InterruptType: { question: string } }
  >({
    assistantId: "agent",
    apiUrl: "http://localhost:2024",
  });

  return (
    <div>
      {messages.map((msg, i) => (
        <div key={msg.id ?? i}>{msg.content}</div>
      ))}
      {interrupt && (
        <div>
          <p>{interrupt.value.question}</p>
          <button
            onClick={() =>
              void submit(null, { command: { resume: "Approved" } })
            }
          >
            Approve
          </button>
        </div>
      )}
      <button
        onClick={() =>
          void submit({
            messages: [{ type: "human", content: "Hello" }],
          })
        }
      >
        Send
      </button>
    </div>
  );
}
```
## Branching

Enable conversation branching by setting `fetchStateHistory: true`:

```tsx
function Chat() {
  const { messages, submit, getMessagesMetadata, setBranch } = useStream({
    assistantId: "agent",
    apiUrl: "http://localhost:2024",
    fetchStateHistory: true,
  });

  return (
    <div>
      {messages.map((msg, i) => {
        const metadata = getMessagesMetadata(msg, i);
        const branchOptions = metadata?.branchOptions;
        const branch = metadata?.branch;
        return (
          <div key={msg.id ?? i}>
            <p>{msg.content}</p>
            {branchOptions && branch && (
              <div>
                <button
                  onClick={() => {
                    const prev =
                      branchOptions[branchOptions.indexOf(branch) - 1];
                    if (prev) setBranch(prev);
                  }}
                >
                  Previous
                </button>
                <button
                  onClick={() => {
                    const next =
                      branchOptions[branchOptions.indexOf(branch) + 1];
                    if (next) setBranch(next);
                  }}
                >
                  Next
                </button>
              </div>
            )}
          </div>
        );
      })}
    </div>
  );
}
```
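The Previous/Next handlers above reduce to a small pure lookup. `neighborBranch` below is a hypothetical helper (not part of the SDK) that makes the logic explicit:

```typescript
// Hypothetical helper (not part of @langchain/react): given the ordered
// sibling branches for a message and the active branch, return the
// neighboring branch in the requested direction, or undefined at the edges.
function neighborBranch(
  branchOptions: readonly string[],
  current: string,
  direction: "prev" | "next"
): string | undefined {
  const i = branchOptions.indexOf(current);
  if (i === -1) return undefined; // active branch not among the options
  return branchOptions[direction === "prev" ? i - 1 : i + 1];
}
```

A Previous button would then call `setBranch` only when `neighborBranch(branchOptions, branch, "prev")` returns a value.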
## Queueing

When `submit()` is called while a stream is already active, the SDK automatically creates the run on the server with `multitaskStrategy: "enqueue"`. The pending runs are tracked in `queue` and processed in order as each finishes:

```tsx
function Chat() {
  const { messages, submit, isLoading, queue, switchThread } = useStream({
    assistantId: "agent",
    apiUrl: "http://localhost:2024",
  });

  return (
    <div>
      {messages.map((msg, i) => (
        <div key={msg.id ?? i}>{msg.content}</div>
      ))}
      {queue.size > 0 && (
        <div>
          <p>{queue.size} message(s) queued</p>
          <button onClick={() => void queue.clear()}>Clear Queue</button>
        </div>
      )}
      <button
        disabled={isLoading}
        onClick={() =>
          void submit({
            messages: [{ type: "human", content: "Hello!" }],
          })
        }
      >
        Send
      </button>
      <button onClick={() => switchThread(null)}>New Thread</button>
    </div>
  );
}
```
Switching threads via `switchThread()` cancels all pending runs and clears the queue.
## Custom Transport

Instead of connecting to a LangGraph API, you can provide your own streaming transport. Pass a `transport` object instead of `assistantId` to use a custom backend:

```tsx
import { useStream, FetchStreamTransport } from "@langchain/react";
import type { BaseMessage } from "langchain";

function Chat() {
  const {
    messages,
    submit,
    isLoading,
    branch,
    setBranch,
    getMessagesMetadata,
  } = useStream<{ messages: BaseMessage[] }>({
    transport: new FetchStreamTransport({
      url: "https://my-api.example.com/stream",
    }),
    threadId: null,
    onThreadId: (id) => console.log("Thread created:", id),
  });

  return (
    <div>
      {messages.map((msg, i) => {
        const metadata = getMessagesMetadata(msg, i);
        return (
          <div key={msg.id ?? i}>
            <p>{msg.content}</p>
            {metadata?.streamMetadata && (
              <span>Node: {metadata.streamMetadata.langgraph_node}</span>
            )}
          </div>
        );
      })}
      <p>Current branch: {branch}</p>
      <button
        disabled={isLoading}
        onClick={() =>
          void submit({
            messages: [{ type: "human", content: "Hello!" }],
          })
        }
      >
        Send
      </button>
    </div>
  );
}
```
A custom transport exposes the same properties as the standard `useStream` hook, including `getMessagesMetadata`, `branch`, `setBranch`, `switchThread`, and all message/interrupt/subagent helpers. When using a custom transport, `getMessagesMetadata` returns stream metadata sent alongside messages during streaming, and `branch`/`setBranch` provide local branch state management.
## UI Components

The `@langchain/react/react-ui` sub-package provides utilities for rendering server-defined UI components:

```tsx
import { useStreamContext, LoadExternalComponent } from "@langchain/react/react-ui";
import { uiMessageReducer } from "@langchain/react/react-ui";
import type { UIMessage } from "@langchain/react/react-ui";
```

- `useStreamContext` - Access the stream context from deeply nested components
- `LoadExternalComponent` - Render UI components defined by the server
- `uiMessageReducer` - Reducer for managing UI message state

A server-side helper is also available:

```tsx
import { typedUi } from "@langchain/react/react-ui/server";
```
For complete end-to-end examples with full agentic UIs, visit the LangGraph Playground.
## License

MIT