@langchain/react
React SDK for building AI-powered applications with Deep Agents, LangChain, and LangGraph. It provides a useStream hook that manages streaming, state, branching, and interrupts out of the box.

Installation

npm install @langchain/react @langchain/core

Peer dependencies: react (^18 || ^19), @langchain/core (^1.1.27)

Quick Start

import { useStream } from "@langchain/react";

function Chat() {
  const { messages, submit, isLoading } = useStream({
    assistantId: "agent",
    apiUrl: "http://localhost:2024",
  });

  return (
    <div>
      {messages.map((msg, i) => (
        <div key={msg.id ?? i}>{msg.content}</div>
      ))}

      <button
        disabled={isLoading}
        onClick={() =>
          void submit({
            messages: [{ type: "human", content: "Hello!" }],
          })
        }
      >
        Send
      </button>
    </div>
  );
}

useStream Options

| Option | Type | Description |
| --- | --- | --- |
| assistantId | string | **Required.** The assistant/graph ID to stream from. |
| apiUrl | string | Base URL of the LangGraph API. |
| client | Client | Pre-configured Client instance (alternative to apiUrl). |
| messagesKey | string | State key containing messages. Defaults to "messages". |
| initialValues | StateType | Initial state values before any stream data arrives. |
| fetchStateHistory | boolean \| { limit: number } | Fetch thread history on stream completion. Enables branching. |
| throttle | boolean \| number | Throttle state updates for performance. |
| onFinish | (state, error?) => void | Called when the stream completes. |
| onError | (error, state?) => void | Called on stream errors. |
| onThreadId | (threadId) => void | Called when a new thread is created. |
| onUpdateEvent | (event) => void | Receives update events from the stream. |
| onCustomEvent | (event) => void | Receives custom events from the stream. |
| onStop | () => void | Called when the stream is stopped by the user. |

Return Values

| Property | Type | Description |
| --- | --- | --- |
| values | StateType | Current graph state. |
| messages | Message[] | Messages from the current state. |
| isLoading | boolean | Whether a stream is currently active. |
| error | unknown | The most recent error, if any. |
| interrupt | Interrupt \| undefined | Current interrupt requiring user input. |
| branch | string | Active branch identifier. |
| submit(values, options?) | function | Submit new input to the graph. When called while a stream is active, the run is created on the server with multitaskStrategy: "enqueue" and queued automatically. |
| stop() | function | Cancel the active stream. |
| setBranch(branch) | function | Switch to a different conversation branch. |
| getMessagesMetadata(msg, index?) | function | Get branching and checkpoint metadata for a message. |
| switchThread(id) | (id: string \| null) => void | Switch to a different thread. Pass null to start a new thread on next submit. |
| queue.entries | ReadonlyArray<QueueEntry> | Pending server-side runs. Each entry has id (server run ID), values, options, and createdAt. |
| queue.size | number | Number of pending runs on the server. |
| queue.cancel(id) | (id: string) => Promise<boolean> | Cancel a pending run on the server by its run ID. |
| queue.clear() | () => Promise<void> | Cancel all pending runs on the server. |

useSuspenseStream

useSuspenseStream is a companion hook to useStream that integrates with React's Suspense and Error Boundary protocols. Instead of handling loading and error states inside your component, you declare them in parent boundaries:

import { Suspense } from "react";
import { ErrorBoundary } from "react-error-boundary";
import { useSuspenseStream } from "@langchain/react";

function App() {
  return (
    <ErrorBoundary
      fallback={({ error, resetErrorBoundary }) => (
        <div>
          <p>{error.message}</p>
          <button onClick={resetErrorBoundary}>Retry</button>
        </div>
      )}
    >
      <Suspense fallback={<Spinner />}>
        <Chat />
      </Suspense>
    </ErrorBoundary>
  );
}

function Chat() {
  // No isLoading/error checks needed — Suspense and ErrorBoundary handle them.
  const { messages, submit, isStreaming } = useSuspenseStream({
    assistantId: "agent",
    apiUrl: "http://localhost:2024",
  });

  return (
    <div>
      {messages.map((msg, i) => (
        <div key={msg.id ?? i}>{msg.content}</div>
      ))}
      {isStreaming && <TypingIndicator />}

      <button
        onClick={() =>
          void submit({
            messages: [{ type: "human", content: "Hello!" }],
          })
        }
      >
        Send
      </button>
    </div>
  );
}

How it works

  • Suspends while the initial thread history is loading (e.g. when a threadId is provided and the thread data is being fetched). The nearest <Suspense> boundary renders its fallback during this time.
  • Throws errors to the nearest Error Boundary when the stream encounters an error outside of active streaming.
  • Does not suspend during streaming. Streaming is incremental — messages arrive progressively and the UI must update in real time. The isStreaming flag indicates whether tokens are currently arriving.

Options

useSuspenseStream accepts the same options as useStream (LangGraph Platform mode), plus:

| Option | Type | Description |
| --- | --- | --- |
| suspenseCache | SuspenseCache | Optional cache instance for Suspense history prefetching. Useful in tests to avoid cross-test cache sharing. |

Return Values

The return type is identical to useStream except:

| Removed | Reason |
| --- | --- |
| isLoading | Replaced by isStreaming; initial loading is handled by Suspense. |
| error | Thrown to the nearest Error Boundary instead. |
| isThreadLoading | Handled by Suspense (the component suspends until the thread is ready). |

| Added | Type | Description |
| --- | --- | --- |
| isStreaming | boolean | true while the stream is receiving data. The component is never suspended during streaming. |

All other properties (messages, submit, stop, interrupt, branch, switchThread, queue, etc.) are unchanged.

Thread-switching with Suspense

useSuspenseStream works naturally with thread switching. When the threadId changes, the component suspends while the new thread's history loads, and <Suspense> shows a smooth skeleton/fallback transition:

function App() {
  const [threadId, setThreadId] = useState<string | null>(null);

  return (
    <div className="flex">
      <ThreadSidebar onSelect={setThreadId} />

      <Suspense fallback={<ThreadSkeleton />}>
        <ChatPanel threadId={threadId} />
      </Suspense>
    </div>
  );
}

function ChatPanel({ threadId }: { threadId: string | null }) {
  const { messages, submit, isStreaming } = useSuspenseStream({
    assistantId: "agent",
    apiUrl: "http://localhost:2024",
    threadId,
  });

  return <MessageList messages={messages} />;
}

Error recovery

When an error is thrown to an Error Boundary, call invalidateSuspenseCache() in the boundary's reset handler so the retry triggers a fresh data fetch:

import { invalidateSuspenseCache } from "@langchain/react";

<ErrorBoundary
  onReset={() => invalidateSuspenseCache()}
  fallbackRender={({ error, resetErrorBoundary }) => (
    <div>
      <p>{error.message}</p>
      <button onClick={resetErrorBoundary}>Retry</button>
    </div>
  )}
>
  <Suspense fallback={<Spinner />}>
    <Chat />
  </Suspense>
</ErrorBoundary>

For test isolation, you can create and pass a dedicated cache instance:

import { createSuspenseCache, useSuspenseStream } from "@langchain/react";

const suspenseCache = createSuspenseCache();

function Chat() {
  const stream = useSuspenseStream({
    assistantId: "agent",
    apiUrl: "http://localhost:2024",
    suspenseCache,
  });
  // ...
}

Type Safety

With createAgent

When using createAgent, pass typeof agent to automatically infer tool call types:

import type { agent } from "./agent";

function Chat() {
  const stream = useStream<typeof agent>({
    assistantId: "agent",
    apiUrl: "http://localhost:2024",
  });

  // stream.messages, tool calls, etc. are fully typed
}

With StateGraph

For custom graphs, provide your state type directly:

import type { BaseMessage } from "langchain";

interface MyState {
  messages: BaseMessage[];
  context?: string;
}

function Chat() {
  const { messages, submit } = useStream<MyState>({
    assistantId: "my-graph",
    apiUrl: "http://localhost:2024",
  });
}

Typed Interrupts

Pass interrupt types via the second generic parameter:

const { interrupt, submit } = useStream<
  MyState,
  { InterruptType: { question: string } }
>({
  assistantId: "my-graph",
  apiUrl: "http://localhost:2024",
});

if (interrupt) {
  // interrupt.value is typed as { question: string }
}

Handling Interrupts

Interrupts let you pause graph execution and wait for user input:

function Chat() {
  const { messages, interrupt, submit } = useStream<
    { messages: BaseMessage[] },
    { InterruptType: { question: string } }
  >({
    assistantId: "agent",
    apiUrl: "http://localhost:2024",
  });

  return (
    <div>
      {messages.map((msg, i) => (
        <div key={msg.id ?? i}>{msg.content}</div>
      ))}

      {interrupt && (
        <div>
          <p>{interrupt.value.question}</p>
          <button
            onClick={() =>
              void submit(null, { command: { resume: "Approved" } })
            }
          >
            Approve
          </button>
        </div>
      )}

      <button
        onClick={() =>
          void submit({
            messages: [{ type: "human", content: "Hello" }],
          })
        }
      >
        Send
      </button>
    </div>
  );
}

Branching

Enable conversation branching by setting fetchStateHistory: true:

function Chat() {
  const { messages, submit, getMessagesMetadata, setBranch } = useStream({
    assistantId: "agent",
    apiUrl: "http://localhost:2024",
    fetchStateHistory: true,
  });

  return (
    <div>
      {messages.map((msg, i) => {
        const metadata = getMessagesMetadata(msg, i);
        const branchOptions = metadata?.branchOptions;
        const branch = metadata?.branch;

        return (
          <div key={msg.id ?? i}>
            <p>{msg.content}</p>
            {branchOptions && branch && (
              <div>
                <button onClick={() => {
                  const prev = branchOptions[branchOptions.indexOf(branch) - 1];
                  if (prev) setBranch(prev);
                }}>
                  Previous
                </button>
                <button onClick={() => {
                  const next = branchOptions[branchOptions.indexOf(branch) + 1];
                  if (next) setBranch(next);
                }}>
                  Next
                </button>
              </div>
            )}
          </div>
        );
      })}
    </div>
  );
}

Server-Side Queuing

When submit() is called while a stream is already active, the SDK automatically creates the run on the server with multitaskStrategy: "enqueue". The pending runs are tracked in queue and processed in order as each finishes:

function Chat() {
  const { messages, submit, isLoading, queue, switchThread } = useStream({
    assistantId: "agent",
    apiUrl: "http://localhost:2024",
  });

  return (
    <div>
      {messages.map((msg, i) => (
        <div key={msg.id ?? i}>{msg.content}</div>
      ))}

      {queue.size > 0 && (
        <div>
          <p>{queue.size} message(s) queued</p>
          <button onClick={() => void queue.clear()}>Clear Queue</button>
        </div>
      )}

      <button
        disabled={isLoading}
        onClick={() =>
          void submit({
            messages: [{ type: "human", content: "Hello!" }],
          })
        }
      >
        Send
      </button>
      <button onClick={() => switchThread(null)}>New Thread</button>
    </div>
  );
}

Switching threads via switchThread() cancels all pending runs and clears the queue.

Custom Transport

Instead of connecting to a LangGraph API, you can provide your own streaming transport. Pass a transport object instead of assistantId to use a custom backend:

import { useStream, FetchStreamTransport } from "@langchain/react";
import type { BaseMessage } from "langchain";

function Chat() {
  const {
    messages,
    submit,
    isLoading,
    branch,
    setBranch,
    getMessagesMetadata,
  } = useStream<{ messages: BaseMessage[] }>({
    transport: new FetchStreamTransport({
      url: "https://my-api.example.com/stream",
    }),
    threadId: null,
    onThreadId: (id) => console.log("Thread created:", id),
  });

  return (
    <div>
      {messages.map((msg, i) => {
        const metadata = getMessagesMetadata(msg, i);
        return (
          <div key={msg.id ?? i}>
            <p>{msg.content}</p>
            {metadata?.streamMetadata && (
              <span>Node: {metadata.streamMetadata.langgraph_node}</span>
            )}
          </div>
        );
      })}

      <p>Current branch: {branch}</p>

      <button
        disabled={isLoading}
        onClick={() =>
          void submit({
            messages: [{ type: "human", content: "Hello!" }],
          })
        }
      >
        Send
      </button>
    </div>
  );
}

A custom transport exposes the same API surface as the standard useStream hook, including getMessagesMetadata, branch, setBranch, switchThread, and all message/interrupt/subagent helpers. With a custom transport, getMessagesMetadata returns the stream metadata sent alongside messages during streaming, while branch and setBranch manage branch state locally. onFinish is also supported: it receives a synthetic ThreadState built from the final locally streamed values, and its run-metadata argument is undefined.

Sharing State with StreamProvider

When multiple components in a tree need access to the same stream (a message list, a header with loading status, an input bar), use StreamProvider and useStreamContext to avoid prop drilling:

import { StreamProvider, useStreamContext } from "@langchain/react";

function App() {
  return (
    <StreamProvider assistantId="agent" apiUrl="http://localhost:2024">
      <ChatHeader />
      <MessageList />
      <MessageInput />
    </StreamProvider>
  );
}

function ChatHeader() {
  const { isLoading, error } = useStreamContext();
  return (
    <header>
      <h1>Chat</h1>
      {isLoading && <span>Thinking...</span>}
      {error != null && <span>Error occurred</span>}
    </header>
  );
}

function MessageList() {
  const { messages, getMessagesMetadata } = useStreamContext();
  return (
    <div>
      {messages.map((msg, i) => (
        <div key={msg.id ?? i}>{msg.content}</div>
      ))}
    </div>
  );
}

function MessageInput() {
  const { submit, isLoading } = useStreamContext();
  return (
    <button
      disabled={isLoading}
      onClick={() =>
        void submit({
          messages: [{ type: "human", content: "Hello!" }],
        })
      }
    >
      Send
    </button>
  );
}

Type Safety with StreamProvider

Pass agent or state types to both StreamProvider and useStreamContext:

import type { agent } from "./agent";

function App() {
  return (
    <StreamProvider<typeof agent>
      assistantId="agent"
      apiUrl="http://localhost:2024"
    >
      <Chat />
    </StreamProvider>
  );
}

function Chat() {
  const { toolCalls } = useStreamContext<typeof agent>();
  // toolCalls are fully typed from the agent's tools
}

Multiple Agents

Nest providers for multi-agent scenarios — each subtree gets its own isolated stream:

function MultiAgentApp() {
  return (
    <div style={{ display: "grid", gridTemplateColumns: "1fr 1fr" }}>
      <StreamProvider assistantId="researcher" apiUrl="http://localhost:2024">
        <ResearchPanel />
      </StreamProvider>
      <StreamProvider assistantId="writer" apiUrl="http://localhost:2024">
        <WriterPanel />
      </StreamProvider>
    </div>
  );
}

Playground

For complete end-to-end examples with full agentic UIs, visit the LangChain UI Playground.

License

MIT

Classes

  • FetchStreamTransport: Transport used to stream the thread.
  • SubagentManager: Manages subagent execution state.

Functions

  • calculateDepthFromNamespace: Calculates the depth of a subagent based on its namespace.
  • createSuspenseCache
  • executeHeadlessTool
  • extractParentIdFromNamespace: Extracts the parent tool call ID from a namespace.
  • extractToolCallIdFromNamespace: Extracts the tool call ID from a namespace path.
  • filterOutHeadlessToolInterrupts: Strips headless-tool interrupts from a user-facing interrupt list.
  • findHeadlessTool
  • flushPendingHeadlessToolInterrupts: Execute and resume all newly seen headless-tool interrupts from a values
  • handleHeadlessToolInterrupt
  • headlessToolResumeCommand
  • invalidateSuspenseCache: Clears the internal Suspense cache used by useSuspenseStream.
  • isHeadlessToolInterrupt
  • isSubagentNamespace: Checks if a namespace indicates a subagent/subgraph message.
  • parseHeadlessToolInterruptPayload: Parses a headless-tool interrupt value from the graph. Accepts both
  • StreamProvider: Provides a shared useStream instance to all descendants via React Context.
  • useStream: A React hook that provides seamless integration with LangGraph streaming capabilities.
  • useStreamContext: Accesses the shared stream instance from the nearest StreamProvider.
  • useSuspenseStream: A Suspense-compatible variant of useStream for LangGraph Platform.

Interfaces

  • AgentTypeConfigLike: Minimal interface matching the structure of AgentTypeConfig from @langchain/langgraph.
  • BaseStream: Base stream interface shared by all stream types.
  • CompiledSubAgentLike: Minimal interface matching the structure of a CompiledSubAgent from deepagents.
  • DeepAgentTypeConfigLike: Minimal interface matching the structure of DeepAgentTypeConfig from deepagents.
  • FlushPendingHeadlessToolInterruptsOptions
  • HeadlessToolImplementation: Client-side implementation returned by headlessTool.implement(...).
  • HeadlessToolInterrupt: Represents a headless tool interrupt payload emitted by LangChain's
  • QueueEntry: A single queued submission entry representing a server-side pending run.
  • QueueInterface: Reactive interface exposed to framework consumers for observing
  • SubAgentLike: Minimal interface matching the structure of a SubAgent from deepagents.
  • SubagentStreamInterface: Base interface for a single subagent stream.
  • SubagentToolCall: Represents a tool call that initiated a subagent.
  • ToolEvent
  • UseAgentStream: Stream interface for ReactAgent instances created with createAgent.
  • UseAgentStreamOptions: Options for configuring an agent stream.
  • UseDeepAgentStream: Stream interface for DeepAgent instances created with createDeepAgent.
  • UseDeepAgentStreamOptions: Options for configuring a deep agent stream.
  • UseStream
  • UseStreamOptions
  • UseStreamThread
  • UseStreamTransport: Transport used to stream the thread.

Types

  • AnyHeadlessToolImplementation
  • BaseSubagentState: Base state type for subagents.
  • ClassSubagentStreamInterface: Subagent stream interface with messages typed as BaseMessage[]
  • DefaultSubagentStates: Default subagent state map used when no specific subagent types are provided.
  • DefaultToolCall: Default tool call type when no specific tool definitions are provided.
  • ExtractAgentConfig: Extract the AgentTypeConfig from an agent-like type.
  • ExtractDeepAgentConfig: Extract the DeepAgentTypeConfig from a DeepAgent-like type.
  • ExtractSubAgentMiddleware: Helper type to extract middleware from a SubAgent definition.
  • GetToolCallsType: Extract the tool call type from a StateType's messages property.
  • InferAgentToolCalls: Extract tool calls type from an agent's tools.
  • InferBag: Infer the Bag type from an agent, defaulting to the provided Bag.
  • InferDeepAgentSubagents: Extract the Subagents array type from a DeepAgent.
  • InferNodeNames: Infer the node names from a compiled graph.
  • InferStateType: Infer the state type from an agent, graph, or direct state type.
  • InferSubagentByName: Helper type to extract a subagent by name from a DeepAgent.
  • InferSubagentNames: Extract all subagent names as a string union from a DeepAgent.
  • InferSubagentState: Infer the state type for a specific subagent by extracting and merging
  • InferSubagentStates: Infer subagent state map from a DeepAgent.
  • InferToolCalls: Infer tool call types from an agent.
  • IsAgentLike: Check if a type is agent-like (has ~agentTypes phantom property).
  • IsDeepAgentLike: Check if a type is a DeepAgent (has ~deepAgentTypes phantom property).
  • MessageMetadata
  • OnToolCallback
  • ResolveStreamInterface: Resolves the appropriate stream interface based on the agent/graph type.
  • ResolveStreamOptions: Resolves the appropriate options interface based on the agent/graph type.
  • StreamProviderCustomProps: Props for the StreamProvider component when using a custom transport.
  • StreamProviderProps: Props for the StreamProvider component.
  • SubagentStateMap: Create a map of subagent names to their state types.
  • SubagentStatus: The execution status of a subagent.
  • SubagentStream: Represents a single subagent stream.
  • SuspenseCache
  • ToolCallFromTool: Infer a tool call type from a single tool.
  • ToolCallsFromTools: Infer a union of tool call types from an array of tools.
  • ToolCallState: The lifecycle state of a tool call.
  • ToolCallWithResult
  • UseStreamCustom
  • UseStreamCustomOptions
  • UseSuspenseStream: Return type for useSuspenseStream.