Draw the graph as a Mermaid string.
Optional params: Parameters for the drawMermaid method.
Optional backgroundColor?: string. The background color of the graph.
Optional curveStyle?: string. The style of the graph's curves.
Optional nodeColors?: Record<string, string>. The colors of the graph's nodes.
Optional withStyles?: boolean. Whether to include styles in the graph.
Optional wrapLabelNWords?: number. The maximum number of words to wrap in a node's label.
Mermaid string
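A minimal sketch of rendering the agent's graph as Mermaid text, assuming agent is a constructed ReactAgent (as in the invoke example further below) and that the call is synchronous; the option values shown are illustrative:

// Render the agent's graph as Mermaid markup (option values are illustrative).
const mermaid = agent.drawMermaid({
  backgroundColor: "white",
  curveStyle: "linear",
  withStyles: true,
  wrapLabelNWords: 3,
});
console.log(mermaid); // e.g. "graph TD; ..." ready to paste into a Mermaid renderer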
Visualize the graph as a PNG image.
Optional params: Parameters for the drawMermaidPng method.
Optional backgroundColor?: string. The background color of the graph.
Optional curveStyle?: string. The style of the graph's curves.
Optional nodeColors?: Record<string, string>. The colors of the graph's nodes.
Optional withStyles?: boolean. Whether to include styles in the graph.
Optional wrapLabelNWords?: number. The maximum number of words to wrap in a node's label.
PNG image as a buffer
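A sketch of writing the rendered PNG to disk, assuming the returned buffer can be passed straight to Node's fs API; the node names used in nodeColors are assumptions about the graph's node labels:

import { writeFile } from "node:fs/promises";

// Render the graph to PNG bytes and save them (colors and node names are illustrative).
const png = await agent.drawMermaidPng({
  backgroundColor: "transparent",
  nodeColors: { agent: "#cfe8ff", tools: "#ffe0b2" },
});
await writeFile("agent-graph.png", png);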
Executes the agent with the given state and returns the final state after all processing.
This method runs the agent's entire workflow to completion, including calling the model, executing any requested tool calls, and accumulating the resulting messages in state.
The initial state for the agent execution: a messages array plus any middleware-specific state properties.
Optional config: InvokeConfiguration. Optional runtime configuration including:
The context for the agent execution.
LangGraph configuration options like thread_id, run_id, etc.
The store used to persist state across agent executions; see Memory storage for details.
An optional AbortSignal for the agent execution.
The recursion limit for the agent execution.
A Promise that resolves to the final agent state after execution completes.
The returned state includes:
- a messages property containing an array with all messages (input, AI responses, tool calls/results)
- a structuredResponse property containing the structured response (if configured)
- all state values defined in the middleware
import { z } from "zod";

// Assumes myModel, calculator, and webSearch are defined elsewhere.
const agent = new ReactAgent({
  llm: myModel,
  tools: [calculator, webSearch],
  responseFormat: z.object({
    weather: z.string(),
  }),
});

const result = await agent.invoke({
  messages: [{ role: "human", content: "What's the weather in Paris?" }],
});

console.log(result.structuredResponse.weather); // outputs: "It's sunny and 75°F."
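The optional second argument carries the runtime configuration described above. A sketch, assuming standard LangGraph option names (configurable.thread_id requires a configured checkpointer or store):

// Invoke with runtime configuration (option names follow the descriptions above).
const controller = new AbortController();
const followUp = await agent.invoke(
  { messages: [{ role: "human", content: "And what about London?" }] },
  {
    configurable: { thread_id: "conversation-1" }, // LangGraph configuration options
    signal: controller.signal,                     // AbortSignal to cancel the run
    recursionLimit: 25,                            // recursion limit for the execution
  }
);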
Executes the agent with streaming, returning an async iterable of state updates as they occur.
This method runs the agent's workflow like invoke, but instead of waiting for
completion it streams high-level state updates in real time, so you can observe intermediate progress and react to partial results as they arrive.
For more granular event-level streaming (like individual LLM tokens), use streamEvents instead.
The initial state for the agent execution: a messages array plus any middleware-specific state properties.
Optional config: StreamConfiguration. Optional runtime configuration including:
The context for the agent execution.
LangGraph configuration options like thread_id, run_id, etc.
The store used to persist state across agent executions; see Memory storage for details.
An optional AbortSignal for the agent execution.
The streaming mode for the agent execution; see Supported stream modes for details.
The recursion limit for the agent execution.
A Promise that resolves to an IterableReadableStream of state updates. Each update contains the current state after a node completes.
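A sketch of consuming the returned stream, assuming the "updates" value from Supported stream modes:

const stream = await agent.stream(
  { messages: [{ role: "human", content: "What's the weather in Paris?" }] },
  { streamMode: "updates" }
);

for await (const chunk of stream) {
  // Each chunk is the state update emitted after a node completes.
  console.log(chunk);
}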
Get the compiled StateGraph.
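A sketch of inspecting the compiled graph; the nodes property is an assumption about the returned object's shape:

const graph = agent.getGraph();
console.log(Object.keys(graph.nodes)); // node names in the compiled graph (assumed property)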