LangChain Reference
langchain › ReactAgent › invoke (JavaScript)
Method · Since v1.0

invoke

Executes the agent with the given state and returns the final state after all processing.

This method runs the agent's entire workflow synchronously, including:

  • Processing the input messages through any configured middleware
  • Calling the language model to generate responses
  • Executing any tool calls made by the model
  • Running all middleware hooks (beforeModel, afterModel, etc.)
invoke(
  state: InvokeStateParameter<Types>,
  config?: InvokeConfiguration<InferContextInput<Types["Context"] extends InteropZodObject | AnyAnnotationRoot ? any[any] : AnyAnnotationRoot> & InferMiddlewareContextInputs<Types["Middleware"]>>
): Promise<MergedAgentState<Types>>

Parameters

Name · Type · Description

state* — InvokeStateParameter<Types>

The initial state for the agent execution. Can be:

  • An object containing a messages array and any middleware-specific state properties
  • A Command object for more advanced control flow

config — InvokeConfiguration<InferContextInput<Types["Context"] extends InteropZodObject | AnyAnnotationRoot ? any[any] : AnyAnnotationRoot> & InferMiddlewareContextInputs<Types["Middleware"]>>

Optional runtime configuration.
Example

const agent = new ReactAgent({
  llm: myModel,
  tools: [calculator, webSearch],
  responseFormat: z.object({
    weather: z.string(),
  }),
});

const result = await agent.invoke({
  messages: [{ role: "human", content: "What's the weather in Paris?" }]
});

console.log(result.structuredResponse.weather); // outputs: "It's sunny and 75°F."