LangChain Reference
JavaScript › langchain › tools
Module • Since v0.3

LangChain Tools

This module provides tool utilities for LangChain agents.

import { ... } from "langchain/tools";

Used in Docs

  • Anthropic integration
  • OpenAI integration

Type Aliases

typeAlias HeadlessTool: DynamicStructuredTool<SchemaT, InferInteropZodOutput<SchemaT>, InferInteropZodInput<SchemaT>, unknown, unknown, NameT> & __type

A headless tool that always interrupts agent execution on the server. The implementation is provided separately on the client via useStream({ tools: [...] }) using .implement().

typeAlias HeadlessToolFields

Configuration fields for creating a headless tool.

typeAlias HeadlessToolImplementation

A tool implementation that pairs a headless tool with its execution function. Created by calling .implement() on a HeadlessTool. Pass to useStream({ tools: [...] }) on the client side.

Variables

variable tool: HeadlessToolOverload & typeof coreTool

Unified tool primitive for LangChain agents. Enhances the tool function from @langchain/core/tools with a headless overload: when called without an implementation function, the tool interrupts agent execution and lets the client supply the implementation.
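The overload can be pictured as a runtime check on the first argument: a function means "normal tool", a fields object alone means "headless". A minimal self-contained sketch of that dispatch pattern — not the library's actual implementation; `makeTool`, `NormalTool`, and `HeadlessStub` are hypothetical names used only for illustration:

```typescript
// Hypothetical sketch: an overloaded factory where a function first
// argument yields a normal tool, and a fields object alone yields a
// headless stub that must later be completed via .implement().
type Fields = { name: string; description: string };
type Impl = (input: unknown) => Promise<unknown>;

type NormalTool = { kind: "normal"; name: string; invoke: Impl };
type HeadlessStub = {
  kind: "headless";
  name: string;
  implement: (fn: Impl) => NormalTool;
};

function makeTool(fn: Impl, fields: Fields): NormalTool;
function makeTool(fields: Fields): HeadlessStub;
function makeTool(first: Impl | Fields, fields?: Fields): NormalTool | HeadlessStub {
  if (typeof first === "function") {
    // Normal form: implementation supplied up front.
    return { kind: "normal", name: fields!.name, invoke: first };
  }
  // Headless form: only the shape is defined; execution is deferred
  // until the client pairs it with a function.
  return {
    kind: "headless",
    name: first.name,
    implement: (fn) => ({ kind: "normal", name: first.name, invoke: fn }),
  };
}
```

The real `tool` does considerably more (schema validation, interrupt plumbing), but the shape of the overload is the same: the presence or absence of the implementation function selects the branch.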


Normal tool — pass an implementation function as the first argument:

import { tool } from "langchain/tools";
import { z } from "zod";

const getWeather = tool(
  async ({ city }) => `The weather in ${city} is sunny.`,
  {
    name: "get_weather",
    description: "Get the weather for a city",
    schema: z.object({ city: z.string() }),
  }
);

Headless tool — omit the implementation; the client provides it later:

import { tool } from "langchain/tools";
import { z } from "zod";

// Server: define the tool shape — no implementation needed
export const getLocation = tool({
  name: "get_location",
  description: "Get the user's current GPS location",
  schema: z.object({
    highAccuracy: z.boolean().optional().describe("Request high accuracy GPS"),
  }),
});

// Server: register with the agent
import { createAgent } from "langchain";

const agent = createAgent({
  model: "openai:gpt-4o",
  tools: [getLocation],
});

// Client: provide the implementation in useStream
import { useStream } from "@langchain/langgraph-sdk/react";

const stream = useStream({
  assistantId: "agent",
  tools: [
    getLocation.implement(async ({ highAccuracy }) => {
      return new Promise((resolve, reject) => {
        navigator.geolocation.getCurrentPosition(
          (pos) => resolve({
            latitude: pos.coords.latitude,
            longitude: pos.coords.longitude,
          }),
          (err) => reject(new Error(err.message)),
          { enableHighAccuracy: highAccuracy }
        );
      });
    }),
  ],
});
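The client implementation above wraps a callback-style browser API (navigator.geolocation.getCurrentPosition) in a Promise so it can be returned from the async tool function. The same pattern works for any success/error callback API; a self-contained sketch, where the fake `locate` function stands in for the geolocation call:

```typescript
// Generic helper: turn a (onSuccess, onError) callback API into a Promise.
type Callbackish<T> = (
  onSuccess: (value: T) => void,
  onError: (err: Error) => void
) => void;

function promisify<T>(call: Callbackish<T>): Promise<T> {
  return new Promise((resolve, reject) => call(resolve, reject));
}

// Stand-in for navigator.geolocation.getCurrentPosition — invokes the
// success callback immediately with fixed coordinates.
const locate: Callbackish<{ latitude: number; longitude: number }> = (ok) =>
  ok({ latitude: 48.85, longitude: 2.35 });

// Stand-in for a denied permission prompt — invokes the error callback.
const denied: Callbackish<never> = (_ok, err) => err(new Error("denied"));
```

`await promisify(locate)` resolves with the coordinates object, while `await promisify(denied)` rejects — mirroring how the geolocation implementation in the example resolves with `pos.coords` data or rejects with the error message.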