LangChain Reference
Module · Since v1.2

browser

import { ... } from "langchain/browser";

Functions

function context → string
function fakeModel → FakeBuiltModel
function filterMessages → Runnable<BaseMessage<MessageStructure<MessageToolSet>, MessageType>[], BaseMessage<MessageStructure<MessageToolSet>, MessageType>[]>
function initChatModel → Promise<ConfigurableModel<RunInput, CallOptions>>
function tool → DynamicTool<InferToolOutputFromFunc<FuncT>, InferToolEventFromFunc<FuncT>>
function trimMessages → Runnable<BaseMessage<MessageStructure<MessageToolSet>, MessageType>[], BaseMessage<MessageStructure<MessageToolSet>, MessageType>[]>
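As a minimal sketch of using initChatModel from this entry point, assuming the matching integration package (here @langchain/openai) is installed; the model name, provider string, and prompt below are illustrative, not part of this reference:

```typescript
// Sketch: initialize a chat model by name and provider, then invoke it.
// Assumes the @langchain/openai integration package is installed;
// "gpt-4o-mini" and the options shown are illustrative.
import { initChatModel, HumanMessage } from "langchain/browser";

const model = await initChatModel("gpt-4o-mini", {
  modelProvider: "openai",
  temperature: 0,
});

const reply = await model.invoke([new HumanMessage("Say hello in one word.")]);
console.log(reply.content);
```

Because the returned model is configurable, the same code path works for any provider whose integration package is installed.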

Classes

class AIMessage
class AIMessageChunk
class BaseMessage
class BaseMessageChunk
class Document
class DynamicStructuredTool
class DynamicTool
class HumanMessage
class HumanMessageChunk
class InMemoryStore
class StructuredTool
class SystemMessage
class SystemMessageChunk
class Tool
class ToolMessage
class ToolMessageChunk

Interfaces

interface ContentBlock
interface DocumentInput
interface LangChainMatchers

Type Aliases
typeAlias ToolRuntime: RunnableConfig & { ... }

Runtime context automatically injected into tools.

When a tool function declares a parameter typed as ToolRuntime, the tool execution system automatically injects an instance containing:

  • state: The current graph state
  • toolCallId: The ID of the current tool call
  • config: RunnableConfig for the current execution
  • context: Runtime context
  • store: BaseStore instance for persistent storage
  • writer: Stream writer for streaming output

No wrapper type is needed; just declare a runtime: ToolRuntime parameter.
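As a hedged sketch of the above, assuming the runtime instance is received through a ToolRuntime-typed parameter on the tool's handler; the tool name, schema, and store namespace below are illustrative:

```typescript
// Sketch: a tool whose handler declares a ToolRuntime parameter.
// The runtime is injected automatically; no wrapper type is required.
// Assumption: the handler receives the runtime as its second argument.
import { tool } from "langchain/browser";
import type { ToolRuntime } from "langchain/browser";
import * as z from "zod";

const saveNote = tool(
  async ({ note }: { note: string }, runtime: ToolRuntime) => {
    // Fields described above: toolCallId, state, config, context, store, writer.
    await runtime.store?.put(["notes"], runtime.toolCallId ?? "note", { note });
    return "saved";
  },
  {
    name: "save_note",
    description: "Persist a note to the store.",
    schema: z.object({ note: z.string() }),
  }
);
```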

Variables

variable langchainMatchers: { ... }

All matcher functions bundled for convenient use with expect.extend().
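For example, in a Jest- or Vitest-style test setup file (assuming a compatible expect implementation; the vitest import is illustrative):

```typescript
// Sketch: register the bundled LangChain matchers once in test setup,
// so they are available on every expect() call in the suite.
import { expect } from "vitest";
import { langchainMatchers } from "langchain/browser";

expect.extend(langchainMatchers);
```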

Descriptions for the exports listed above:

  • context: LangChain utilities.
  • fakeModel: LangChain Testing Utilities.
  • filterMessages: LangChain Messages.
  • initChatModel: Initialize a ChatModel from the model name and provider. Must have the integration package corresponding to the model provider installed.
  • tool: LangChain Tools.
  • trimMessages: LangChain Messages.
  • AIMessage: LangChain Messages.
  • AIMessageChunk: Represents a chunk of an AI message, which can be concatenated with other AI message chunks.
  • BaseMessage: Base class for all types of messages in a conversation. It includes properties like content, name, and additional_kwargs, and methods like toDict() and _getType().
  • BaseMessageChunk: Represents a chunk of a message, which can be concatenated with other message chunks. It includes a _merge_kwargs_dict() method for merging additional keyword arguments from another BaseMessageChunk into this one, and overrides __add__() to support concatenation of BaseMessageChunk instances.
  • Document: Interface for interacting with a document.
  • DynamicStructuredTool: A tool that can be created dynamically from a function, name, and description, designed to work with structured data. It extends the StructuredTool class and overrides the _call method to execute the provided function when the tool is called. Schema can be passed as Zod or JSON schema; the tool will not validate input if JSON schema is passed.
  • DynamicTool: A tool that can be created dynamically from a function, name, and description.
  • HumanMessage: Represents a human message in a conversation.
  • HumanMessageChunk: Represents a chunk of a human message, which can be concatenated with other human message chunks.
  • InMemoryStore: In-memory implementation of BaseStore backed by a dictionary. Used for storing key-value pairs in memory.
  • StructuredTool: Base class for tools that accept input of any shape defined by a Zod schema.
  • SystemMessage: Represents a system message in a conversation.
  • SystemMessageChunk: Represents a chunk of a system message, which can be concatenated with other system message chunks.
  • Tool: Base class for tools that accept input as a string.
  • ToolMessage: Represents a tool message in a conversation.
  • ToolMessageChunk: Represents a chunk of a tool message, which can be concatenated with other tool message chunks.
  • ContentBlock: LangChain Messages.
  • DocumentInput: LangChain Documents.
  • LangChainMatchers: LangChain Testing Utilities.