@langchain/core / utils / async_caller
Class · Since v1.0

AsyncCaller

A class that can be used to make async calls with concurrency and retry logic.

This is useful for calling any kind of "expensive" external resource, whether because it is rate-limited, prone to network failures, or otherwise costly to reach.

Concurrent calls are limited by the maxConcurrency parameter, which defaults to Infinity. This means that by default, all calls will be made in parallel.

Retries are limited by the maxRetries parameter, which defaults to 6. This means that by default, each call will be retried up to 6 times, with an exponential backoff between each attempt.
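The retry behavior can be sketched in plain TypeScript. The `callWithRetries` helper below is a simplified illustration of exponential-backoff retries, not the library's actual implementation, and the names and delay values are hypothetical:

```typescript
// Simplified sketch: retry a failing async call with exponential backoff.
// One initial attempt plus up to `maxRetries` retries, doubling the delay
// between attempts (100ms, 200ms, 400ms, ...).
async function callWithRetries<T>(
  fn: () => Promise<T>,
  maxRetries = 6,
  baseDelayMs = 100,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= maxRetries; attempt += 1) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt === maxRetries) break; // retries exhausted
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastError;
}
```

Doubling the delay between attempts gives a rate-limited or briefly unavailable service progressively more time to recover before the next try.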

class AsyncCaller

Constructors

constructor

Properties

property
maxConcurrency: number | undefined

Maximum number of parallel calls to make.
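The effect of maxConcurrency can be sketched with a minimal promise queue that caps how many calls run at once. This is an illustration of the bounded-concurrency idea only, not the library's implementation:

```typescript
// Minimal bounded-concurrency queue: at most `limit` calls run at a
// time; further calls wait until a running one settles.
class ConcurrencyLimiter {
  private active = 0;
  private waiting: Array<() => void> = [];

  constructor(private readonly limit: number) {}

  async run<T>(fn: () => Promise<T>): Promise<T> {
    if (this.active >= this.limit) {
      // Park this call until a slot frees up.
      await new Promise<void>((resolve) => this.waiting.push(resolve));
    }
    this.active += 1;
    try {
      return await fn();
    } finally {
      this.active -= 1;
      this.waiting.shift()?.(); // wake the next queued call, if any
    }
  }
}
```

With the default of Infinity, no call ever waits, which matches the all-in-parallel behavior described above.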

property
maxRetries: number | undefined

The maximum number of retries that can be made for a single call, with an exponential backoff between each attempt. Defaults to 6.

property
onFailedAttempt: FailedAttemptHandler | undefined

Custom handler to handle failed attempts. Takes the originally thrown error object as input, and should itself throw an error if the input error is not retryable.
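A handler following this contract might rethrow errors that retrying cannot fix. The error shape below (an HTTP-style `status` field) is hypothetical, and the function is a sketch of the contract described above, not the library's FailedAttemptHandler type itself:

```typescript
// Illustrative failed-attempt handler: rethrow client errors (4xx),
// which retrying cannot fix, and return normally for transient server
// errors so the retry machinery tries again.
interface HttpLikeError extends Error {
  status?: number; // hypothetical field carrying an HTTP status code
}

function onFailedAttempt(error: HttpLikeError): void {
  if (error.status !== undefined && error.status >= 400 && error.status < 500) {
    throw error; // not retryable: propagate and stop retrying
  }
  // Returning without throwing signals the error is retryable.
}
```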

Methods

method
callWithOptions→ Promise<Awaited<ReturnType<T>>>
method
fetch→ Promise<Response>
deprecated method
call→ Promise<Awaited<ReturnType<T>>>
View source on GitHub