LangChain Reference
langsmith.schemas.BatchIngestConfig
Class · Since v0.0

BatchIngestConfig

Configuration for batch ingestion.

BatchIngestConfig()

Bases

TypedDict

Constructors

__init__

Name                        Type
use_multipart_endpoint      bool
scale_up_qsize_trigger      int
scale_up_nthreads_limit     int
scale_down_nempty_trigger   int
size_limit                  int
size_limit_bytes            Optional[int]

Attributes

use_multipart_endpoint: bool
Whether to use the multipart endpoint for batch ingestion.

scale_up_qsize_trigger: int
The queue size threshold that triggers scaling up.

scale_up_nthreads_limit: int
The maximum number of threads to scale up to.

scale_down_nempty_trigger: int
The number of empty threads that triggers scaling down.

size_limit: int
The maximum size limit for the batch.

size_limit_bytes: Optional[int]
The maximum size limit in bytes for the batch.
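Because BatchIngestConfig is a TypedDict, it is constructed as a plain dict literal and behaves like an ordinary dict at runtime. The sketch below redefines the documented fields locally so it runs standalone (the real class lives in langsmith.schemas); the numeric values are illustrative assumptions, not SDK defaults.

```python
from typing import Optional, TypedDict


# Local sketch mirroring the documented fields of
# langsmith.schemas.BatchIngestConfig, so this example is self-contained.
# total=False lets callers supply only the keys they want to override.
class BatchIngestConfig(TypedDict, total=False):
    use_multipart_endpoint: bool
    scale_up_qsize_trigger: int
    scale_up_nthreads_limit: int
    scale_down_nempty_trigger: int
    size_limit: int
    size_limit_bytes: Optional[int]


# Illustrative values only — tune to your own ingestion workload.
config: BatchIngestConfig = {
    "use_multipart_endpoint": True,
    "scale_up_qsize_trigger": 1000,      # queued items before adding threads
    "scale_up_nthreads_limit": 16,       # upper bound on worker threads
    "scale_down_nempty_trigger": 4,      # empty threads before scaling down
    "size_limit": 100,                   # max items per batch
    "size_limit_bytes": 20_971_520,      # max batch payload, ~20 MB
}

# At runtime a TypedDict is just a dict, so normal dict access applies.
print(config["size_limit"])
```

A TypedDict adds static type checking (e.g. under mypy or pyright) without any runtime overhead, which is why all keys here are read with plain subscript access.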
