LangChain Reference
langchain-core › language_models › chat_model_stream › AsyncChatModelStream › aclose
Method · Since v1.3

    aclose

    aclose(
        self,
    ) -> None

    Cancel the background producer task and release resources.

    If a consumer cancels mid-stream or decides to stop iterating early, the producer task keeps pumping the provider HTTP call to completion because asyncio.Task has no implicit link to its awaiter. Call this method to cancel the producer explicitly; the stream transitions to an errored state with CancelledError.
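The decoupling described above is easy to reproduce in plain asyncio. A minimal sketch (the queue and producer below are stand-ins for illustration, not the real stream internals):

```python
import asyncio

async def demo() -> str:
    chunks: asyncio.Queue[int] = asyncio.Queue()

    async def producer() -> None:
        # Stand-in for the provider HTTP call: it keeps pumping
        # chunks whether or not anyone is still reading them.
        for i in range(1_000):
            await chunks.put(i)
            await asyncio.sleep(0)

    task = asyncio.create_task(producer())

    # The consumer stops after one chunk. The task is unaffected,
    # because an asyncio.Task has no implicit link to its awaiter.
    await chunks.get()
    await asyncio.sleep(0)
    assert not task.done()  # still producing

    # Explicit cancellation -- what aclose() does for the stream.
    task.cancel()
    try:
        await task
    except asyncio.CancelledError:
        pass
    return "cancelled" if task.cancelled() else "finished"

result = asyncio.run(demo())
```

Without the explicit `task.cancel()`, the producer would run all 1,000 iterations to completion even though its only consumer stopped after the first chunk.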

    If the stream has already produced a message successfully (for example, after await stream.output), the producer may still be running post-stream work such as on_llm_end callbacks. In that case aclose() awaits the task rather than cancelling it — turning a successful run into a cancelled one would drop the end callback and corrupt tracing.

    Idempotent: safe to call multiple times, including after the stream has finished normally. Also invoked by the async context manager protocol on __aexit__.
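Put together, the contract reads roughly like the sketch below. This is an illustrative stand-in, not the real implementation; `StreamHandle` and `produce` are hypothetical names:

```python
import asyncio

class StreamHandle:
    """Illustrative stand-in (hypothetical names) for the aclose()
    contract: cancel a still-running producer, await one that has
    already succeeded, and stay idempotent either way."""

    def __init__(self, produce) -> None:
        self._succeeded = False
        self._task = asyncio.create_task(self._run(produce))

    async def _run(self, produce) -> None:
        await produce()          # stands in for the provider call
        self._succeeded = True
        await asyncio.sleep(0)   # stands in for on_llm_end callbacks

    async def aclose(self) -> None:
        if self._succeeded:
            # Output already produced: let post-stream work finish
            # rather than turn a successful run into a cancelled one.
            await self._task
            return
        self._task.cancel()
        try:
            await self._task
        except asyncio.CancelledError:
            pass

    async def __aenter__(self) -> "StreamHandle":
        return self

    async def __aexit__(self, *exc) -> None:
        # The async context manager protocol routes through aclose().
        await self.aclose()


async def demo() -> bool:
    async def produce() -> None:
        await asyncio.sleep(0)

    async with StreamHandle(produce) as handle:
        await asyncio.sleep(0.01)  # let the producer succeed
        await handle.aclose()      # safe: __aexit__ calls it again

    return handle._succeeded

succeeded = asyncio.run(demo())
```

Note that awaiting an already-finished `asyncio.Task` returns immediately, which is what makes repeated `aclose()` calls safe in the sketch.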