    langchain-classic › callbacks › streamlit › StreamlitCallbackHandler
    Function · Since v1.0

    StreamlitCallbackHandler

    Callback Handler that writes to a Streamlit app.

    This CallbackHandler is geared towards use with a LangChain Agent; it displays the Agent's LLM and tool-usage "thoughts" inside a series of Streamlit expanders.

    Parameters

    • parent_container: The st.container that will contain all the Streamlit elements that the Handler creates.
    • max_thought_containers: The max number of completed LLM thought containers to show at once. When this threshold is reached, a new thought will cause the oldest thoughts to be collapsed into a "History" expander.
    • expand_new_thoughts: Each LLM "thought" gets its own st.expander. This param controls whether that expander is expanded by default.
    • collapse_completed_thoughts: If True, LLM thought expanders will be collapsed when completed.
    • thought_labeler: An optional custom LLMThoughtLabeler instance. If unspecified, the handler will use the default thought labeling logic.

    Returns

    A new StreamlitCallbackHandler instance.

    Note that this is an "auto-updating" API: if the installed version of Streamlit has a more recent StreamlitCallbackHandler implementation, an instance of that class will be used.

    StreamlitCallbackHandler(
      parent_container: DeltaGenerator,
      *,
      max_thought_containers: int = 4,
      expand_new_thoughts: bool = True,
      collapse_completed_thoughts: bool = True,
      thought_labeler: LLMThoughtLabeler | None = None
    ) -> BaseCallbackHandler
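
    A minimal usage sketch, assuming a Streamlit app and an already-constructed LangChain agent (the name agent_executor is a placeholder, not part of this API); the import path follows the module location above and may differ between LangChain versions:

    import streamlit as st

    # Import path assumed from the module location above; older releases expose this
    # handler from langchain.callbacks or langchain_community.callbacks instead.
    from langchain_classic.callbacks.streamlit import StreamlitCallbackHandler

    st_callback = StreamlitCallbackHandler(
        st.container(),                    # parent_container: where the thought expanders render
        max_thought_containers=4,          # oldest completed thoughts collapse into "History" past this
        expand_new_thoughts=True,
        collapse_completed_thoughts=True,
    )

    if prompt := st.chat_input("Ask the agent something"):
        # agent_executor is a placeholder for any LangChain agent or runnable; passing the
        # handler via the callbacks config streams its intermediate "thoughts" and tool
        # calls into the Streamlit app as the run progresses.
        response = agent_executor.invoke({"input": prompt}, {"callbacks": [st_callback]})
        st.write(response["output"])

    Supplying the handler in the per-call callbacks config (rather than when the agent is constructed) scopes it to that single invocation, so each prompt renders its own set of thought expanders.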

    Used in Docs

    • Gpt4all integrations
    • Streamlit integration
    • Streamlit integrations