LangChain Reference
Python › langchain-classic › schema
Module · Since v1.0

    schema

    Schemas are the LangChain Base Classes and Interfaces.

    Attributes

  • RUN_KEY: str
  • Memory: BaseMemory

    Classes

class BaseMemory (deprecated)

    Abstract base class for memory in Chains.

Memory refers to state in Chains. Memory can be used to store information about past executions of a Chain and inject that information into the inputs of future executions of the Chain. For example, in conversational Chains, Memory can store past turns of the conversation and automatically add them to future model prompts, so that the model has the context needed to respond coherently to the latest input.
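The interface can be sketched with a simplified local stand-in for the abstract base (the real class lives in the library; the method names below mirror its documented API, but this is an illustration, not the shipped implementation):

```python
from abc import ABC, abstractmethod
from typing import Any

# Simplified stand-in for the BaseMemory interface. The method names
# (memory_variables, load_memory_variables, save_context, clear) mirror
# the real abstract base; everything else here is illustrative.
class BaseMemory(ABC):
    @property
    @abstractmethod
    def memory_variables(self) -> list[str]:
        """Keys this memory will inject into chain inputs."""

    @abstractmethod
    def load_memory_variables(self, inputs: dict[str, Any]) -> dict[str, Any]:
        """Return the stored state for the given chain inputs."""

    @abstractmethod
    def save_context(self, inputs: dict[str, Any], outputs: dict[str, str]) -> None:
        """Record one execution's inputs and outputs."""

    @abstractmethod
    def clear(self) -> None:
        """Forget all stored state."""

class ConversationLogMemory(BaseMemory):
    """Toy memory that replays the whole conversation as one string."""

    def __init__(self) -> None:
        self.turns: list[str] = []

    @property
    def memory_variables(self) -> list[str]:
        return ["history"]

    def load_memory_variables(self, inputs: dict[str, Any]) -> dict[str, Any]:
        return {"history": "\n".join(self.turns)}

    def save_context(self, inputs: dict[str, Any], outputs: dict[str, str]) -> None:
        self.turns.append(f"Human: {inputs['input']}")
        self.turns.append(f"AI: {outputs['output']}")

    def clear(self) -> None:
        self.turns.clear()

memory = ConversationLogMemory()
memory.save_context({"input": "Hi"}, {"output": "Hello!"})
print(memory.load_memory_variables({})["history"])
```

After one saved exchange, `load_memory_variables` returns the transcript under the `history` key, which a conversational Chain would splice into the next prompt.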

    Modules

  • embeddings
  • document
  • storage
  • agent
  • vectorstore
  • memory
  • chat
  • retriever
  • output
  • output_parser
  • prompt
  • messages
  • exceptions
  • cache
  • chat_history
  • language_model
  • prompt_template
  • runnable

    LangChain Runnable and the LangChain Expression Language (LCEL).

    The LangChain Expression Language (LCEL) offers a declarative method to build production-grade programs that harness the power of LLMs.

    Programs created using LCEL and LangChain Runnables inherently support synchronous, asynchronous, batch, and streaming operations.

    Support for async allows servers hosting LCEL-based programs to scale better under higher concurrent loads.

    Streaming of intermediate outputs as they are generated allows for a more responsive user experience.

    This module contains the schema and implementation of the LangChain Runnable primitives.
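The four calling conventions described above can be sketched with a simplified local stand-in (the real `Runnable` interface lives in the library; the method names `invoke`, `ainvoke`, `batch`, and `stream` and the `|` pipe composition mirror it, but this class is an illustration, not the shipped implementation):

```python
import asyncio
from typing import Any, Callable, Iterator

# Simplified stand-in illustrating the calling conventions every LangChain
# Runnable supports: invoke (sync), ainvoke (async), batch, and stream.
class SimpleRunnable:
    def __init__(self, func: Callable[[Any], Any]) -> None:
        self.func = func

    def invoke(self, value: Any) -> Any:
        return self.func(value)

    async def ainvoke(self, value: Any) -> Any:
        # Run in a worker thread so a blocking func doesn't stall the event loop.
        return await asyncio.to_thread(self.func, value)

    def batch(self, values: list[Any]) -> list[Any]:
        return [self.invoke(v) for v in values]

    def stream(self, value: Any) -> Iterator[Any]:
        # Yield the single final result; real Runnables can yield partial chunks.
        yield self.invoke(value)

    def __or__(self, other: "SimpleRunnable") -> "SimpleRunnable":
        # `|` composition, mirroring LCEL's declarative pipe syntax.
        return SimpleRunnable(lambda v: other.invoke(self.invoke(v)))

double = SimpleRunnable(lambda x: x * 2)
inc = SimpleRunnable(lambda x: x + 1)
chain = double | inc

print(chain.invoke(3))                  # 7
print(chain.batch([1, 2, 3]))           # [3, 5, 7]
print(asyncio.run(chain.ainvoke(10)))   # 21
```

Because composition produces another object with the same interface, a pipeline built once with `|` automatically supports all four calling conventions, which is the property the LCEL description above is pointing at.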

  • callbacks