    langchain_core.runnables.base.RunnableSequence
    Class · Since v0.1

    RunnableSequence

    RunnableSequence(
        self,
        *steps: RunnableLike = (),
        name: str | None = None,
        first: Runnable[Any, Any] | None = None,
        middle: list[Runnable[Any, Any]] | None = None,
        last: Runnable[Any, Any] | None = None,
    )

    Bases

    RunnableSerializable[Input, Output]

    Used in Docs

    • ChatSeekrFlow integration

    Constructors

    __init__

    name: str | None
    first: Runnable[Any, Any] | None
    middle: list[Runnable[Any, Any]] | None
    last: Runnable[Any, Any] | None

    Attributes

    first: Runnable[Input, Any]
        The first Runnable in the sequence.
    middle: list[Runnable[Any, Any]]
        The middle Runnable objects in the sequence.
    last: Runnable[Any, Output]
        The last Runnable in the sequence.
    steps: list[Runnable[Any, Any]]
        All the Runnables that make up the sequence in order.
    model_config
    InputType: type[Input]
        The type of the input to the Runnable.
    OutputType: type[Output]
        The type of the output of the Runnable.
    config_specs: list[ConfigurableFieldSpec]
        Get the config specs of the Runnable.

    Methods

    get_lc_namespace
        Get the namespace of the LangChain object.
    is_lc_serializable
        Return True as this class is serializable.
    get_input_schema
        Get the input schema of the Runnable.
    get_output_schema
        Get the output schema of the Runnable.
    get_graph
        Get the graph representation of the Runnable.
    invoke
    ainvoke
    batch
    abatch
    transform
    stream
    atransform
    astream

    Inherited from RunnableSerializable

    Attributes

    name: str
        The name of the function.

    Methods

    to_json
        Convert the graph to a JSON-serializable format.
    configurable_fields
    configurable_alternatives
        Configure alternatives for Runnable objects that can be set at runtime.

    Inherited from Serializable

    Attributes

    lc_secrets: dict[str, str]
        A map of constructor argument names to secret ids.
    lc_attributes: dict
        List of attribute names that should be included in the serialized kwargs.

    Methods

    lc_id
        Return a unique identifier for this class for serialization purposes.

    Inherited from Runnable

    Attributes

    name: str
        The name of the function.
    input_schema: type[BaseModel]
        The type of input this Runnable accepts specified as a Pydantic model.
    output_schema: type[BaseModel]
        Output schema.

    Sequence of Runnable objects, where the output of one is the input of the next.

    RunnableSequence is the most important composition operator in LangChain as it is used in virtually every chain.

    A RunnableSequence can be instantiated directly or, more commonly, by using the | operator, where either the left or right operand (or both) must be a Runnable.

    Any RunnableSequence automatically supports sync, async, and batch execution.

    The default implementations of batch and abatch use thread pools and asyncio.gather, respectively, and will be faster than naively calling invoke or ainvoke on each input for IO-bound Runnables.

    Batching is implemented by invoking the batch method on each component of the RunnableSequence in order.
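
    As a rough sketch (not part of the original docs), here is what batching an IO-bound sequence can look like; slow_double is a made-up stand-in for a network call, and max_concurrency is a standard RunnableConfig key that caps how many inputs are processed at once:

    import time

    from langchain_core.runnables import RunnableLambda

    def slow_double(x: int) -> int:
        time.sleep(0.1)  # stand-in for an IO-bound call such as an API request
        return x * 2

    sequence = RunnableLambda(lambda x: x + 1) | RunnableLambda(slow_double)

    # batch() fans invoke() out over a thread pool, so the sleeps overlap;
    # max_concurrency limits how many inputs are in flight at the same time.
    print(sequence.batch([1, 2, 3, 4], config={"max_concurrency": 2}))  # [4, 6, 8, 10]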

    A RunnableSequence preserves the streaming properties of its components, so if all components of the sequence implement a transform method -- which is the method that implements the logic to map a streaming input to a streaming output -- then the sequence will be able to stream input to output!

    If any component of the sequence does not implement transform then the streaming will only begin after this component is run. If there are multiple blocking components, streaming begins after the last one.

    Note

    RunnableLambda objects do not support transform by default. So if you need to use a RunnableLambda, be careful about where you place it in a RunnableSequence (if you need to use the stream/astream methods).

    If you need arbitrary logic and need streaming, you can subclass Runnable, and implement transform for whatever logic you need.
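
    As a rough sketch of that approach (the class name and logic here are invented for illustration), a custom Runnable only needs invoke for the non-streaming path and can override transform to map an input stream to an output stream chunk by chunk:

    from collections.abc import Iterator
    from typing import Any

    from langchain_core.runnables import Runnable, RunnableConfig

    class Uppercase(Runnable[str, str]):
        """Hypothetical streaming-friendly step that upper-cases text."""

        def invoke(self, input: str, config: RunnableConfig | None = None, **kwargs: Any) -> str:
            # Non-streaming path: process the whole input at once.
            return input.upper()

        def transform(
            self,
            input: Iterator[str],
            config: RunnableConfig | None = None,
            **kwargs: Any,
        ) -> Iterator[str]:
            # Streaming path: emit each chunk as soon as it arrives, so the rest
            # of the sequence does not have to wait for the full input.
            for chunk in input:
                yield chunk.upper()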

    Here is a simple example that uses plain Python functions to illustrate the use of RunnableSequence:

    from langchain_core.runnables import RunnableLambda
    
    def add_one(x: int) -> int:
        return x + 1
    
    def mul_two(x: int) -> int:
        return x * 2
    
    runnable_1 = RunnableLambda(add_one)
    runnable_2 = RunnableLambda(mul_two)
    sequence = runnable_1 | runnable_2
    # Or equivalently:
    # sequence = RunnableSequence(first=runnable_1, last=runnable_2)
    sequence.invoke(1)
    await sequence.ainvoke(1)
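    # Both calls return 4.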
    
    sequence.batch([1, 2, 3])
    await sequence.abatch([1, 2, 3])
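    # Both calls return [4, 6, 8].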

    Here's an example that streams JSON output generated by an LLM:

    from langchain_core.output_parsers.json import SimpleJsonOutputParser
    from langchain_core.prompts import PromptTemplate
    from langchain_openai import ChatOpenAI
    
    prompt = PromptTemplate.from_template(
        "In JSON format, give me a list of {topic} and their "
        "corresponding names in French, Spanish and in a "
        "Cat Language."
    )
    
    model = ChatOpenAI()
    chain = prompt | model | SimpleJsonOutputParser()
    
    async for chunk in chain.astream({"topic": "colors"}):
        print("-")  # noqa: T201
        print(chunk, sep="", flush=True)  # noqa: T201

    Parameters

    steps: RunnableLike
        Default: (). The steps to include in the sequence.
    name: str | None
        Default: None. The name of the Runnable.
    first: Runnable[Any, Any] | None
        Default: None. The first Runnable in the sequence.
    middle: list[Runnable[Any, Any]] | None
        Default: None. The middle Runnable objects in the sequence.
    last: Runnable[Any, Any] | None
        Default: None. The last Runnable in the sequence.

    Methods inherited from Runnable

    to_json
        Convert the graph to a JSON-serializable format.
    to_json_not_implemented
        Serialize a "not implemented" object.
    get_name
    get_input_jsonschema
        Get a JSON schema that represents the input to the Runnable.
    get_output_jsonschema
        Get a JSON schema that represents the output of the Runnable.
    config_schema
        The type of config this Runnable accepts specified as a Pydantic model.
    get_config_jsonschema
        Get a JSON schema that represents the config of the Runnable.
    get_prompts
        Return a list of prompts used by this Runnable.
    pipe
        Pipe Runnable objects.
    pick
        Pick keys from the output dict of this Runnable.
    assign
        Merge the Dict input with the output produced by the mapping argument.
    batch_as_completed
        Run invoke in parallel on a list of inputs.
    abatch_as_completed
        Run ainvoke in parallel on a list of inputs.
    astream_log
        Stream all output from a Runnable, as reported to the callback system.
    astream_events
        Generate a stream of events.
    bind
        Bind arguments to a Runnable, returning a new Runnable.
    with_config
    with_listeners
        Bind lifecycle listeners to a Runnable, returning a new Runnable.
    with_alisteners
        Bind async lifecycle listeners to a Runnable.
    with_types
        Bind input and output types to a Runnable, returning a new Runnable.
    with_retry
        Create a new Runnable that retries the original Runnable on exceptions.
    map
        Map a function to multiple iterables.
    with_fallbacks
        Add fallbacks to a Runnable, returning a new Runnable.
    as_tool
        Create a BaseTool from a Runnable.