Python › langgraph › runtime › Runtime
Class · Since v0.6

    Runtime

Runtime(
    self,
    *,
    context: ContextT = None,
    store: BaseStore | None = None,
    stream_writer: StreamWriter = _no_op_stream_writer,
    previous: Any = None,
    execution_info: ExecutionInfo | None = None,
    server_info: ServerInfo | None = None
)

    Bases

    Generic[ContextT]

    Used in Docs

    • A2A endpoint in Agent Server
    • Custom middleware
    • Going to production
    • Graph API overview
    • Guardrails

    Constructors

    constructor
    __init__
Name            Type
context         ContextT
store           BaseStore | None
stream_writer   StreamWriter
previous        Any
execution_info  ExecutionInfo | None
server_info     ServerInfo | None

    Attributes

    attribute
    context: ContextT
    attribute
    store: BaseStore | None
    attribute
    stream_writer: StreamWriter
    attribute
    previous: Any
    attribute
    execution_info: ExecutionInfo | None
    attribute
    server_info: ServerInfo | None

    Methods

    method
    merge
    method
    override
    method
    patch_execution_info

    Convenience class that bundles run-scoped context and other runtime utilities.

This class is injected into graph nodes and middleware. It provides access to context, store, stream_writer, previous, execution_info, and server_info.

    Accessing config

    Runtime does not include config. To access RunnableConfig, you can inject it directly by adding a config: RunnableConfig parameter to your node function (recommended), or use get_config() from langgraph.config.

    Note

    ToolRuntime (from langgraph.prebuilt) is a subclass that provides similar functionality but is designed specifically for tools. It shares context, store, and stream_writer with Runtime, and adds tool-specific attributes like config, state, and tool_call_id.

    Example:

    from typing import TypedDict
    from langgraph.graph import StateGraph
    from dataclasses import dataclass
    from langgraph.runtime import Runtime
    from langgraph.store.memory import InMemoryStore
    
    @dataclass
    class Context:  # (1)!
        user_id: str
    
    class State(TypedDict, total=False):
        response: str
    
    store = InMemoryStore()  # (2)!
    store.put(("users",), "user_123", {"name": "Alice"})
    
    def personalized_greeting(state: State, runtime: Runtime[Context]) -> State:
        '''Generate personalized greeting using runtime context and store.'''
        user_id = runtime.context.user_id  # (3)!
        name = "unknown_user"
        if runtime.store:
            if memory := runtime.store.get(("users",), user_id):
                name = memory.value["name"]
    
        response = f"Hello {name}! Nice to see you again."
        return {"response": response}
    
    graph = (
        StateGraph(state_schema=State, context_schema=Context)
        .add_node("personalized_greeting", personalized_greeting)
        .set_entry_point("personalized_greeting")
        .set_finish_point("personalized_greeting")
        .compile(store=store)
    )
    
    result = graph.invoke({}, context=Context(user_id="user_123"))
    print(result)
    # > {'response': 'Hello Alice! Nice to see you again.'}
    1. Define a schema for the runtime context.
    2. Create a store to persist memories and other information.
    3. Use the runtime context to access the user_id.
Attribute and method details

context: ContextT

Static context for the graph run, like user_id, db_conn, etc. Can also be thought of as 'run dependencies'.

store: BaseStore | None

Store for the graph run, enabling persistence and memory.

stream_writer: StreamWriter

Function that writes to the custom stream.

previous: Any

The previous return value for the given thread. Only available with the functional API when a checkpointer is provided.

execution_info: ExecutionInfo | None

Read-only execution information/metadata for the current node run. None before task preparation populates it.

server_info: ServerInfo | None

Metadata injected by LangGraph Server. None when running open-source LangGraph without LangSmith deployments.

merge

Merge two runtimes together. If a value is not provided in the other runtime, the value from the current runtime is used.

override

Replace the runtime with a new runtime with the given overrides.

patch_execution_info

Return a new runtime with selected execution_info fields replaced.