    langchain_core.caches.InMemoryCache
    Class · Since v0.1

    InMemoryCache

    Cache that stores things in memory.

    InMemoryCache(
        self,
        *,
        maxsize: int | None = None,
    )

    Bases

    BaseCache

    Example:

    from langchain_core.caches import InMemoryCache
    from langchain_core.outputs import Generation
    
    # Initialize cache
    cache = InMemoryCache()
    
    # Update cache
    cache.update(
        prompt="What is the capital of France?",
        llm_string="model='gpt-3.5-turbo', temperature=0.1",
        return_val=[Generation(text="Paris")],
    )
    
    # Lookup cache
    result = cache.lookup(
        prompt="What is the capital of France?",
        llm_string="model='gpt-3.5-turbo', temperature=0.1",
    )
    # result is [Generation(text="Paris")]

    Used in Docs

    • Graph API overview
    • Use the functional API
    • Use the graph API
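
    Beyond calling lookup and update directly, the cache can be registered globally so that repeated model calls with the same prompt and model settings are served from memory. A minimal sketch, assuming langchain_openai's ChatOpenAI as the chat model (the model choice is illustrative; any chat model works the same way):

    from langchain_core.caches import InMemoryCache
    from langchain_core.globals import set_llm_cache
    from langchain_openai import ChatOpenAI  # assumed integration package

    # Install the in-memory cache process-wide
    set_llm_cache(InMemoryCache())

    model = ChatOpenAI(model="gpt-4o-mini")
    model.invoke("What is the capital of France?")  # first call reaches the provider
    model.invoke("What is the capital of France?")  # identical call is answered from the cache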

    Parameters

    maxsize : int | None, default None

    The maximum number of items to store in the cache.

    If None, the cache has no maximum size.

    If the cache exceeds the maximum size, the oldest items are removed.
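
    A short sketch of the maxsize behaviour described above. This is illustrative only; it assumes oldest-first eviction as stated, with hypothetical prompt and llm_string values:

    from langchain_core.caches import InMemoryCache
    from langchain_core.outputs import Generation

    cache = InMemoryCache(maxsize=2)

    # Fill the cache to its limit, then add one more entry
    cache.update("prompt-1", "llm-config", [Generation(text="a")])
    cache.update("prompt-2", "llm-config", [Generation(text="b")])
    cache.update("prompt-3", "llm-config", [Generation(text="c")])

    # The oldest entry is expected to have been evicted
    print(cache.lookup("prompt-1", "llm-config"))  # expected: None
    print(cache.lookup("prompt-3", "llm-config"))  # expected: [Generation(text="c")]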

    Constructors

    constructor
    __init__
    maxsize : int | None

    Methods

    method
    lookup

    Look up based on prompt and llm_string.

    method
    update

    Update cache based on prompt and llm_string.

    method
    clear

    Clear cache.

    method
    alookup

    Async look up based on prompt and llm_string.

    method
    aupdate

    Async update cache based on prompt and llm_string.

    method
    aclear

    Async clear cache.
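
    The async methods mirror their sync counterparts and can be awaited from an event loop. A minimal sketch reusing the example above:

    import asyncio

    from langchain_core.caches import InMemoryCache
    from langchain_core.outputs import Generation

    async def main() -> None:
        cache = InMemoryCache()

        # Async update and lookup take the same arguments as update/lookup
        await cache.aupdate(
            prompt="What is the capital of France?",
            llm_string="model='gpt-3.5-turbo', temperature=0.1",
            return_val=[Generation(text="Paris")],
        )
        result = await cache.alookup(
            prompt="What is the capital of France?",
            llm_string="model='gpt-3.5-turbo', temperature=0.1",
        )
        print(result)  # [Generation(text='Paris')]

        # Remove all cached entries
        await cache.aclear()

    asyncio.run(main())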
