Interface • Since v1.1

ContextEdit

Protocol describing a context editing strategy.

Implement this interface to create custom strategies for managing conversation context size. The apply method should modify the messages array in place; it resolves without a return value.

interface ContextEdit

Example

import { HumanMessage, type ContextEdit } from "langchain";

class RemoveOldHumanMessages implements ContextEdit {
  constructor(private keepRecent: number = 10) {}

  async apply({ messages, countTokens }) {
    // Check current token count
    const tokens = await countTokens(messages);

    // Remove old human messages if over limit, keeping the most recent ones
    if (tokens > 50000) {
      const humanMessages: number[] = [];

      // Find all human message indices
      for (let i = 0; i < messages.length; i++) {
        if (HumanMessage.isInstance(messages[i])) {
          humanMessages.push(i);
        }
      }

      // Remove old human messages (keep only the most recent N)
      const toRemove = humanMessages.slice(0, -this.keepRecent);
      for (let i = toRemove.length - 1; i >= 0; i--) {
        messages.splice(toRemove[i]!, 1);
      }
    }
  }
}
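
The strategy can also be exercised directly, which makes the in-place contract easy to see. The sketch below assumes the options object needs only messages and countTokens (as in the example above) and uses a made-up character-based counter in place of a real tokenizer; in practice you would plug in the model's own token counting.

import { AIMessage, HumanMessage, type BaseMessage } from "langchain";

// Hypothetical, rough token counter (~4 characters per token); a real
// application would use the model's tokenizer instead.
const approxCountTokens = async (msgs: BaseMessage[]): Promise<number> =>
  Math.ceil(msgs.reduce((sum, m) => sum + String(m.content).length, 0) / 4);

// A toy conversation whose first turn is padded so the 50,000-token
// threshold in the strategy above is exceeded.
const messages: BaseMessage[] = [
  new HumanMessage("old question ".repeat(20_000)),
  new AIMessage("old answer"),
  new HumanMessage("newer question"),
  new AIMessage("newer answer"),
  new HumanMessage("latest question"),
];

const edit = new RemoveOldHumanMessages(2);
await edit.apply({ messages, countTokens: approxCountTokens });

// `messages` has been shortened in place (the oldest human message is gone);
// apply itself resolves with no return value.
console.log(messages.length); // 4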

Methods

method
apply → Promise<void>

Apply an edit to the message list, mutating it in place.

This method should:

  1. Check whether editing is needed, based on the current token count
  2. Modify the messages array in place (if needed)
  3. Resolve once editing is complete; since the return type is Promise<void>, callers read the edited array directly
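
For instance, a minimal implementation of this contract might look as follows; the class name, the 100,000-token budget, and the keep-the-last-20-messages rule are illustrative choices, not part of the library.

import type { ContextEdit } from "langchain";

class TrimOldestMessages implements ContextEdit {
  async apply({ messages, countTokens }) {
    // 1. Decide whether an edit is needed at all.
    if ((await countTokens(messages)) <= 100_000) return;

    // 2. Mutate `messages` in place: keep the first message (often the
    //    system prompt) plus the 20 most recent ones, drop the rest.
    if (messages.length > 21) {
      messages.splice(1, messages.length - 21);
    }

    // 3. Resolve with no value; callers read the edited array directly.
  }
}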