tokenize

langchain_core.utils.mustache.tokenize
Function · Since v0.1

Tokenize a mustache template.

Tokenizes a mustache template in a generator fashion. The template may be supplied as a file-like object or as a string.

tokenize(
    template: str,
    def_ldel: str = '{{',
    def_rdel: str = '}}'
) -> Iterator[tuple[str, str]]

Parameters

template (str, required)
    A file-like object, or a string of a mustache template.

def_ldel (str, default: '{{')
    The default left delimiter ('{{' by default, as in spec-compliant mustache).

def_rdel (str, default: '}}')
    The default right delimiter ('}}' by default, as in spec-compliant mustache).

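Example

A minimal usage sketch, assuming the function is importable as langchain_core.utils.mustache.tokenize (per the module path above). The template strings and custom delimiters are illustrative, and the loop variable names tag_type and tag_key are just labels for the two strings of each yielded tuple:

from langchain_core.utils.mustache import tokenize

# tokenize() returns a generator; each item is a tuple of two strings,
# matching the Iterator[tuple[str, str]] return type.
for tag_type, tag_key in tokenize("Hello, {{name}}!"):
    print(tag_type, repr(tag_key))

# Alternative default delimiters can be passed via def_ldel / def_rdel,
# e.g. for a template written with '<%' and '%>' instead of '{{' and '}}'.
tokens = list(tokenize("Hello, <%name%>!", def_ldel="<%", def_rdel="%>"))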