langchain_classic.evaluation.embedding_distance.base.PairwiseEmbeddingDistanceEvalChain
Class · Since v1.0

    PairwiseEmbeddingDistanceEvalChain

    Use embedding distances to score semantic difference between two predictions.

Examples:

    chain = PairwiseEmbeddingDistanceEvalChain()
    result = chain.evaluate_string_pairs(prediction="Hello", prediction_b="Hi")
    print(result)
    {'score': 0.5}
    PairwiseEmbeddingDistanceEvalChain()
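The score is the embedding distance between the two predictions (cosine distance by default, where 0.0 means the embeddings point in the same direction). As an illustration only, here is a minimal pure-Python sketch of cosine distance over toy vectors standing in for real embeddings; `cosine_distance` and the vectors are hypothetical, not part of the library:

```python
import math

def cosine_distance(u, v):
    # Cosine distance = 1 - cosine similarity; 0.0 means identical direction.
    dot = sum(x * y for x, y in zip(u, v))
    norm_u = math.sqrt(sum(x * x for x in u))
    norm_v = math.sqrt(sum(y * y for y in v))
    return 1.0 - dot / (norm_u * norm_v)

# Toy 2-d vectors standing in for embeddings of two predictions.
a = [1.0, 0.0]
b = [1.0, 1.0]
score = cosine_distance(a, b)  # about 0.293
```

Lower scores mean the two predictions are closer in embedding space, so a small score indicates the predictions are semantically similar.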

    Bases

_EmbeddingDistanceChainMixin, PairwiseStringEvaluator

    Attributes

    attribute
    input_keys: list[str]

    Return the input keys of the chain.

    attribute
    evaluation_name: str

    Return the evaluation name.

Inherited from PairwiseStringEvaluator

    Methods

method
evaluate_string_pairs

    Evaluate the output string pairs.

async method
aevaluate_string_pairs

    Asynchronously evaluate the output string pairs.
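Both methods return the same payload; the async variant simply lets the evaluation run inside an event loop. The toy evaluator below sketches that paired sync/async interface under stated assumptions: every name in it is hypothetical, and a fixed dict stands in for a real embedding model.

```python
import asyncio
import math

# Hypothetical stand-in for an embedding model: fixed toy vectors per string.
TOY_EMBEDDINGS = {"Hello": [1.0, 0.2], "Hi": [0.9, 0.3]}

class ToyPairwiseEvaluator:
    """Sketch of the sync/async method pair; not the library's implementation."""

    def _distance(self, a, b):
        u, v = TOY_EMBEDDINGS[a], TOY_EMBEDDINGS[b]
        dot = sum(x * y for x, y in zip(u, v))
        norm = math.sqrt(sum(x * x for x in u)) * math.sqrt(sum(y * y for y in v))
        return 1.0 - dot / norm

    def evaluate_string_pairs(self, *, prediction, prediction_b):
        return {"score": self._distance(prediction, prediction_b)}

    async def aevaluate_string_pairs(self, *, prediction, prediction_b):
        # A real implementation would await an async embedding call here.
        return self.evaluate_string_pairs(prediction=prediction, prediction_b=prediction_b)

evaluator = ToyPairwiseEvaluator()
sync_result = evaluator.evaluate_string_pairs(prediction="Hello", prediction_b="Hi")
async_result = asyncio.run(evaluator.aevaluate_string_pairs(prediction="Hello", prediction_b="Hi"))
```

The async form is useful when scoring many prediction pairs concurrently, since embedding calls are typically I/O-bound.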
