langchain_classic.indexes.vectorstore.VectorStoreIndexWrapper.query_with_sources
Method • Since v1.0

query_with_sources

Query the vector store and return a dict containing the answer along with its sources.

query_with_sources(
    self,
    question: str,
    llm: BaseLanguageModel | None = None,
    retriever_kwargs: dict[str, Any] | None = None,
    **kwargs: Any,
) -> dict

Parameters

question (str, required)
    The question or prompt to query.

llm (BaseLanguageModel | None, default: None)
    The language model used to answer the question. Although the default is None, an LLM must be supplied when calling the method.

retriever_kwargs (dict[str, Any] | None, default: None)
    Optional keyword arguments passed to the retriever.

**kwargs (Any)
    Additional keyword arguments forwarded to the underlying chain.
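
A minimal usage sketch, not taken from this page: it assumes VectorstoreIndexCreator is importable from the same langchain_classic.indexes.vectorstore module, that langchain-openai is installed with credentials configured, and that the answer and its sources come back under the "answer" and "sources" keys of the returned dict. The documents, model name, in-memory backend, and retriever settings are illustrative.

from langchain_classic.indexes.vectorstore import VectorstoreIndexCreator
from langchain_core.documents import Document
from langchain_core.vectorstores import InMemoryVectorStore
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

# Index a toy document; VectorstoreIndexCreator wraps the resulting vector
# store in a VectorStoreIndexWrapper. (The creator fields and in-memory
# backend shown here are assumptions about the classic indexes API.)
docs = [
    Document(
        page_content="The classic index helpers wrap a vector store behind a simple query API.",
        metadata={"source": "notes.md"},
    ),
]
index = VectorstoreIndexCreator(
    vectorstore_cls=InMemoryVectorStore,
    embedding=OpenAIEmbeddings(),
).from_documents(docs)

# An LLM must be passed explicitly, even though the parameter defaults to None.
result = index.query_with_sources(
    "What do the classic index helpers wrap?",
    llm=ChatOpenAI(model="gpt-4o-mini"),  # illustrative model choice
    retriever_kwargs={"search_kwargs": {"k": 2}},  # keyword arguments for the retriever
)
print(result["answer"])
print(result["sources"])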
