LangChain Reference

    langchain-classic › chains › openai_functions › qa_with_structure
    Function · Since v1.0 · Deprecated

    create_qa_with_structure_chain

    Create a question-answering chain that returns an answer with sources, structured according to the given schema.

    create_qa_with_structure_chain(
      llm: BaseLanguageModel,
      schema: dict | type[BaseModel],
      output_parser: str = 'base',
      prompt: PromptTemplate | ChatPromptTemplate | None = None,
      verbose: bool = False
    ) -> LLMChain
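
    Per the signature, schema can be either a Pydantic model class or a plain dict. A minimal sketch of the Pydantic form, with an illustrative class name and fields that are not taken from this page:

    from pydantic import BaseModel, Field

    class AnswerWithSources(BaseModel):
        """An answer to the question, plus the sources it is based on."""

        answer: str = Field(..., description="The answer to the user's question")
        sources: list[str] = Field(
            ..., description="Citations from the supplied context that support the answer"
        )

    Since the module sits under openai_functions, the schema is presumably turned into a function definition for the model, so the docstring and field descriptions effectively act as instructions.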

    Parameters

    llm (BaseLanguageModel, required)
        Language model to use for the chain.

    schema (dict | type[BaseModel], required)
        Schema for the structured output, given either as a Pydantic model class or as a dict.

    output_parser (str, default: 'base')
        Output parser to use. Should be one of 'pydantic' or 'base'.

    prompt (PromptTemplate | ChatPromptTemplate | None, default: None)
        Optional prompt to use for the chain.

    verbose (bool, default: False)
        Whether to run the chain in verbose mode.
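
    A hedged end-to-end sketch, assuming a chat model that supports OpenAI function calling and a default prompt that takes "question" and "context" inputs; the input keys and the langchain_classic import path are inferred from the breadcrumb above, not stated on this page:

    from pydantic import BaseModel, Field
    from langchain_openai import ChatOpenAI
    from langchain_classic.chains.openai_functions.qa_with_structure import (
        create_qa_with_structure_chain,
    )

    class AnswerWithSources(BaseModel):
        """An answer to the question, plus the sources it is based on."""

        answer: str = Field(..., description="The answer to the user's question")
        sources: list[str] = Field(..., description="Sources supporting the answer")

    llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

    # output_parser="pydantic" parses the model's function call into an
    # AnswerWithSources instance; the default "base" parser returns the raw
    # function-call output instead.
    chain = create_qa_with_structure_chain(llm, AnswerWithSources, output_parser="pydantic")

    result = chain.invoke(
        {
            "question": "Who wrote Pride and Prejudice?",
            "context": "Pride and Prejudice is an 1813 novel by Jane Austen.",
        }
    )
    # The returned LLMChain keeps its output under the "text" key by default.
    print(result["text"])  # e.g. AnswerWithSources(answer='Jane Austen', sources=[...])

    Because the function is marked deprecated above, newer code would more typically ask the chat model for structured output directly; this sketch only illustrates the documented signature.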
