Create a question answering chain that returns an answer with sources, based on a schema.
create_qa_with_structure_chain(
    llm: BaseLanguageModel,
    schema: dict | type[BaseModel],
    output_parser: str = 'base',
    prompt: PromptTemplate | ChatPromptTemplate | None = None,
    verbose: bool = False
) -> LLMChain

| Name | Type | Description |
|---|---|---|
| llm* | BaseLanguageModel | Language model to use for the chain. |
| schema* | dict \| type[BaseModel] | Pydantic schema to use for the output. |
| output_parser | str | Output parser to use. Should be one of `pydantic` or `base`. Default: `'base'`. |
| prompt | PromptTemplate \| ChatPromptTemplate \| None | Optional prompt to use for the chain. Default: `None`. |
| verbose | bool | Whether to run the chain in verbose mode. Default: `False`. |
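
The sketch below shows one way this chain might be wired up. The import paths (`langchain.chains.openai_functions`, `langchain_openai`), the `ChatOpenAI` model name, and the `AnswerWithSources` schema are assumptions for illustration, not part of this reference; adjust them to your installation.

```python
# A minimal usage sketch, assuming an OpenAI function-calling chat model.
# Import paths, model name, and the AnswerWithSources schema are assumptions.
from langchain.chains.openai_functions import create_qa_with_structure_chain
from langchain_openai import ChatOpenAI
from pydantic import BaseModel, Field


class AnswerWithSources(BaseModel):
    """Hypothetical output schema: an answer plus the sources backing it."""

    answer: str = Field(description="The answer to the question")
    sources: list[str] = Field(description="Source snippets that support the answer")


llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)  # model name is an assumption
chain = create_qa_with_structure_chain(llm, AnswerWithSources, output_parser="pydantic")

# With the default prompt, the chain expects `question` and `context` inputs;
# the returned LLMChain can be invoked like any other chain.
result = chain.invoke(
    {
        "question": "When was LangChain first released?",
        "context": "LangChain was released as an open-source project in October 2022.",
    }
)
print(result)
```

With `output_parser='pydantic'`, the parsed output should come back as an instance of the schema class (here the assumed `AnswerWithSources`); with the default `'base'` parser it stays closer to the raw structured response.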