Create a question answering chain that returns an answer with sources.
```python
create_qa_with_sources_chain(
    llm: BaseLanguageModel,
    verbose: bool = False,
    **kwargs: Any,
) -> LLMChain
```

| Name | Type | Description |
|---|---|---|
| llm* | BaseLanguageModel | Language model to use for the chain. |
| verbose | bool | Default: False. Whether to print the details of the chain. |
| **kwargs | Any | Keyword arguments to pass to LLMChain. |
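A minimal usage sketch. It assumes LangChain exports this function from `langchain.chains` and that a chat model from the `langchain_openai` package is available; the model name, question, and context strings are illustrative, and running it requires an OpenAI API key in the environment.

```python
from langchain.chains import create_qa_with_sources_chain
from langchain_openai import ChatOpenAI  # assumes langchain-openai is installed

# Build the chain; verbose=True prints each prompt and response as it runs.
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)  # model name is illustrative
chain = create_qa_with_sources_chain(llm, verbose=True)

# The chain takes a question plus the retrieved context to answer from.
result = chain.run(
    question="What color is the sky?",
    context="Source: weather.txt\nThe sky is blue on clear days.",
)
print(result)  # the answer together with the sources it was drawn from
```

Typically the context string is assembled from retrieved documents, with each document's source identifier included so the model can cite it in the returned sources list.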