# create_qa_with_structure_chain

> **Function** in `langchain_classic`

📖 [View in docs](https://reference.langchain.com/python/langchain-classic/chains/openai_functions/qa_with_structure/create_qa_with_structure_chain)

Create a question answering chain that returns an answer with sources,
structured according to the given schema.

## Signature

```python
create_qa_with_structure_chain(
    llm: BaseLanguageModel,
    schema: dict | type[BaseModel],
    output_parser: str = 'base',
    prompt: PromptTemplate | ChatPromptTemplate | None = None,
    verbose: bool = False,
) -> LLMChain
```

## Parameters

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `llm` | `BaseLanguageModel` | Yes | Language model to use for the chain. |
| `schema` | `dict \| type[BaseModel]` | Yes | Pydantic model class or JSON schema `dict` describing the structured output. |
| `output_parser` | `str` | No | Output parser to use. Should be one of `'pydantic'` or `'base'`. (default: `'base'`) |
| `prompt` | `PromptTemplate \| ChatPromptTemplate \| None` | No | Optional prompt to use for the chain. (default: `None`) |
| `verbose` | `bool` | No | Whether to run the chain in verbose mode. (default: `False`) |

## Returns

`LLMChain`

The question answering chain.
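
## Example

A minimal sketch of how the chain might be used. The `AnswerWithSources` model, the model name, and the prompt input keys below are illustrative assumptions, not part of this reference; the chain construction itself requires a live LLM, so it is shown in comments:

```python
from pydantic import BaseModel, Field

# Hypothetical schema: an answer plus the source passages supporting it.
class AnswerWithSources(BaseModel):
    answer: str = Field(description="The answer to the question")
    sources: list[str] = Field(description="Source passages supporting the answer")

# Sketch of chain construction (needs an API key to actually run):
# from langchain_openai import ChatOpenAI
# from langchain_classic.chains.openai_functions.qa_with_structure import (
#     create_qa_with_structure_chain,
# )
#
# chain = create_qa_with_structure_chain(
#     llm=ChatOpenAI(model="gpt-4o-mini", temperature=0),
#     schema=AnswerWithSources,   # or an equivalent JSON schema dict
#     output_parser="pydantic",   # parse the response into AnswerWithSources
# )
```

With `output_parser='pydantic'`, the chain's output is an instance of the schema class; with the default `'base'`, the raw function-call arguments are returned instead.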

## ⚠️ Deprecated

Deprecated since version 0.2.13.

---

[View source on GitHub](https://github.com/langchain-ai/langchain/blob/311675a517f51ec6c77454124293c58df517e952/libs/langchain/langchain_classic/chains/openai_functions/qa_with_structure.py#L30)