# format_to_openai_function_messages

> **Function** in `langchain_classic`

📖 [View in docs](https://reference.langchain.com/python/langchain-classic/agents/format_scratchpad/openai_functions/format_to_openai_function_messages)

Convert (AgentAction, tool output) tuples into FunctionMessages.

## Signature

```python
format_to_openai_function_messages(
    intermediate_steps: Sequence[tuple[AgentAction, str]],
) -> list[BaseMessage]
```

## Description

Converts each `(AgentAction, observation)` tuple into the messages expected by OpenAI function-calling models: the message(s) representing the function call the model made, followed by a `FunctionMessage` containing the tool's observation.

**Raises:**

- `ValueError`: If the observation cannot be converted to a string.

## Parameters

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `intermediate_steps` | `Sequence[tuple[AgentAction, str]]` | Yes | Steps the LLM has taken to date, along with observations |

## Returns

`list[BaseMessage]`

List of messages to send to the LLM for the next prediction.
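To illustrate the shape of this conversion, here is a minimal, self-contained sketch using plain dicts in the OpenAI wire format rather than LangChain message classes. `ToyAgentAction` and `toy_format_to_function_messages` are hypothetical stand-ins for illustration only, not the library's implementation:

```python
from dataclasses import dataclass
from typing import Any, Sequence

# Hypothetical stand-in for langchain's AgentAction, for illustration only.
@dataclass
class ToyAgentAction:
    tool: str        # name of the tool the agent called
    tool_input: Any  # arguments passed to the tool
    log: str         # raw text the model emitted for this step

def toy_format_to_function_messages(
    intermediate_steps: Sequence[tuple[ToyAgentAction, str]],
) -> list[dict]:
    """Sketch: turn (action, observation) pairs into function-call messages."""
    messages: list[dict] = []
    for action, observation in intermediate_steps:
        if not isinstance(observation, str):
            raise ValueError(f"Observation must be a string, got {type(observation)}")
        # The assistant message that requested the function call.
        messages.append(
            {
                "role": "assistant",
                "content": "",
                "function_call": {
                    "name": action.tool,
                    "arguments": str(action.tool_input),
                },
            }
        )
        # The function result, attributed to the tool by name.
        messages.append(
            {"role": "function", "name": action.tool, "content": observation}
        )
    return messages

steps = [
    (ToyAgentAction("search", {"q": "weather"}, "Calling search"), "Sunny, 22C"),
]
for msg in toy_format_to_function_messages(steps):
    print(msg)
```

Each intermediate step thus expands into two messages, which is why the returned list can be longer than `intermediate_steps`.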

---

[View source on GitHub](https://github.com/langchain-ai/langchain/blob/02991cb4cf2063d51a07268edafb05fe53de1826/libs/langchain/langchain_classic/agents/format_scratchpad/openai_functions.py#L68)