LangChain Reference
langchain_classic.agents.openai_functions_agent.base.create_openai_functions_agent
Function · Since v1.0

    create_openai_functions_agent

    create_openai_functions_agent(
      llm: BaseLanguageModel,
      tools: Sequence[BaseTool],
      prompt: ChatPromptTemplate
    )

Used in Docs

• AskNews integration
• Composio integration
• FMP Data integration
• Infobip integration
• MultiOn toolkit integration
Returns: Runnable

Create an agent that uses OpenAI function calling.

Parameters

• llm* (BaseLanguageModel): LLM to use as the agent. Should work with OpenAI function calling, so it should either be an OpenAI model that supports it, or a wrapper of a different model that adds equivalent support.
• tools* (Sequence[BaseTool]): Tools this agent has access to.
• prompt* (ChatPromptTemplate): The prompt to use. See the Prompt section below for more.

    Example:

    Creating an agent with no memory

    from langchain_openai import ChatOpenAI
    from langchain_classic.agents import (
        AgentExecutor,
        create_openai_functions_agent,
    )
    from langchain_classic import hub
    
    prompt = hub.pull("hwchase17/openai-functions-agent")
    model = ChatOpenAI()
    tools = ...
    
    agent = create_openai_functions_agent(model, tools, prompt)
    agent_executor = AgentExecutor(agent=agent, tools=tools)
    
    agent_executor.invoke({"input": "hi"})
    
    # Using with chat history
    from langchain_core.messages import AIMessage, HumanMessage
    
    agent_executor.invoke(
        {
            "input": "what's my name?",
            "chat_history": [
                HumanMessage(content="hi! my name is bob"),
                AIMessage(content="Hello Bob! How can I assist you today?"),
            ],
        }
    )

    Prompt:

    The agent prompt must have an agent_scratchpad key that is a MessagesPlaceholder. Intermediate agent actions and tool output messages will be passed in here.
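The scratchpad mechanism can be sketched in plain Python (hypothetical names, no LangChain imports; this is an illustration, not LangChain's actual implementation): each intermediate (action, observation) pair from earlier tool calls is rendered as an assistant function-call message followed by a function-result message, and that message list is what fills the agent_scratchpad slot.

```python
from dataclasses import dataclass


@dataclass
class AgentAction:
    """One intermediate step: the tool the model chose and its input."""
    tool: str
    tool_input: str


def format_scratchpad(steps: list[tuple[AgentAction, str]]) -> list[dict]:
    """Render (action, observation) pairs as chat messages, mimicking
    what the agent_scratchpad MessagesPlaceholder receives."""
    messages = []
    for action, observation in steps:
        # The model's own function call, echoed back as an AI message.
        messages.append({
            "role": "assistant",
            "content": "",
            "function_call": {"name": action.tool, "arguments": action.tool_input},
        })
        # The tool's result, fed back as a function-result message.
        messages.append({"role": "function", "name": action.tool, "content": observation})
    return messages


steps = [(AgentAction(tool="search", tool_input='{"query": "weather"}'), "Sunny")]
print(format_scratchpad(steps))
```

With one prior step, the scratchpad expands to two messages; on each loop iteration the agent re-renders the full prompt with the growing scratchpad until the model answers without calling a tool.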

    Here's an example:

    from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
    
    prompt = ChatPromptTemplate.from_messages(
        [
            ("system", "You are a helpful assistant"),
            MessagesPlaceholder("chat_history", optional=True),
            ("human", "{input}"),
            MessagesPlaceholder("agent_scratchpad"),
        ]
    )
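Note the optional=True on the chat_history placeholder above: an optional placeholder contributes no messages when its key is absent, while a required one (like agent_scratchpad) must always be supplied. A plain-Python mimic of that rendering rule (hypothetical helper, not LangChain's code):

```python
def render_prompt(template, values):
    """Expand template entries into chat messages.
    ("placeholder", key, optional) entries splice in values[key];
    optional placeholders are skipped when the key is absent."""
    messages = []
    for entry in template:
        if entry[0] == "placeholder":
            _, key, optional = entry
            if key in values:
                messages.extend(values[key])
            elif not optional:
                raise KeyError(key)
        else:
            role, text = entry
            messages.append((role, text.format(**values)))
    return messages


template = [
    ("system", "You are a helpful assistant"),
    ("placeholder", "chat_history", True),    # optional=True
    ("human", "{input}"),
    ("placeholder", "agent_scratchpad", False),
]

# No chat_history supplied: the optional placeholder simply disappears.
print(render_prompt(template, {"input": "hi", "agent_scratchpad": []}))
```

Omitting agent_scratchpad, by contrast, would raise an error, which is why the agent prompt must always declare that placeholder.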