    langchain_classic.agents.openai_functions_agent.base
    Function · Since v1.0

    create_openai_functions_agent

    Create an agent that uses OpenAI function calling.

    create_openai_functions_agent(
      llm: BaseLanguageModel,
      tools: Sequence[BaseTool],
      prompt: ChatPromptTemplate
    ) -> Runnable

    Example:

    Creating an agent with no memory

    from langchain_openai import ChatOpenAI
    from langchain_classic.agents import (
        AgentExecutor,
        create_openai_functions_agent,
    )
    from langchain_classic import hub
    
    prompt = hub.pull("hwchase17/openai-functions-agent")
    model = ChatOpenAI()
    tools = ...
    
    agent = create_openai_functions_agent(model, tools, prompt)
    agent_executor = AgentExecutor(agent=agent, tools=tools)
    
    agent_executor.invoke({"input": "hi"})
    
    # Using with chat history
    from langchain_core.messages import AIMessage, HumanMessage
    
    agent_executor.invoke(
        {
            "input": "what's my name?",
            "chat_history": [
                HumanMessage(content="hi! my name is bob"),
                AIMessage(content="Hello Bob! How can I assist you today?"),
            ],
        }
    )
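
    To keep chat history across calls instead of passing it in by hand, one option is to wrap the executor in RunnableWithMessageHistory from langchain_core. The sketch below is a minimal illustration assuming a simple in-memory store keyed by session_id; the store and get_session_history names are made up for the example, not part of this API.

    from langchain_core.chat_history import InMemoryChatMessageHistory
    from langchain_core.runnables.history import RunnableWithMessageHistory

    # Illustrative in-memory session store; any chat message history backend works.
    store = {}

    def get_session_history(session_id: str):
        if session_id not in store:
            store[session_id] = InMemoryChatMessageHistory()
        return store[session_id]

    agent_with_history = RunnableWithMessageHistory(
        agent_executor,
        get_session_history,
        input_messages_key="input",
        history_messages_key="chat_history",
    )

    # The session_id in the config selects which history is read and updated.
    agent_with_history.invoke(
        {"input": "hi! my name is bob"},
        config={"configurable": {"session_id": "demo-session"}},
    )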

    Prompt:

    The agent prompt must have an agent_scratchpad key that is a MessagesPlaceholder. Intermediate agent actions and tool output messages will be passed in here.

    Here's an example:

    from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
    
    prompt = ChatPromptTemplate.from_messages(
        [
            ("system", "You are a helpful assistant"),
            MessagesPlaceholder("chat_history", optional=True),
            ("human", "{input}"),
            MessagesPlaceholder("agent_scratchpad"),
        ]
    )
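
    If the prompt does not include the agent_scratchpad placeholder, the function is expected to reject it up front with a ValueError rather than failing mid-run. The snippet below is a hedged illustration of that check (the exact message can vary between releases), reusing model and tools from the example above.

    # A prompt without agent_scratchpad should be rejected when the agent is built.
    bad_prompt = ChatPromptTemplate.from_messages(
        [
            ("system", "You are a helpful assistant"),
            ("human", "{input}"),
        ]
    )

    try:
        create_openai_functions_agent(model, tools, bad_prompt)
    except ValueError as err:
        print(err)  # complains about the missing `agent_scratchpad` variable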

    Used in Docs

    • AskNews integration
    • Composio integration
    • FMP Data integration
    • Infobip integration
    • MultiOn toolkit integration

    Parameters

    llm (BaseLanguageModel, required)
        LLM to use as the agent. It should support OpenAI function calling, so pass either an OpenAI model with that capability or a wrapper around another model that adds equivalent support.

    tools (Sequence[BaseTool], required)
        Tools this agent has access to; one way to define them is sketched below this list.

    prompt (ChatPromptTemplate, required)
        The prompt to use. See the Prompt section above for more.
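
    As a concrete illustration of the tools parameter, the sketch below defines a single tool with the @tool decorator from langchain_core.tools and wires it into the agent. The get_word_length tool and the specific model name are made-up examples; any model that supports OpenAI function calling can stand in for ChatOpenAI here.

    from langchain_core.tools import tool
    from langchain_openai import ChatOpenAI

    @tool
    def get_word_length(word: str) -> int:
        """Return the number of characters in a word."""
        return len(word)

    tools = [get_word_length]
    model = ChatOpenAI(model="gpt-4o-mini", temperature=0)

    # Reuses the hub prompt pulled in the example above; any prompt with an
    # agent_scratchpad MessagesPlaceholder works.
    agent = create_openai_functions_agent(model, tools, prompt)
    agent_executor = AgentExecutor(agent=agent, tools=tools)

    agent_executor.invoke({"input": "How many letters are in the word 'reference'?"})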
