    langchain_classic.agents.agent_toolkits.conversational_retrieval.openai_functions.create_conversational_retrieval_agent

    Function · Since v1.0

    create_conversational_retrieval_agent

    A convenience method for creating a conversational retrieval agent.

    create_conversational_retrieval_agent(
      llm: BaseLanguageModel,
      tools: list[BaseTool],
      remember_intermediate_steps: bool = True,
      memory_key: str = 'chat_history',
      system_message: SystemMessage | None = None,
      verbose: bool = False,
      max_token_limit: int = 2000,
      **kwargs: Any
    ) -> AgentExecutor

    Used in Docs

    • Cogniswitch toolkit integration

    Parameters

    llm*: BaseLanguageModel

    The language model to use; should be a ChatOpenAI model.

    tools*: list[BaseTool]

    A list of tools the agent has access to.

    remember_intermediate_steps: bool
    Default: True

    Whether the agent should remember intermediate steps, i.e. prior action/observation pairs from previous questions. Remembering them lets the agent draw on any relevant information they contain when answering follow-up questions, at the cost of using more tokens.

    memory_key: str
    Default: 'chat_history'

    The name of the memory key in the prompt.

    system_message: SystemMessage | None
    Default: None

    The system message to use. By default, a basic one will be used.

    verbose: bool
    Default: False

    Whether the final AgentExecutor should be verbose.

    max_token_limit: int
    Default: 2000

    The maximum number of tokens to keep in memory.

    **kwargs: Any

    Additional keyword arguments to pass to the AgentExecutor.
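
    Example

    A minimal usage sketch (not part of the reference itself): it assumes the import path shown above, the langchain-openai package for ChatOpenAI, create_retriever_tool from langchain_core.tools, and a placeholder retriever named my_retriever that you have already built.

    from langchain_core.tools import create_retriever_tool
    from langchain_openai import ChatOpenAI

    # Import path assumed from the module shown above; adjust if your version differs.
    from langchain_classic.agents.agent_toolkits.conversational_retrieval.openai_functions import (
        create_conversational_retrieval_agent,
    )

    # Wrap an existing retriever (placeholder: `my_retriever`) as a tool the agent can call.
    retriever_tool = create_retriever_tool(
        my_retriever,
        name="search_docs",
        description="Searches and returns relevant documents.",
    )

    llm = ChatOpenAI(temperature=0)

    agent_executor = create_conversational_retrieval_agent(
        llm=llm,
        tools=[retriever_tool],
        verbose=True,
        max_token_limit=2000,
    )

    # The returned AgentExecutor keeps chat history under the 'chat_history' memory key,
    # so follow-up questions can refer back to earlier turns.
    result = agent_executor.invoke({"input": "What does the report say about Q3 revenue?"})
    print(result["output"])

    Pass a custom SystemMessage via system_message to replace the basic default prompt, and set remember_intermediate_steps=False if you want to keep prior action/observation pairs out of the context window.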
