langchain_classic.agents.self_ask_with_search.output_parser
Module · Since v1.0

    output_parser

    Classes

class SelfAskOutputParser

    Parses self-ask style LLM calls.

    Expects output to be in one of two formats.

If the output signals that an action should be taken, it should be in the format below. This will result in an AgentAction being returned.

    Thoughts go here...
    Follow up: what is the temperature in SF?
    

If the output signals that a final answer should be given, it should be in the format below. This will result in an AgentFinish being returned.

    Thoughts go here...
    So the final answer is: The temperature is 100 degrees
    
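A minimal usage sketch of both cases, assuming the import path shown above (langchain_classic.agents.self_ask_with_search.output_parser); verify the path and the parse signature against your installed version:

    from langchain_classic.agents.self_ask_with_search.output_parser import (
        SelfAskOutputParser,
    )

    parser = SelfAskOutputParser()

    # Output containing a "Follow up:" line is treated as an intermediate step:
    # parse returns an AgentAction whose tool_input is the follow-up question.
    action = parser.parse(
        "Thoughts go here...\nFollow up: what is the temperature in SF?"
    )
    print(action.tool_input)  # what is the temperature in SF?

    # Output ending with "So the final answer is:" is treated as a terminal step:
    # parse returns an AgentFinish whose return_values hold the answer text.
    finish = parser.parse(
        "Thoughts go here...\nSo the final answer is: The temperature is 100 degrees"
    )
    print(finish.return_values["output"])  # The temperature is 100 degrees

Output in neither format typically raises an OutputParserException, so callers should be prepared to handle malformed LLM responses.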