langchain_core.language_models.fake_chat_models
Module · Since v0.1

fake_chat_models

Fake chat models for testing purposes.

    Classes

class AsyncCallbackManagerForLLMRun

    Async callback manager for LLM run.

class CallbackManagerForLLMRun

    Callback manager for LLM run.

class BaseChatModel

    Base class for chat models.
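
A minimal sketch of a custom chat model built on BaseChatModel, implementing the _generate and _llm_type extension points; the EchoChatModel class and its echo behavior are illustrative, not part of the library.

from typing import Any, Optional

from langchain_core.callbacks import CallbackManagerForLLMRun
from langchain_core.language_models import BaseChatModel
from langchain_core.messages import AIMessage, BaseMessage
from langchain_core.outputs import ChatGeneration, ChatResult

class EchoChatModel(BaseChatModel):
    """Illustrative model that replies with the last input message's content."""

    @property
    def _llm_type(self) -> str:
        return "echo-chat-model"

    def _generate(
        self,
        messages: list[BaseMessage],
        stop: Optional[list[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> ChatResult:
        # Wrap the reply in ChatGeneration/ChatResult, as BaseChatModel expects.
        reply = AIMessage(content=str(messages[-1].content))
        return ChatResult(generations=[ChatGeneration(message=reply)])

model = EchoChatModel()
print(model.invoke("hello").content)  # -> "hello"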

class SimpleChatModel

    Simplified implementation for a chat model to inherit from.

    Note

    This implementation is primarily here for backwards compatibility. For new implementations, please use BaseChatModel directly.

class AIMessage

    Message from an AI.

    An AIMessage is returned from a chat model as a response to a prompt.

    This message represents the output of the model and consists of both the raw output as returned by the model and standardized fields (e.g., tool calls, usage metadata) added by the LangChain framework.
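
A short sketch of those standardized fields; the literal values below are illustrative, and on recent langchain-core versions these fields are normally populated by the chat model integration rather than constructed by hand.

from langchain_core.messages import AIMessage

msg = AIMessage(
    content="Looking up the weather now.",
    # Standardized fields usually filled in by the framework/integration:
    tool_calls=[{"name": "get_weather", "args": {"city": "Paris"}, "id": "call_1"}],
    usage_metadata={"input_tokens": 12, "output_tokens": 8, "total_tokens": 20},
)

print(msg.content)          # raw text output
print(msg.tool_calls)       # provider-agnostic tool calls
print(msg.usage_metadata)   # token usage, when the provider reports it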

class AIMessageChunk

    Message chunk from an AI (yielded when streaming).
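
A brief sketch of how streamed chunks are typically merged back together (contents illustrative):

from langchain_core.messages import AIMessageChunk

chunks = [AIMessageChunk(content="Hel"), AIMessageChunk(content="lo"), AIMessageChunk(content="!")]

# Adding chunks concatenates their content (and merges other fields).
full = chunks[0]
for chunk in chunks[1:]:
    full = full + chunk

print(full.content)  # -> "Hello!"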

class BaseMessage

    Base abstract message class.

    Messages are the inputs and outputs of a chat model.

    Examples include HumanMessage, AIMessage, and SystemMessage.
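
A small sketch of a conversation built from those message types (contents illustrative):

from langchain_core.messages import AIMessage, HumanMessage, SystemMessage

conversation = [
    SystemMessage(content="You are a terse assistant."),
    HumanMessage(content="What is 2 + 2?"),
    AIMessage(content="4"),  # a prior model turn
    HumanMessage(content="And 3 + 3?"),
]

for message in conversation:
    print(message.type, "->", message.content)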

class ChatGeneration

    A single chat generation output.

    A subclass of Generation that represents the response from a chat model that generates chat messages.

    The message attribute is a structured representation of the chat message. Most of the time, the message will be of type AIMessage.

    Users working with chat models will usually access information via either AIMessage (returned from runnable interfaces) or LLMResult (available via callbacks).

class ChatGenerationChunk

    ChatGeneration chunk.

    ChatGeneration chunks can be concatenated with other ChatGeneration chunks.
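
A minimal sketch of that concatenation, of the kind a streaming implementation performs internally (contents illustrative):

from langchain_core.messages import AIMessageChunk
from langchain_core.outputs import ChatGenerationChunk

part1 = ChatGenerationChunk(message=AIMessageChunk(content="Hello, "))
part2 = ChatGenerationChunk(message=AIMessageChunk(content="world!"))

merged = part1 + part2
print(merged.message.content)  # -> "Hello, world!"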

class ChatResult

Represents the result of a chat model call with a single prompt.

This container is used internally by some chat model implementations; it is eventually mapped to the more general LLMResult object and then projected into an AIMessage.

LangChain users working with chat models will usually access information via AIMessage (returned from runnable interfaces) or LLMResult (available via callbacks). Please refer to the AIMessage and LLMResult schema documentation for more information.

class RunnableConfig

    Configuration for a Runnable.

    Note

    Custom values

    The TypedDict has total=False set intentionally to:

    • Allow partial configs to be created and merged together via merge_configs
    • Support config propagation from parent to child runnables via var_child_runnable_config (a ContextVar that automatically passes config down the call stack without explicit parameter passing), where configs are merged rather than replaced
    Example
    # Parent sets tags
    chain.invoke(input, config={"tags": ["parent"]})
    # Child automatically inherits and can add:
    # ensure_config({"tags": ["child"]}) -> {"tags": ["parent", "child"]}
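
A small runnable sketch of that merge behavior using merge_configs and ensure_config from langchain_core.runnables.config; in real use the parent's config reaches the child via var_child_runnable_config rather than being merged by hand.

from langchain_core.runnables import RunnableConfig
from langchain_core.runnables.config import ensure_config, merge_configs

parent: RunnableConfig = {"tags": ["parent"], "metadata": {"run": "demo"}}
child: RunnableConfig = {"tags": ["child"]}

merged = merge_configs(parent, child)
print(merged["tags"])      # contains both "parent" and "child"
print(merged["metadata"])  # includes {"run": "demo"} (metadata dicts are merged)

# ensure_config fills in defaults for a partial config.
print(ensure_config({"tags": ["only-tag"]})["tags"])
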
class FakeMessagesListChatModel

    Fake chat model for testing purposes.
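
A minimal sketch, assuming the responses field takes a list of pre-built messages that are returned one per call, in order:

from langchain_core.language_models.fake_chat_models import FakeMessagesListChatModel
from langchain_core.messages import AIMessage

model = FakeMessagesListChatModel(
    responses=[
        AIMessage(content="first canned reply"),
        AIMessage(content="second canned reply"),
    ]
)

print(model.invoke("anything").content)  # -> "first canned reply"
print(model.invoke("anything").content)  # -> "second canned reply"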

class FakeListChatModelError

    Fake error for testing purposes.

class FakeListChatModel

    Fake chat model for testing purposes.
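
A minimal sketch, assuming the responses field takes a list of strings that are returned in order and cycle once exhausted:

from langchain_core.language_models.fake_chat_models import FakeListChatModel

model = FakeListChatModel(responses=["yes", "no"])

print(model.invoke("first question").content)   # -> "yes"
print(model.invoke("second question").content)  # -> "no"
print(model.invoke("third question").content)   # -> "yes" (cycles back)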

class FakeChatModel

    Fake Chat Model wrapper for testing purposes.

class GenericFakeChatModel

    Generic fake chat model that can be used to test the chat model interface.

    • Chat model should be usable in both sync and async tests
    • Invokes on_llm_new_token to allow for testing of callback-related code for new tokens.
    • Includes logic to break messages into message chunks to facilitate testing of streaming (see the sketch below).
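
A minimal usage sketch, assuming the messages field accepts an iterator of AIMessage objects used as successive replies:

from itertools import cycle

from langchain_core.language_models.fake_chat_models import GenericFakeChatModel
from langchain_core.messages import AIMessage

# Each call to the model pulls the next reply from the iterator.
model = GenericFakeChatModel(messages=cycle([AIMessage(content="hello world")]))

print(model.invoke("hi").content)  # -> "hello world"

# Streaming breaks the reply into chunks, which exercises callback/streaming code.
for chunk in model.stream("hi"):
    print(repr(chunk.content))
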
class ParrotFakeChatModel

Fake chat model that parrots the last input message back, for testing the chat model interface.

    • Chat model should be usable in both sync and async tests
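
A brief sketch, assuming the model parrots the content of the last input message back as its reply:

from langchain_core.language_models.fake_chat_models import ParrotFakeChatModel

model = ParrotFakeChatModel()

# The reply mirrors the last input message's content.
print(model.invoke("repeat after me").content)  # -> "repeat after me"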
