langchain_core.prompts.structured.StructuredPrompt.from_messages_and_schema
Method · Since v0.1

    from_messages_and_schema

    Create a chat prompt template from a variety of message formats.

from_messages_and_schema(
  cls,
  messages: Sequence[MessageLikeRepresentation],
  schema: dict | type,
  **kwargs: Any,
) -> ChatPromptTemplate
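
A minimal usage sketch (not part of the reference page): messages given as ("role", "template") 2-tuples and the schema given as a function-call-style dict. The field names and descriptions in the dict are illustrative assumptions.

from langchain_core.prompts.structured import StructuredPrompt

# Messages as ("role", "template") 2-tuples; the schema is a
# function-call-style dict (name/description/parameters) used as an example.
prompt = StructuredPrompt.from_messages_and_schema(
    [
        ("system", "You extract structured data from user input."),
        ("human", "{user_input}"),
    ],
    schema={
        "name": "person",
        "description": "Details about a person.",
        "parameters": {
            "type": "object",
            "properties": {
                "name": {"type": "string"},
                "age": {"type": "integer"},
            },
            "required": ["name"],
        },
    },
)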

    Parameters

messages* (Sequence[MessageLikeRepresentation])

Sequence of message representations.

A message can be represented using the following formats:

1. BaseMessagePromptTemplate
2. BaseMessage
3. A 2-tuple of (message type, template); e.g., ("human", "{user_input}")
4. A 2-tuple of (message class, template)
5. A string, which is shorthand for ("human", template); e.g., "{user_input}"

schema* (dict | type)

A dictionary representation of a function call, or a Pydantic model.

**kwargs (Any), default: {}

Any additional kwargs to pass through to ChatModel.with_structured_output(schema, **kwargs).
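
As a further sketch (not from the reference page), the schema can also be a Pydantic model, and the resulting prompt can be piped into a chat model so that the schema and any extra kwargs are applied via with_structured_output. The provider package and model name in the commented lines are assumptions.

from pydantic import BaseModel, Field

from langchain_core.prompts.structured import StructuredPrompt


class Person(BaseModel):
    """Details about a person."""

    name: str = Field(description="The person's name")
    age: int = Field(description="The person's age")


prompt = StructuredPrompt.from_messages_and_schema(
    [("human", "{user_input}")],
    schema=Person,
    # Any extra kwargs passed here are forwarded to with_structured_output.
)

# Piping into a chat model applies model.with_structured_output(schema, **kwargs),
# so invoking the chain returns a Person instance instead of a raw message.
# from langchain_openai import ChatOpenAI  # assumed provider package
# chain = prompt | ChatOpenAI(model="gpt-4o-mini")
# chain.invoke({"user_input": "Alice is 30 years old."})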
