langchain.js

    Variable convertCompletionsDeltaToBaseMessageChunk (const)

    convertCompletionsDeltaToBaseMessageChunk: (
        params: {
            defaultRole?: OpenAIClient.Chat.ChatCompletionRole;
            delta: Record<string, any>;
            includeRawResponse?: boolean;
            rawResponse: OpenAIClient.Chat.Completions.ChatCompletionChunk;
        },
    ) => BaseMessageChunk = ...

    Converts an OpenAI Chat Completions API delta (streaming chunk) to a LangChain BaseMessageChunk.

    This converter is used during streaming responses to transform incremental updates from OpenAI's Chat Completions API into LangChain message chunks. It handles various message types, tool calls, function calls, audio content, and role-specific message chunk creation.
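
    A minimal streaming sketch, assuming the converter is exported from the @langchain/openai package (the exact export path may differ across versions) and that the official openai client is producing the ChatCompletionChunk values:

    import OpenAI from "openai";
    import type { BaseMessageChunk } from "@langchain/core/messages";
    // Assumed export location; adjust the import path to your installed version.
    import { convertCompletionsDeltaToBaseMessageChunk } from "@langchain/openai";

    async function streamToMessageChunk(prompt: string): Promise<BaseMessageChunk | undefined> {
      const client = new OpenAI();
      const stream = await client.chat.completions.create({
        model: "gpt-4o-mini",
        messages: [{ role: "user", content: prompt }],
        stream: true,
      });

      let aggregate: BaseMessageChunk | undefined;
      for await (const rawResponse of stream) {
        const delta = rawResponse.choices[0]?.delta ?? {};
        const chunk = convertCompletionsDeltaToBaseMessageChunk({
          delta,
          rawResponse,
          defaultRole: "assistant", // later deltas often omit the role
        });
        // Message chunks are designed to be merged incrementally.
        aggregate = aggregate ? aggregate.concat(chunk) : chunk;
      }
      return aggregate;
    }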

    Conversion parameters

    • delta: The delta object from an OpenAI streaming chunk containing incremental message updates. May include content, role, tool_calls, function_call, audio, etc.

    • rawResponse: The complete raw ChatCompletionChunk response from OpenAI, containing metadata such as model info, usage stats, and the delta itself.

    • includeRawResponse: Optional flag to include the raw OpenAI response in the message chunk's additional_kwargs. Useful for debugging or for accessing provider-specific data.

    • defaultRole: Optional default role to use if the delta doesn't specify one. Typically used to maintain role consistency across chunks in a streaming response.
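
    As a hedged sketch of the two optional parameters (chatCompletionChunk is a stand-in for a real ChatCompletionChunk pulled from the stream; the exact additional_kwargs key that carries the raw response is not pinned down here):

    const chunk = convertCompletionsDeltaToBaseMessageChunk({
      delta: { content: " world" },     // this delta carries no role
      rawResponse: chatCompletionChunk, // the full ChatCompletionChunk for this event
      defaultRole: "assistant",         // fall back to "assistant" for role-less deltas
      includeRawResponse: true,         // expose the raw chunk via additional_kwargs
    });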

    Returns

    A BaseMessageChunk subclass appropriate for the message role (a dispatch sketch follows this list):

    • HumanMessageChunk for "user" role
    • AIMessageChunk for "assistant" role (includes tool call chunks)
    • SystemMessageChunk for "system" or "developer" roles
    • FunctionMessageChunk for "function" role
    • ToolMessageChunk for "tool" role
    • ChatMessageChunk for any other role
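
    A minimal dispatch sketch on the returned class (types imported from @langchain/core/messages):

    import {
      AIMessageChunk,
      BaseMessageChunk,
      ToolMessageChunk,
    } from "@langchain/core/messages";

    function describeChunk(chunk: BaseMessageChunk): string {
      if (chunk instanceof AIMessageChunk) {
        // Assistant output; partial tool calls surface on tool_call_chunks.
        return `assistant: ${typeof chunk.content === "string" ? chunk.content : "[structured content]"}`;
      }
      if (chunk instanceof ToolMessageChunk) {
        // Tool results reference the originating call.
        return `tool result for ${chunk.tool_call_id}`;
      }
      return `other role (${chunk._getType()})`;
    }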

    Basic streaming text chunk:

    const chunk = convertCompletionsDeltaToBaseMessageChunk({
      delta: { role: "assistant", content: "Hello" },
      rawResponse: { id: "chatcmpl-123", model: "gpt-4", ... }
    });
    // Returns: AIMessageChunk with content "Hello"

    Streaming chunk with tool call:

    const chunk = convertCompletionsDeltaToBaseMessageChunk({
      delta: {
        role: "assistant",
        tool_calls: [{
          index: 0,
          id: "call_123",
          function: { name: "get_weather", arguments: '{"location":' }
        }]
      },
      rawResponse: { id: "chatcmpl-123", ... }
    });
    // Returns: AIMessageChunk with tool_call_chunks containing partial tool call data

    Notes

    • Tool calls are converted to ToolCallChunk objects with incremental data (see the accumulation sketch after this list)
    • Audio content includes the chunk index from the raw response
    • The "developer" role is mapped to SystemMessageChunk with a special marker
    • Response metadata includes model provider info and usage statistics
    • Function calls and tool calls are stored in additional_kwargs for compatibility
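
    A sketch of accumulating streamed tool calls, reusing the stream and imports from the earlier sketch: only assistant chunks are merged, and the concatenated tool_call_chunks then hold the incrementally assembled arguments.

    import { AIMessageChunk } from "@langchain/core/messages";

    let toolAggregate: AIMessageChunk | undefined;
    for await (const rawResponse of stream) {
      const chunk = convertCompletionsDeltaToBaseMessageChunk({
        delta: rawResponse.choices[0]?.delta ?? {},
        rawResponse,
        defaultRole: "assistant",
      });
      if (chunk instanceof AIMessageChunk) {
        toolAggregate = toolAggregate ? toolAggregate.concat(chunk) : chunk;
      }
    }
    // Each entry's args string is the concatenation of the partial JSON fragments seen so far.
    console.log(toolAggregate?.tool_call_chunks);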