The delta object from an OpenAI streaming chunk, containing incremental message updates. It may include content, role, tool_calls, function_call, audio, and other fields.
The complete raw ChatCompletionChunk response from OpenAI, containing metadata like model info, usage stats, and the delta itself.
Optional flag to include the raw OpenAI response in the message chunk's additional_kwargs. Useful for debugging or accessing provider-specific data.
Optional default role to use if the delta doesn't specify one. Typically used to maintain role consistency across chunks in a streaming response.
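Taken together, the parameters above describe an options object roughly like the sketch below. Only delta and rawResponse appear verbatim in the examples that follow; the optional names includeRawResponse and defaultRole are assumptions used here for illustration.

// Minimal sketch of the options object implied by the parameter descriptions.
// includeRawResponse and defaultRole are assumed names, not confirmed exports.
interface CompletionsDelta {
  role?: string;
  content?: string | null;
  tool_calls?: Array<{
    index: number;
    id?: string;
    function?: { name?: string; arguments?: string };
  }>;
  function_call?: { name?: string; arguments?: string };
  audio?: Record<string, unknown>;
}

interface ConvertCompletionsDeltaParams {
  delta: CompletionsDelta;              // incremental update from one streaming chunk
  rawResponse: Record<string, unknown>; // the full ChatCompletionChunk from OpenAI
  includeRawResponse?: boolean;         // attach rawResponse to additional_kwargs (assumed name)
  defaultRole?: string;                 // fallback role when delta.role is absent (assumed name)
}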
A BaseMessageChunk subclass appropriate for the message role (for example, an AIMessageChunk for assistant deltas).
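The examples below return AIMessageChunk because the delta role is "assistant". For other roles, the dispatch presumably looks something like the following sketch built on the standard LangChain chunk classes; the branching is illustrative, and the "placeholder" name and tool_call_id values are not real defaults.

import {
  AIMessageChunk,
  ChatMessageChunk,
  FunctionMessageChunk,
  HumanMessageChunk,
  SystemMessageChunk,
  ToolMessageChunk,
} from "@langchain/core/messages";

// Illustrative role-to-chunk dispatch; the real converter also wires up
// tool_call_chunks, function_call, audio, and additional_kwargs.
function chunkClassForRole(role: string, content: string) {
  switch (role) {
    case "assistant":
      return new AIMessageChunk({ content });
    case "user":
      return new HumanMessageChunk({ content });
    case "system":
      return new SystemMessageChunk({ content });
    case "function":
      return new FunctionMessageChunk({ content, name: "placeholder" });
    case "tool":
      return new ToolMessageChunk({ content, tool_call_id: "placeholder" });
    default:
      return new ChatMessageChunk({ content, role });
  }
}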
Basic streaming text chunk:
const chunk = convertCompletionsDeltaToBaseMessageChunk({
  delta: { role: "assistant", content: "Hello" },
  rawResponse: { id: "chatcmpl-123", model: "gpt-4", ... }
});
// Returns: AIMessageChunk with content "Hello"
Streaming chunk with tool call:
const chunk = convertCompletionsDeltaToBaseMessageChunk({
  delta: {
    role: "assistant",
    tool_calls: [{
      index: 0,
      id: "call_123",
      function: { name: "get_weather", arguments: '{"location":' }
    }]
  },
  rawResponse: { id: "chatcmpl-123", ... }
});
// Returns: AIMessageChunk with tool_call_chunks containing partial tool call data
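Since each streaming chunk carries only a fragment of the tool call, consumers typically merge successive AIMessageChunk values with concat(), which joins tool_call_chunks entries that share the same index. A brief sketch with illustrative argument fragments:

import { AIMessageChunk } from "@langchain/core/messages";

// Two partial chunks of the same tool call (index 0), merged into one.
const first = new AIMessageChunk({
  content: "",
  tool_call_chunks: [
    { name: "get_weather", args: '{"location":', id: "call_123", index: 0, type: "tool_call_chunk" },
  ],
});
const second = new AIMessageChunk({
  content: "",
  tool_call_chunks: [
    { args: ' "Paris"}', index: 0, type: "tool_call_chunk" },
  ],
});

const merged = first.concat(second);
// merged.tool_call_chunks[0].args now holds the full '{"location": "Paris"}' string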
Converts an OpenAI Chat Completions API delta (streaming chunk) to a LangChain BaseMessageChunk.
This converter is used during streaming responses to transform incremental updates from OpenAI's Chat Completions API into LangChain message chunks. It handles various message types, tool calls, function calls, audio content, and role-specific message chunk creation.
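A hedged sketch of how the converter might be wired into a streaming loop with the OpenAI Node SDK. The call shape mirrors the examples above; the import location of convertCompletionsDeltaToBaseMessageChunk is omitted because it is not specified in this reference.

import OpenAI from "openai";

const client = new OpenAI();

async function streamAsMessageChunks() {
  const stream = await client.chat.completions.create({
    model: "gpt-4",
    messages: [{ role: "user", content: "Hello" }],
    stream: true,
  });

  for await (const rawResponse of stream) {
    const delta = rawResponse.choices[0]?.delta;
    if (!delta) continue;

    // Convert each incremental delta into a LangChain message chunk.
    const chunk = convertCompletionsDeltaToBaseMessageChunk({
      delta,
      rawResponse,
      defaultRole: "assistant",
    });
    process.stdout.write(String(chunk.content));
  }
}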