Called when an LLM/ChatModel in streaming mode produces a new token.

```typescript
handleLLMNewToken(
  token: string,
  idx: NewTokenIndices,
  runId: string,
  _parentRunId?: string,
  _tags?: string[],
  fields?: HandleLLMNewTokenCallbackFields
): Promise<Run>
```

| Name | Type | Description |
|---|---|---|
| token* | string | The new token emitted by the model. |
| idx* | NewTokenIndices | Position of the token within the batch of prompts/completions. |
| runId* | string | Unique identifier for the run. |
| _parentRunId | string | Identifier of the parent run, if any. |
| _tags | string[] | Tags associated with the run. |
| fields | HandleLLMNewTokenCallbackFields | Additional callback fields for the token event. |
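A minimal sketch of how a handler might use this hook to accumulate streamed tokens. The `TokenCollector` class and the simplified type stubs below are illustrative assumptions for a self-contained example, not the library's actual types or implementation.

```typescript
// Simplified stand-ins for the library types (assumptions for illustration).
type NewTokenIndices = { prompt: number; completion: number };

// Hypothetical handler that buffers streamed tokens as they arrive.
class TokenCollector {
  tokens: string[] = [];

  // Mirrors the handleLLMNewToken signature in simplified form.
  async handleLLMNewToken(
    token: string,
    _idx: NewTokenIndices,
    _runId: string
  ): Promise<void> {
    this.tokens.push(token);
  }

  // Joins the buffered tokens into the full streamed text.
  text(): string {
    return this.tokens.join("");
  }
}
```

In use, the streaming runtime would invoke `handleLLMNewToken` once per token, e.g. `await collector.handleLLMNewToken("Hello", { prompt: 0, completion: 0 }, "run-1")`, and `collector.text()` would return the concatenated output so far.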