```python
astream_v2(
    self,
    input: LanguageModelInput,
    config: RunnableConfig | None = None,
    *,
    stop: list[str] | None = None,
    **kwargs: Any,
)
```

Async variant of stream_v2. Returns an AsyncChatModelStream whose
projections are async-iterable and awaitable.

This API is experimental and may change.

The assembled message's content is always a list of v1 content blocks,
regardless of the model's output_version attribute; see stream_v2 for
the full rationale.

| Name | Type | Description |
|---|---|---|
| input* | LanguageModelInput | The model input. |
| config | RunnableConfig \| None | Optional runnable config. Default: None. |
| stop | list[str] \| None | Optional list of stop words. Default: None. |
| **kwargs | Any | Additional keyword arguments passed to the model. Default: {}. |
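To make the dual interface concrete, here is a minimal, self-contained sketch of the consumption pattern the docs describe: an object that can be driven with `async for` (yielding chunks as they arrive) or awaited directly (resolving to the fully assembled message whose content is a list of v1 content blocks). `StubAsyncStream` and its internals are hypothetical stand-ins written for illustration, not the real AsyncChatModelStream implementation.

```python
import asyncio

class StubAsyncStream:
    """Hypothetical stand-in for AsyncChatModelStream: both
    async-iterable (incremental chunks) and awaitable (assembled message)."""

    def __init__(self, chunks):
        self._chunks = chunks

    def __aiter__(self):
        # Incremental projection: yield one chunk at a time.
        async def gen():
            for chunk in self._chunks:
                yield chunk
        return gen()

    def __await__(self):
        # Awaitable projection: assemble all chunks into one message
        # whose content is a list of v1-style content blocks.
        async def assemble():
            text = "".join(self._chunks)
            return {"content": [{"type": "text", "text": text}]}
        return assemble().__await__()

async def main():
    # Consume incrementally with `async for` ...
    parts = [chunk async for chunk in StubAsyncStream(["Hel", "lo"])]
    # ... or await the stream to get the assembled message.
    message = await StubAsyncStream(["Hel", "lo"])
    return parts, message

parts, message = asyncio.run(main())
print(parts)                           # ['Hel', 'lo']
print(message["content"][0]["text"])   # Hello
```

The same pattern applies to the real stream: iterate when you want tokens as they arrive, await when you only need the final message.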