Stream agent responses following the LangChain Runnable interface.
stream(self, input: Union[str, Dict[str, Any]], config: Optional[RunnableConfig] = None, **kwargs: Any) -> Iterator[str]
| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `input` | `Union[str, Dict[str, Any]]` | required | Either a string query or a dict with `'input'`/`'query'` and optional parameters |
| `config` | `Optional[RunnableConfig]` | `None` | Optional configuration for the run |
| `**kwargs` | `Any` | `{}` | Additional parameters (`temperature`, `max_tokens`, etc.) |
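A minimal sketch of how this signature can be consumed. `EchoAgent` is a hypothetical stand-in (not part of LangChain), and `RunnableConfig` is stubbed as a plain dict so the example runs without LangChain installed; the point is the `stream()` contract: accept a string or a dict carrying `'input'`/`'query'`, and yield string chunks.

```python
from typing import Any, Dict, Iterator, Optional, Union

# RunnableConfig stubbed as a plain dict; in LangChain it is a TypedDict.
RunnableConfig = Dict[str, Any]


class EchoAgent:
    """Hypothetical agent illustrating the stream() signature above."""

    def stream(
        self,
        input: Union[str, Dict[str, Any]],
        config: Optional[RunnableConfig] = None,
        **kwargs: Any,
    ) -> Iterator[str]:
        # Accept either a raw string or a dict with an 'input'/'query' key.
        if isinstance(input, dict):
            query = input.get("input") or input.get("query", "")
        else:
            query = input
        # Yield the response incrementally (here: word by word).
        for word in query.split():
            yield word


agent = EchoAgent()
chunks = list(agent.stream({"query": "hello streaming world"}))
print(chunks)  # -> ['hello', 'streaming', 'world']
```

Because `stream()` returns an iterator, callers can render chunks as they arrive (e.g. in a chat UI) instead of waiting for the full response.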