Save the prompts in memory when a chat model starts running.
on_chat_model_start(
    self,
    serialized: Dict[str, Any],
    messages: List[List[BaseMessage]],
    *,
    run_id: UUID,
    parent_run_id: Optional[UUID] = None,
    tags: Optional[List[str]] = None,
    metadata: Optional[Dict[str, Any]] = None,
    **kwargs: Any,
) -> Any
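A minimal sketch of a handler implementing this hook. It is self-contained for illustration: the `BaseMessage` stand-in and the `PromptRecorder` class name are assumptions; in LangChain the messages would be `langchain_core.messages.BaseMessage` instances and the class would subclass `BaseCallbackHandler`.

```python
from typing import Any, Dict, List, Optional
from uuid import UUID, uuid4


class BaseMessage:
    """Hypothetical stand-in for langchain_core.messages.BaseMessage."""
    def __init__(self, content: str):
        self.content = content


class PromptRecorder:
    """Saves the prompts passed to the chat model, keyed by run_id."""
    def __init__(self) -> None:
        self.prompts: Dict[UUID, List[List[BaseMessage]]] = {}

    def on_chat_model_start(
        self,
        serialized: Dict[str, Any],
        messages: List[List[BaseMessage]],
        *,
        run_id: UUID,
        parent_run_id: Optional[UUID] = None,
        tags: Optional[List[str]] = None,
        metadata: Optional[Dict[str, Any]] = None,
        **kwargs: Any,
    ) -> Any:
        # Store the incoming batches of messages for later inspection.
        self.prompts[run_id] = messages


recorder = PromptRecorder()
rid = uuid4()
recorder.on_chat_model_start({}, [[BaseMessage("Hello")]], run_id=rid)
print(recorder.prompts[rid][0][0].content)  # → Hello
```

Note that `messages` is a list of lists: each inner list is one batch of messages for a single model invocation, which is why the example indexes twice.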