Start a trace for a (non-chat model) LLM run.
on_llm_start(
    self,
    serialized: dict[str, Any],
    prompts: list[str],
    *,
    run_id: UUID,
    tags: list[str] | None = None,
    parent_run_id: UUID | None = None,
    metadata: dict[str, Any] | None = None,
    name: str | None = None,
    **kwargs: Any,
) -> None
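As a minimal sketch (assuming this method is being overridden on a `BaseCallbackHandler` subclass, which this reference entry does not show), a custom handler might log each prompt when a non-chat LLM run starts:

```python
from typing import Any
from uuid import UUID

from langchain_core.callbacks import BaseCallbackHandler


class PromptLoggingHandler(BaseCallbackHandler):
    """Hypothetical handler that records prompts at the start of an LLM run."""

    def on_llm_start(
        self,
        serialized: dict[str, Any],
        prompts: list[str],
        *,
        run_id: UUID,
        tags: list[str] | None = None,
        parent_run_id: UUID | None = None,
        metadata: dict[str, Any] | None = None,
        name: str | None = None,
        **kwargs: Any,
    ) -> None:
        # Called once per (non-chat model) LLM invocation, before generation begins.
        print(f"LLM run {run_id} started with {len(prompts)} prompt(s)")
        for prompt in prompts:
            # Truncate long prompts so the log stays readable.
            print(f"  prompt: {prompt[:80]!r}")
```

The handler can then be passed to a model or chain via its `callbacks` argument so that this hook fires on every run.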