Create a cron run.
```python
create(
    self,
    assistant_id: str,
    *,
    schedule: str,
    input: Input | None = None,
    metadata: Mapping[str, Any] | None = None,
    config: Config | None = None,
    context: Context | None = None,
    checkpoint_during: bool | None = None,
    interrupt_before: All | list[str] | None = None,
    interrupt_after: All | list[str] | None = None,
    webhook: str | None = None,
    on_run_completed: OnCompletionBehavior | None = None,
    multitask_strategy: str | None = None,
    end_time: datetime | None = None,
    enabled: bool | None = None,
    stream_mode: StreamMode | Sequence[StreamMode] | None = None,
    stream_subgraphs: bool | None = None,
    stream_resumable: bool | None = None,
    durability: Durability | None = None,
    headers: Mapping[str, str] | None = None,
    params: QueryParamTypes | None = None
) -> Run
```

```python
from langgraph_sdk import get_client

client = get_client(url="http://localhost:2024")
cron_run = client.crons.create(
    assistant_id="agent",
    schedule="27 15 * * *",
    input={"messages": [{"role": "user", "content": "hello!"}]},
    metadata={"name": "my_run"},
    context={"model_name": "openai"},
    interrupt_before=["node_to_stop_before_1", "node_to_stop_before_2"],
    interrupt_after=["node_to_stop_after_1", "node_to_stop_after_2"],
    webhook="https://my.fake.webhook.com",
    multitask_strategy="interrupt",
    enabled=True,
)
```

| Name | Type | Description |
|---|---|---|
| assistant_id* | str | The assistant ID or graph name to use for the cron job. If a graph name is given, defaults to the first assistant created from that graph. |
| schedule* | str | The cron schedule on which to execute this job. Schedules are interpreted in UTC. |
| input | Input \| None | The input to the graph. Default: None. |
| metadata | Mapping[str, Any] \| None | Metadata to assign to the cron job runs. Default: None. |
| config | Config \| None | The configuration for the assistant. Default: None. |
| context | Context \| None | Static context to add to the assistant. Default: None. |
| checkpoint_during | bool \| None | (Deprecated) Whether to checkpoint during the run, or only at the end/interruption. Default: None. |
| interrupt_before | All \| list[str] \| None | Nodes to interrupt immediately before they are executed. Default: None. |
| interrupt_after | All \| list[str] \| None | Nodes to interrupt immediately after they are executed. Default: None. |
| webhook | str \| None | Webhook to call after the LangGraph API call is done. Default: None. |
| on_run_completed | OnCompletionBehavior \| None | What to do with the thread after the run completes. Must be one of 'delete' (default) or 'keep'. 'delete' removes the thread after execution; 'keep' creates a new thread for each execution but does not clean them up, so clients are responsible for cleaning up kept threads. Default: None. |
| multitask_strategy | str \| None | Multitask strategy to use. Must be one of 'reject', 'interrupt', 'rollback', or 'enqueue'. Default: None. |
| end_time | datetime \| None | The time to stop running the cron job. If not provided, the cron job runs indefinitely. Default: None. |
| enabled | bool \| None | Whether the cron job is enabled. Default: None. |
| stream_mode | StreamMode \| Sequence[StreamMode] \| None | The stream mode(s) to use. Default: None. |
| stream_subgraphs | bool \| None | Whether to stream output from subgraphs. Default: None. |
| stream_resumable | bool \| None | Whether to persist the stream chunks so the stream can be resumed later. Default: None. |
| durability | Durability \| None | Durability level for the run. Must be one of 'sync', 'async', or 'exit'. 'async': checkpoints are persisted asynchronously while the next graph step executes (replaces checkpoint_during=True). 'sync': checkpoints are persisted synchronously after each graph step (replaces checkpoint_during=False). 'exit': checkpoints are persisted only when the run exits; intermediate steps are not saved. Default: None. |
| headers | Mapping[str, str] \| None | Optional custom headers to include with the request. Default: None. |
| params | QueryParamTypes \| None | Optional query parameters to include with the request. Default: None. |
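
As a further illustration, the sketch below combines the scheduling and lifecycle parameters from the table: a cron that stops at a fixed end date, keeps each run's thread, and only persists checkpoints on exit. It assumes the same local deployment and "agent" assistant as the example above; the schedule, end date, and message content are placeholders.

```python
from datetime import datetime, timezone

from langgraph_sdk import get_client

client = get_client(url="http://localhost:2024")

# Run every day at 09:00 UTC until the end of 2025.
nightly_cron = client.crons.create(
    assistant_id="agent",                 # assistant ID or graph name
    schedule="0 9 * * *",                 # cron schedules are interpreted in UTC
    input={"messages": [{"role": "user", "content": "daily summary, please"}]},
    end_time=datetime(2025, 12, 31, tzinfo=timezone.utc),
    on_run_completed="keep",              # keep each run's thread instead of deleting it
    durability="exit",                    # persist checkpoints only when the run exits
    multitask_strategy="enqueue",
)
```

Because `on_run_completed="keep"` does not clean up the threads it creates, the client is responsible for deleting them once they are no longer needed.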