Workflow

Decorator that turns a function into a Datatailr DAG workflow.

The decorated function body should compose calls to @task-decorated functions. Task dependencies are inferred automatically from the data flow between those calls.

All parameters supplied to this decorator set the default values for every task in the workflow. Individual tasks can override these via their own @task arguments.
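The real decorators live in the Datatailr SDK; as a self-contained illustration of how dependencies can be inferred from data flow, here is a minimal stand-in. The `task`/`workflow` stubs, the node format, and the ETL task names below are all hypothetical, not the library's implementation:

```python
import functools

_nodes = []  # call trace collected while the workflow body runs


def task(fn):
    """Stand-in @task: each call becomes a DAG node whose dependencies
    are the nodes that produced its arguments."""
    @functools.wraps(fn)
    def wrapper(*args):
        node = {"name": fn.__name__,
                "deps": [a["name"] for a in args if isinstance(a, dict)]}
        _nodes.append(node)
        return node  # the node stands in for the task's future result
    return wrapper


def workflow(name=None):
    """Stand-in @workflow: calling the wrapper traces the body and
    returns the inferred DAG instead of executing anything."""
    def decorate(fn):
        @functools.wraps(fn)
        def wrapper():
            _nodes.clear()
            fn()
            return {"name": name or fn.__name__.replace("_", " ").title(),
                    "tasks": list(_nodes)}
        return wrapper
    return decorate


@task
def extract(): ...

@task
def transform(data): ...

@task
def load(data): ...

@workflow(name="ETL")
def etl():
    load(transform(extract()))

dag = etl()
# Dependencies come straight from the data flow: transform depends on
# extract, and load depends on transform — no explicit wiring needed.
```

Passing one task's return value into another is what creates the edge; tasks called with no task-produced arguments become roots of the DAG.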

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| `name` | `Optional[str]` | Display name of the workflow. Defaults to the decorated function name, title-cased. | `None` |
| `schedule` | `Optional[Schedule]` | An optional `Schedule` for recurring execution. | `None` |
| `image` | `Optional[Image]` | Pre-configured container `Image` applied to all tasks. | `None` |
| `run_as` | `Optional[Union[str, User]]` | `User` or username under which the workflow runs. | `None` |
| `resources` | `Resources` | Default `Resources` (CPU / memory) for each task. | `Resources(memory='100m', cpu=1)` |
| `acl` | `Optional[ACL]` | Access control list for the workflow. | `None` |
| `python_version` | `str` | Python version for the container images. | `'auto'` |
| `python_requirements` | `str \| List[str]` | Python dependencies (see `Image`). | `''` |
| `build_script_pre` | `str` | Dockerfile commands run before `pip install`. | `''` |
| `build_script_post` | `str` | Dockerfile commands run after `pip install`. | `''` |
| `env_vars` | `Dict[str, str \| int \| float \| bool]` | Environment variables passed to every task container. | `{}` |
| `fail_after` | `timedelta \| str \| None` | Maximum wall-clock duration before the workflow is marked as failed (e.g. `timedelta(hours=2)` or `"2h"`). | `None` |
| `expire_after` | `timedelta \| str \| None` | Duration after which a completed workflow's resources are cleaned up. | `None` |

Returns:

A wrapper function. Calling it deploys the workflow to the platform. Pass `local_run=True` to execute locally, or `to_json=True` to obtain the JSON representation without deploying.
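A sketch of that calling convention, using a stand-in wrapper rather than Datatailr internals (the `nightly_report` workflow, the spec shape, and the "deployed" return value are invented for illustration; only the `local_run`/`to_json` flags come from this page):

```python
import json


def workflow(fn):
    """Stand-in showing the wrapper's three call modes described above."""
    def wrapper(*, local_run=False, to_json=False):
        spec = {"name": fn.__name__.replace("_", " ").title()}
        if to_json:
            return json.dumps(spec)         # JSON representation, no deployment
        if local_run:
            return fn()                     # execute the workflow body locally
        return "deployed: " + spec["name"]  # default path: deploy to the platform
    return wrapper


@workflow
def nightly_report():
    return "report built"


nightly_report(local_run=True)  # -> "report built"
nightly_report(to_json=True)    # -> '{"name": "Nightly Report"}'
```

The flags are mutually exclusive modes of the same wrapper: a plain call deploys, and either keyword short-circuits deployment.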