We're currently reworking the Airflow orchestrator to make sure it works not only locally but also with Airflow instances deployed on cloud infrastructure.
How to deploy it
The Airflow orchestrator works without any additional infrastructure setup.
How to use it
To use the Airflow orchestrator, we need the ZenML `airflow` integration installed. If you haven't done so, run:

```shell
zenml integration install airflow
```
We can then register the orchestrator and use it in our active stack:

```shell
zenml orchestrator register <NAME> --flavor=airflow

# Add the orchestrator to the active stack
zenml stack update -o <NAME>
```
Once the orchestrator is part of the active stack, we can provision all required local resources by running:

```shell
zenml stack up
```
This command starts an Airflow server on your local machine, running in the same Python environment that you used to provision it. When it is finished, it prints a username and password which you can use to log in to the Airflow UI here.
You can now run any ZenML pipeline using the Airflow orchestrator:
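A run looks the same as with any other orchestrator. The sketch below is illustrative, not from the ZenML docs: the step and pipeline names are made up, it assumes ZenML's `step`/`pipeline` decorator API, and the import is guarded so the plain functions still run even where ZenML is not installed:

```python
def load_number() -> int:
    """Toy data-loading step: returns a constant."""
    return 21


def double(value: int) -> int:
    """Toy processing step: doubles its input."""
    return value * 2


try:
    # Assumes ZenML's `step`/`pipeline` decorator API; the step and
    # pipeline names here are purely illustrative.
    from zenml import pipeline, step

    load_number_step = step(load_number)
    double_step = step(double)

    @pipeline
    def toy_pipeline():
        double_step(load_number_step())

    # Calling `toy_pipeline()` with the Airflow orchestrator in the active
    # stack would schedule these steps on the Airflow server started above.
except ImportError:
    # ZenML is not installed here; the plain functions above still show
    # the step logic that the pipeline would compose.
    pass
```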
A concrete example of using the Airflow orchestrator can be found here.
For more information and a full list of configurable attributes of the Airflow orchestrator, check out the API Docs.