Understanding stacks
Learning how to switch the infrastructure backend of your code.
Now that we have ZenML deployed, we can take the next steps in making sure that our machine learning workflows are production-ready. As you were running your first pipelines, you might have already noticed the term `stack` in the logs and on the dashboard.
A `stack` is the configuration of tools and infrastructure that your pipelines can run on. When you run ZenML code without configuring a stack, the pipeline will run on the so-called `default` stack.
Separation of code from configuration and infrastructure
As visualized in the diagram above, there are two separate domains that are connected through ZenML. The left side shows the code domain: the user's Python code is translated into a ZenML pipeline. On the right side, you can see the infrastructure domain, in this case, an instance of the `default` stack. By separating these two domains, it is easy to switch the environment that the pipeline runs on without making any changes in the code. It also allows domain experts to write code or configure infrastructure without worrying about the other domain.
The default stack

`zenml stack describe` lets you find out details about your active stack:
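```bash
# Show the name and components of the stack that is currently active
zenml stack describe
```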
`zenml stack list` lets you see all stacks that are registered in your ZenML deployment:
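```bash
# List all registered stacks; the active stack is marked in the output
zenml stack list
```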
As you can see, a stack can be active on your client. This simply means that any pipeline you run will use the active stack as its environment.
Components of a stack
As you can see in the section above, a stack consists of multiple components. All stacks have at minimum an orchestrator and an artifact store.
Orchestrator
The orchestrator is responsible for executing the pipeline code. In the simplest case, this will be a simple Python thread on your machine. Let's explore this default orchestrator.
`zenml orchestrator list` lets you see all orchestrators that are registered in your ZenML deployment:
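```bash
# List all registered orchestrators and their flavors
zenml orchestrator list
```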
Artifact store
The artifact store is responsible for persisting the step outputs. As we learned in the previous section, the step outputs are not passed along in memory, rather the outputs of each step are stored in the artifact store and then loaded from there when the next step needs them. By default this will also be on your own machine:
`zenml artifact-store list` lets you see all artifact stores that are registered in your ZenML deployment:
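```bash
# List all registered artifact stores and their flavors
zenml artifact-store list
```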
Other stack components
There are many more components that you can add to your stacks, like experiment trackers, model deployers, and more. You can see all supported stack component types in a single table view here.
Perhaps the most important stack component after the orchestrator and the artifact store is the container registry. A container registry stores all your containerized images, which hold all your code and the environment needed to execute them. We will learn more about them in the next section!
Registering a stack
Just to illustrate how to interact with stacks, let's create an alternate local stack. We start by creating a local artifact store.
Create an artifact store
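The following command registers a local artifact store under the name my_artifact_store, which is the name used throughout the rest of this section:

```bash
# Register a new artifact store using the local filesystem flavor
zenml artifact-store register my_artifact_store --flavor=local
```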
Let's understand the individual parts of this command:
- `artifact-store`: This describes the top-level group. To find other stack components, simply run `zenml --help`.
- `register`: Here we want to register a new component. Instead, we could also `update`, `delete`, and more; `zenml artifact-store --help` will give you all possibilities.
- `my_artifact_store`: This is the unique name that the stack component will have.
- `--flavor=local`: A flavor is a possible implementation for a stack component. In the case of an artifact store, this could be an S3 bucket or a local filesystem. You can find out all possibilities with `zenml artifact-store flavor list`.
The command should print a confirmation that the new artifact store was successfully registered.
To see the new artifact store that you just registered, you can look it up by name.
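One way to do this is with the component group's describe subcommand:

```bash
# Print the configuration of the artifact store registered above
zenml artifact-store describe my_artifact_store
```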
Create a local stack
With the artifact store created, we can now create a new stack with this artifact store.
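The command below does exactly that, combining ZenML's built-in default orchestrator with the artifact store from the previous step:

```bash
# Register a stack named a_new_local_stack from existing components
zenml stack register a_new_local_stack -o default -a my_artifact_store
```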
- `stack`: This is the CLI group that enables interactions with stacks.
- `register`: Here we want to register a new stack. Explore other operations with `zenml stack --help`.
- `a_new_local_stack`: This is the unique name that the stack will have.
- `--orchestrator` or `-o` are used to specify which orchestrator to use for the stack.
- `--artifact-store` or `-a` are used to specify which artifact store to use for the stack.
The command should confirm that the new stack was successfully registered.
You can inspect the stack you just registered; the output lists the stack's name and the components it is made up of.
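For example, by passing the stack name to the describe command:

```bash
# Show the components that make up the newly registered stack
zenml stack describe a_new_local_stack
```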
Switch stacks with our VS Code extension
If you are using our VS Code extension, you can easily view and switch your stacks by opening the sidebar (click on the ZenML icon). You can then click on the stack you want to switch to and view the stack components it is made up of.
Run a pipeline on the new local stack
Let's use the pipeline in our starter project from the previous guide to see it in action.
If you have not already, clone the starter template:
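One way to do so is via ZenML's project templates; the exact flags may vary between ZenML versions:

```bash
# Install ZenML with template support and scaffold the starter project
pip install "zenml[templates]"

mkdir zenml_starter
cd zenml_starter
# --template-with-defaults skips the interactive prompts; omit it to answer them yourself
zenml init --template starter --template-with-defaults

# Install the project's own requirements
pip install -r requirements.txt
```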
To run a pipeline using the new stack:
1. Set the stack as active on your client:
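```bash
# Make a_new_local_stack the active stack for this client
zenml stack set a_new_local_stack
```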
2. Run your pipeline code:
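For example, using the starter template's entry point (the script name and flags below come from that template and may differ in your own project):

```bash
# Launch the training pipeline defined in the starter project
python run.py --training-pipeline
```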
Keep this code handy as we'll be using it in the next chapters!