Overview
Overview of categories of MLOps components and third-party integrations.
If you are new to the world of MLOps, it can be daunting to be immediately faced with a sea of tools that seemingly all promise and do the same things. In this case, it is useful to categorize tools into groups in order to understand their value in your toolchain more precisely.
ZenML tackles this problem by introducing the concept of stacks and stack components. These stack components represent categories, each of which has a particular function in your MLOps pipeline. ZenML realizes these stack components as base abstractions that standardize the entire workflow for your team. To then realize the benefit, you can write a concrete implementation of the abstraction yourself, or use one of the many built-in integrations that implement these abstractions for you.
Here is a full list of all stack components currently supported in ZenML, with a description of the role of that component in the MLOps process:
Each pipeline run that you execute with ZenML requires a stack, and each stack must include at least an orchestrator and an artifact store. Apart from these two, the other components are optional and can be added as your pipeline evolves in MLOps maturity.
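As a sketch, a minimal stack satisfying these requirements could be registered via the ZenML CLI like this (the component and stack names here are illustrative, and the commands assume a working ZenML installation):

```shell
# Register a local orchestrator and a local artifact store
# (names are illustrative)
zenml orchestrator register local_orchestrator --flavor=local
zenml artifact-store register local_artifact_store --flavor=local

# Combine the two required components into a stack and activate it
zenml stack register my_local_stack \
    -o local_orchestrator \
    -a local_artifact_store \
    --set
```

Additional optional components (experiment tracker, model deployer, etc.) can later be registered and added to the stack in the same way.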
Categorizing the MLOps stack is a good way to write abstractions for an MLOps pipeline and standardize your processes. But ZenML goes further and also provides concrete implementations of these categories by integrating with various tools for each category. Once code is organized into a ZenML pipeline, you can supercharge your ML workflows with the best-in-class solutions from various MLOps areas.
There are lots of moving parts in the MLOps tooling and infrastructure you require for ML in production, and ZenML brings them all together, enabling you to manage them in one place. This also allows you to defer the decision of which MLOps tool to use in your stack: there is no vendor lock-in with ZenML, so you can easily switch out tools as soon as your requirements change.
Under the hood, this simply installs the preferred versions of all integrations using pip, i.e., it executes in a sub-process call:
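Roughly, the call looks like this (the requirement names are placeholders; the exact packages and pinned versions come from each integration's definition):

```shell
pip install "<integration-requirement-1>" "<integration-requirement-2>" ...
```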
The -y flag confirms all pip install commands without asking you for confirmation.
You can run zenml integration --help to see a full list of CLI commands that ZenML provides for interacting with integrations.
Note that you can also install your dependencies directly with pip, but there is no guarantee that ZenML internals will work with any arbitrary version of any external library.
You can upgrade all integrations to their latest possible version using:
The -y flag confirms all pip install --upgrade commands without asking you for confirmation.
If no integrations are specified, all installed integrations will be upgraded.
You can take control of how ZenML behaves by creating your own components. This is done by writing custom component flavors. To learn more, head over to the general guide on writing custom component flavors, or read one of the more specialized guides for specific component types (e.g. the custom orchestrator guide).
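As a sketch, once you have implemented a custom flavor class, you can register it with the ZenML CLI so that it becomes usable in your stacks (the module path and class name here are hypothetical):

```shell
zenml orchestrator flavor register my_package.my_module.MyCustomOrchestratorFlavor
```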
For example, you can orchestrate your ML pipeline workflows using Airflow or Kubeflow, track experiments using MLflow Tracking or Weights & Biases, and transition seamlessly from a local to a deployed model on Kubernetes using Seldon Core.
We have a dedicated integrations page that indexes all supported ZenML integrations and their categories.
Another easy way of seeing a list of integrations is to browse the list of directories in the integrations folder of the ZenML repository on GitHub.
Before you can use integrations, you first need to install them using zenml integration install, e.g., you can install several integrations in one command:
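A hypothetical example (the integration names here are placeholders for whichever integrations you need, and the command assumes a working ZenML installation):

```shell
zenml integration install kubeflow mlflow seldon -y
```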
You can use uv as a package manager if you want. Simply pass the --uv flag to the zenml integration ... command and it'll use uv for installation, upgrades and uninstalls. Note that uv must be installed for this to work. This is an experimental option that we've added for users wishing to use uv, but given that it is relatively new as an option, there might be certain packages that don't work well with it.
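For example, a hypothetical install with uv enabled (the integration name is illustrative; both ZenML and uv must be installed):

```shell
zenml integration install mlflow -y --uv
```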
Full documentation for how uv works with PyTorch can be found on Astral's docs website. It covers some of the particular gotchas and details you might need to know.
There are countless tools in the ML / MLOps field. We have made an initial prioritization of which tools to support with integrations; this is visible on our public roadmap.
We also welcome community contributions. Check our contribution guide and external integration guide for more details on how best to contribute new integrations.
| Type of Stack Component | Description |
| --- | --- |
| Orchestrator | Orchestrating the runs of your pipeline |
| Artifact Store | Storage for the artifacts created by your pipelines |
| Container Registry | Store for your containers |
| Data Validator | Data and model validation |
| Experiment Tracker | Tracking your ML experiments |
| Model Deployer | Services/platforms responsible for online model serving |
| Step Operator | Execution of individual steps in specialized runtime environments |
| Alerter | Sending alerts through specified channels |
| Image Builder | Building container images |
| Annotator | Labeling and annotating data |
| Model Registry | Managing and interacting with ML models |
| Feature Store | Management of your data/features |