Azure

A simple guide to create an Azure stack to run your ZenML pipelines

This page shows you how to quickly set up a minimal production stack on Azure. In just a few steps, you will create a resource group, a service principal with the correct permissions, and the relevant ZenML stack and components.

Would you like to skip ahead and deploy a full Azure ZenML cloud stack right away?

Check out the in-browser stack deployment wizard, the stack registration wizard, or the ZenML Azure Terraform module for a shortcut on how to deploy and register this stack.

To follow this guide, you need:

  • An active Azure account.

  • ZenML installed.

  • The ZenML Azure integration installed with zenml integration install azure.
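If you still need to install these, the setup comes down to two commands. This assumes an active Python environment; adjust the install method to your own setup:

pip install zenml
zenml integration install azure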

1. Set up proper credentials

You can start by creating a service principal by creating an app registration on Azure:

  1. Go to the App Registrations on the Azure portal.

  2. Click on + New registration.

  3. Give it a name and click Register.

Once you create the service principal, note down its Application ID and Tenant ID, as they will be needed later.

Next, go to your service principal and click on Certificates & secrets in the Manage menu. Here, you have to create a client secret. Note down the secret value, as it will be needed later.
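As a small convenience, you can keep the three values from this step in environment variables so they are at hand when you register the service connector in step 4. The variable names below are arbitrary placeholders, not something ZenML requires:

export AZURE_CLIENT_ID=<APPLICATION_ID>
export AZURE_TENANT_ID=<TENANT_ID>
export AZURE_CLIENT_SECRET=<CLIENT_SECRET>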

2. Create a resource group and the AzureML instance

Now, you have to create a resource group on Azure. To do this, go to the Azure portal, open the Resource Groups page, and click + Create.

Once the resource group is created, go to the overview page of your new resource group and click + Create. This will open up the marketplace where you can select a variety of resources to create. Look for Azure Machine Learning.

Select it, and you will start the process of creating an AzureML workspace. As you can see from the Workspace details, AzureML workspaces come equipped with a storage account, key vault, and application insights. It is highly recommended that you create a container registry as well.
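If you prefer the command line over the portal, the resource group and the AzureML workspace can also be created with the Azure CLI. This is a sketch that assumes the az CLI is installed and logged in and that the ml extension is available; all names and the location are placeholders:

az group create --name <RESOURCE_GROUP_NAME> --location <LOCATION>
az ml workspace create --name <WORKSPACE_NAME> --resource-group <RESOURCE_GROUP_NAME>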

3. Create the required role assignments

Now that you have your app registration and the resources, you have to create the corresponding role assignments. To do this, go to your resource group, open Access control (IAM) on the left side, and click + Add to add a new role assignment.

In the role assignment page, search for AzureML, which will show you a list of roles defined with the scope of AzureML workspaces.

One by one, you have to select AzureML Compute Operator, AzureML Data Scientist, and AzureML Registry User and click Next.

Finally, click +Select Members, search for your registered app by its ID, and assign the role accordingly.
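If you would rather script this step, the same role assignments can be created with the Azure CLI. This sketch assumes the az CLI is logged in; the application ID, subscription ID, and resource group name are placeholders, and the command has to be repeated for each of the three roles:

az role assignment create \
  --assignee <APPLICATION_ID> \
  --role "AzureML Data Scientist" \
  --scope /subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RESOURCE_GROUP_NAME>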

4. Create a service connector

Now that you have everything set up, you can go ahead and create a ZenML Azure Service Connector:
zenml service-connector register azure_connector --type azure \
  --auth-method service-principal \
  --client_secret=<CLIENT_SECRET> \
  --tenant_id=<TENANT_ID> \
  --client_id=<APPLICATION_ID>

You will use this service connector later on to connect your components with proper authentication.
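Before wiring the connector into any components, you can optionally check that it can actually reach your Azure resources:

zenml service-connector verify azure_connector
zenml service-connector describe azure_connector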

5. Create Stack Components

In order to run any workflows on Azure using ZenML, you need an artifact store, an orchestrator, and a container registry.

Artifact Store (Azure Blob Storage)

For the artifact store, we will be using the storage account attached to our AzureML workspace. But before registering the component itself, you have to create a container for blob storage. To do this, go to the corresponding storage account in your workspace and create a new container.
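If you would rather not click through the portal, the container can also be created with the Azure CLI. This is a sketch that assumes the az CLI is logged in; the storage account is the one attached to your AzureML workspace and the container name is a placeholder:

az storage container create --name <CONTAINER_NAME> --account-name <STORAGE_ACCOUNT_NAME> --auth-mode login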

Once you create the container, you can go ahead and register your artifact store using its path and connect it to your service connector:

zenml artifact-store register azure_artifact_store -f azure \
  --path=<PATH_TO_YOUR_CONTAINER> \ 
  --connector azure_connector

For more information regarding Azure Blob Storage artifact stores, feel free to check the docs.

Orchestrator (AzureML)

As for the orchestrator, no additional setup is needed. Simply use the following command to register it and connect it to your service connector:

zenml orchestrator register azure_orchestrator -f azureml \
    --subscription_id=<YOUR_AZUREML_SUBSCRIPTION_ID> \
    --resource_group=<NAME_OF_YOUR_RESOURCE_GROUP> \
    --workspace=<NAME_OF_YOUR_AZUREML_WORKSPACE> \ 
    --connector azure_connector

For more information regarding the AzureML orchestrator, feel free to check the docs.
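If you are not sure which subscription ID to use in the command above, the Azure CLI can print the currently active one (assuming you are logged in):

az account show --query id --output tsv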

Container Registry (Azure Container Registry)

Similar to the orchestrator, you can register and connect your container registry using the following command:

zenml container-registry register azure_container_registry -f azure \
  --uri=<URI_TO_YOUR_AZURE_CONTAINER_REGISTRY> \ 
  --connector azure_connector

For more information regarding Azure container registries, feel free to check the docs.
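If you do not have the registry URI at hand, you can list the registries in your resource group with the Azure CLI; the resource group name below is a placeholder:

az acr list --resource-group <RESOURCE_GROUP_NAME> --query "[].loginServer" --output table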

6. Create a Stack

Now, you can use the registered components to create an Azure ZenML stack:

zenml stack register azure_stack \
    -o azure_orchestrator \
    -a azure_artifact_store \
    -c azure_container_registry \
    --set
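To double-check that everything is registered and that the stack is now active, you can inspect it:

zenml stack describe azure_stack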

7. ...and you are done.

Just like that, you now have a fully working Azure stack ready to go. Feel free to take it for a spin by running a pipeline on it.

Define a ZenML pipeline:

from zenml import pipeline, step

@step
def hello_world() -> str:
    return "Hello from Azure!"

@pipeline
def azure_pipeline():
    hello_world()

if __name__ == "__main__":
    azure_pipeline()

Save this code to run.py and execute it. The pipeline will use Azure Blob Storage for artifact storage, AzureML for orchestration, and your Azure container registry for the container image that ZenML builds for the pipeline.

python run.py

Now that you have a functional Azure stack set up with ZenML, you can explore more advanced features and capabilities offered by ZenML. Some next steps to consider:

  • Dive deeper into ZenML's production guide to learn best practices for deploying and managing production-ready pipelines.

  • Explore ZenML's integrations with other popular tools and frameworks in the machine learning ecosystem.

  • Join the ZenML community to connect with other users, ask questions, and get support.