Deploy a cloud stack with ZenML
Deploy a cloud stack from scratch with a single click
In ZenML, the stack is a fundamental concept that represents the configuration of your infrastructure. In a normal workflow, creating a stack requires you to first deploy the necessary pieces of infrastructure and then define them as stack components in ZenML with proper authentication.
Especially in a remote setting, this process can be challenging, time-consuming, and error-prone. This is why we implemented a feature that deploys the necessary pieces of infrastructure on your selected cloud provider and gets you started on a remote stack with a single click.
If you prefer to have more control over where and how resources are provisioned in your cloud, you can use one of our Terraform modules to manage your infrastructure as code yourself.
If you have the required infrastructure pieces already deployed on your cloud, you can also use the stack wizard to seamlessly register your stack.
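For example, on AWS, the stack wizard can be launched from the CLI roughly as follows (a minimal sketch; the stack name is a placeholder and the exact flags may vary between ZenML versions, so check `zenml stack register --help`):

```bash
# Launch the interactive stack wizard for infrastructure that already
# exists in your AWS account ("my-aws-stack" is a placeholder name).
zenml stack register my-aws-stack -p aws
```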
The first thing you need in order to use this feature is a deployed instance of ZenML (not a local server started via `zenml login --local`). If you do not have one set up yet, you can learn how to deploy it here.
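Connecting your client to such an instance typically looks like this (a sketch; the URL is a placeholder for your own deployment):

```bash
# Connect the ZenML client to your deployed server
# (the URL below is a placeholder, not a real endpoint).
zenml login https://your-zenml-server.example.com
```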
Once you are connected to your deployed ZenML instance, you can use the 1-click deployment tool either through the dashboard or the CLI:
To create a remote stack through the dashboard, go to the stacks page and click "+ New Stack".
Since we will be deploying it from scratch, select "New Infrastructure" on the next page:
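If you would rather use the CLI, the same 1-click deployment can be started with the `zenml stack deploy` command, roughly as follows (a sketch; available providers and flags may differ between ZenML versions, so consult `zenml stack deploy --help`):

```bash
# Kick off the 1-click deployment flow for AWS from the CLI.
zenml stack deploy -p aws
```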
Here is an overview of the infrastructure that the 1-click deployment will prepare for you based on your cloud provider:
An S3 bucket that will be used as a ZenML Artifact Store.
An ECR container registry that will be used as a ZenML Container Registry.
An AWS CodeBuild project that will be used as a ZenML Image Builder.
Permissions to use SageMaker as a ZenML Orchestrator and Step Operator.
An IAM user and IAM role with the minimum necessary permissions to access the resources listed above.
An AWS access key used to give ZenML access to the above resources through a ZenML service connector.
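Once the deployment finishes, these resources are registered in ZenML as stack components behind a service connector. You can verify the result with the ZenML CLI (the stack name below is a placeholder for whatever you named your new stack):

```bash
# List the registered service connectors and inspect the new stack.
zenml service-connector list
zenml stack describe my-aws-stack
```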
The configured IAM user, IAM role, and AWS access key will grant ZenML the following permissions in your AWS account:
S3 Bucket:
s3:ListBucket
s3:GetObject
s3:PutObject
s3:DeleteObject
s3:GetBucketVersioning
s3:ListBucketVersions
s3:DeleteObjectVersion
ECR Repository:
ecr:DescribeRepositories
ecr:ListRepositories
ecr:DescribeRegistry
ecr:BatchGetImage
ecr:DescribeImages
ecr:BatchCheckLayerAvailability
ecr:GetDownloadUrlForLayer
ecr:InitiateLayerUpload
ecr:UploadLayerPart
ecr:CompleteLayerUpload
ecr:PutImage
ecr:GetAuthorizationToken
CodeBuild (Client):
codebuild:CreateProject
codebuild:BatchGetBuilds
CodeBuild (Service):
s3:GetObject
s3:GetObjectVersion
logs:CreateLogGroup
logs:CreateLogStream
logs:PutLogEvents
ecr:BatchGetImage
ecr:DescribeImages
ecr:BatchCheckLayerAvailability
ecr:GetDownloadUrlForLayer
ecr:InitiateLayerUpload
ecr:UploadLayerPart
ecr:CompleteLayerUpload
ecr:PutImage
ecr:GetAuthorizationToken
SageMaker (Client):
sagemaker:CreatePipeline
sagemaker:StartPipelineExecution
sagemaker:DescribePipeline
sagemaker:DescribePipelineExecution
SageMaker (Jobs):
AmazonSageMakerFullAccess
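If you want to double-check what was actually granted in your account, you can review the created IAM entities with the AWS CLI (a sketch; the role name below is a placeholder for the role created by the deployment):

```bash
# Inspect the IAM role created by the deployment (placeholder name).
aws iam list-attached-role-policies --role-name zenml-stack-role
aws iam list-role-policies --role-name zenml-stack-role
```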
There you have it! With a single click, you just deployed a cloud stack, and you can start running your pipelines in a remote setting.
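To start using the new stack, set it as your active stack and run any of your pipelines against it (the stack name and script are placeholders):

```bash
# Activate the newly deployed stack and run a pipeline on it.
zenml stack set my-aws-stack
python run.py  # any of your existing ZenML pipelines
```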