Google Cloud Storage (GCS)
Storing artifacts using GCP Cloud Storage.
The GCS Artifact Store is an Artifact Store flavor provided with the GCP ZenML integration that uses Google Cloud Storage to store ZenML artifacts in a GCP Cloud Storage bucket.
Running ZenML pipelines with the local Artifact Store is usually sufficient if you just want to evaluate ZenML or get started quickly without incurring the trouble and the cost of employing cloud storage services in your stack. However, the local Artifact Store becomes insufficient or unsuitable if you have more elaborate needs for your project:
if you want to share your pipeline run results with other team members or stakeholders inside or outside your organization
if you have other components in your stack that are running remotely (e.g. a Kubeflow or Kubernetes Orchestrator running in a public cloud)
if you outgrow what your local machine can offer in terms of storage space and need to use some form of private or public storage service that is shared with others
if you are running pipelines at scale and need an Artifact Store that can handle the demands of production-grade MLOps
In all these cases, you need an Artifact Store that is backed by a form of public cloud or self-hosted shared object storage service.
You should use the GCS Artifact Store when you decide to keep your ZenML artifacts in a shared object storage and if you have access to the Google Cloud Storage managed service. You should consider one of the other Artifact Store flavors if you don't have access to the GCP Cloud Storage service.
The GCS Artifact Store flavor is provided by the GCP ZenML integration. You need to install the integration on your local machine to be able to register a GCS Artifact Store and add it to your stack:
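A minimal sketch of the installation step, assuming the ZenML CLI is available in your environment:

```shell
# Install the GCP integration, which provides the GCS Artifact Store flavor
zenml integration install gcp -y
```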
The only configuration parameter mandatory for registering a GCS Artifact Store is the root path URI, which needs to point to a GCS bucket and take the form gs://bucket-name. Please read the Google Cloud documentation on how to configure a GCS bucket.
With the URI to your GCS bucket known, registering a GCS Artifact Store can be done as follows:
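As a sketch, registration with the ZenML CLI could look like this (gcs_store and the bucket URI are placeholder names to replace with your own):

```shell
# Register the GCS Artifact Store; --path must point to your GCS bucket
zenml artifact-store register gcs_store -f gcp --path=gs://bucket-name

# Add it to the active stack
zenml stack update -a gcs_store
```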
Depending on your use case, however, you may also need to provide additional configuration parameters pertaining to authentication to match your deployment scenario.

Integrating and using a GCS Artifact Store in your pipelines is not possible without employing some form of authentication. If you're looking for a quick way to get started locally, you can use the Implicit Authentication method. However, the recommended way to authenticate to the GCP cloud platform is through a GCP Service Connector. This is particularly useful if you are configuring ZenML stacks that combine the GCS Artifact Store with other remote stack components also running in GCP.

This method uses the implicit GCP authentication available in the environment where the ZenML code is running. On your local machine, this is the quickest way to configure a GCS Artifact Store. You don't need to supply credentials explicitly when you register the GCS Artifact Store, as it leverages the local credentials and configuration that the Google Cloud CLI stores on your local machine. However, you will need to install and set up the Google Cloud CLI on your machine as a prerequisite, as covered in the Google Cloud documentation, before you register the GCS Artifact Store.

Certain dashboard functionality, such as visualizing or deleting artifacts, is not available when using an implicitly authenticated artifact store together with a deployed ZenML server, because the ZenML server will not have permission to access the filesystem.

The implicit authentication method also needs to be coordinated with other stack components that are highly dependent on the Artifact Store and need to interact with it directly to function. If these components are not running on your machine, they do not have access to the local Google Cloud CLI configuration and will encounter authentication failures while trying to access the GCS Artifact Store:

Orchestrators need to access the Artifact Store to manage pipeline artifacts

Step Operators need to access the Artifact Store to manage step-level artifacts

Model Deployers need to access the Artifact Store to load served models
To enable these use cases, it is recommended to use a GCP Service Connector to link your GCS Artifact Store to the remote GCS bucket.
To set up the GCS Artifact Store to authenticate to GCP and access a GCS bucket, it is recommended to leverage the many features provided by GCP Service Connectors, such as auto-configuration, best security practices regarding long-lived credentials, and reusing the same credentials across multiple stack components.
A non-interactive CLI example that leverages the Google Cloud CLI configuration on your local machine to auto-configure a GCP Service Connector targeting a single GCS bucket is:
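A sketch of such a setup, assuming a connector name of gcs-connector and a placeholder bucket URI:

```shell
# Auto-configure a GCP Service Connector scoped to a single GCS bucket,
# picking up credentials from the local Google Cloud CLI configuration
zenml service-connector register gcs-connector --type gcp \
    --resource-type gcs-bucket --resource-id gs://bucket-name --auto-configure

# Register the artifact store and link it to the bucket via the connector
zenml artifact-store register gcs_store -f gcp --path=gs://bucket-name
zenml artifact-store connect gcs_store --connector gcs-connector
```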
Note: Please remember to grant the entity associated with your GCP credentials permissions to read and write to your GCS bucket, as well as to list accessible GCS buckets. For a full list of permissions required to use a GCP Service Connector to access one or more GCS buckets, please refer to the GCP Service Connector documentation or read the documentation available in the interactive CLI commands and dashboard. The GCP Service Connector supports many different authentication methods with different levels of security and convenience. You should pick the one that best fits your use case.
When you register the GCS Artifact Store, you can create a GCP Service Account Key, store it in a ZenML Secret and then reference it in the Artifact Store configuration.
For this method, you need to create a GCP Service Account, grant it privileges to read and write to your GCS bucket (i.e. use the Storage Object Admin role) and then create a Service Account Key for it.
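A sketch of the secret-based setup; the secret name, store name, bucket URI, and key file path below are placeholders:

```shell
# Store the downloaded service account key JSON in a ZenML secret
zenml secret create gcs_secret --token=@path/to/service_account_key.json

# Reference the secret when registering the artifact store
zenml artifact-store register gcs_store -f gcp --path=gs://bucket-name \
    --authentication_secret=gcs_secret
```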
For more, up-to-date information on the GCS Artifact Store implementation and its configuration, you can have a look at the ZenML SDK docs.
Aside from the fact that the artifacts are stored in GCP Cloud Storage, using the GCS Artifact Store is no different from using any other flavor of Artifact Store in your pipelines.