TFRecords or saved model pickles, depending on what the step produces. The serialization and deserialization logic of artifacts is defined by Materializers.
The `init` command serves to get you started, and then you can easily provision the infrastructure that you wish to work with using a simple `stack register` command with the relevant arguments passed in.
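As a hedged illustration only (the stack and component names below are hypothetical, and the exact flags may differ between ZenML versions), the flow might look like:

```shell
# Initialize a ZenML repository in the current directory
zenml init

# Register a stack from previously registered components
# (component names here are hypothetical)
zenml stack register my_stack -o my_orchestrator -a my_artifact_store
```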
Materializers subclass the `BaseMaterializer` class. We care about this because steps are not just isolated pieces of work; they are linked together, and the outputs of one step might well be the inputs of the next.
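ZenML's actual materializer API is not shown in this fragment, so the following is a self-contained conceptual sketch (plain Python, pickle-based, hypothetical class names — not ZenML's implementation) of what a materializer does: it knows how to write a step's output to storage and read it back as the next step's input.

```python
import os
import pickle
import tempfile

class BaseMaterializer:
    """Conceptual sketch: defines how an artifact is (de)serialized."""
    def __init__(self, uri):
        self.uri = uri  # where the artifact lives on disk

    def save(self, data):
        raise NotImplementedError

    def load(self):
        raise NotImplementedError

class PickleMaterializer(BaseMaterializer):
    """Serializes arbitrary Python objects with pickle."""
    def save(self, data):
        with open(self.uri, "wb") as f:
            pickle.dump(data, f)

    def load(self):
        with open(self.uri, "rb") as f:
            return pickle.load(f)

# One step's output becomes the next step's input via the materializer.
uri = os.path.join(tempfile.mkdtemp(), "artifact.pkl")
PickleMaterializer(uri).save({"accuracy": 0.93})
restored = PickleMaterializer(uri).load()
print(restored)  # -> {'accuracy': 0.93}
```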
The `fileio` utilities do the disk operations without needing to be concerned with whether we're operating on a local or a cloud machine.

`sqlite` or `mysql`.

On `run`, a pipeline is compiled and passed directly to the orchestrator, to be run in the orchestrator environment.
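ZenML's own `fileio` module is not reproduced in this fragment; the sketch below (plain Python, hypothetical names) only illustrates the underlying idea of scheme-based dispatch, where the caller uses one function regardless of whether the path is local or remote.

```python
import os
import tempfile

def _is_remote(path):
    # Remote paths are identified by their scheme prefix.
    return path.startswith(("s3://", "gs://", "az://"))

def exists(path):
    """Check existence locally or (hypothetically) via a cloud client."""
    if _is_remote(path):
        raise NotImplementedError("would call the matching cloud SDK here")
    return os.path.exists(path)

local_file = tempfile.NamedTemporaryFile(delete=False)
local_file.close()
print(exists(local_file.name))               # -> True
print(exists(local_file.name + ".missing"))  # -> False
```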
The `.zen` folder in your project root is where various information about your local configuration is stored, e.g., the active Stack that you are using to run pipelines.

`run.py`, located at the root of a ZenML repository, contains the code to actually create a pipeline run. The code usually looks like this:
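The original snippet was not preserved in this fragment. As a hedged sketch only (the step and pipeline names are hypothetical, and the import paths follow the ZenML 0.x-era decorator style, which may differ in your version), a `run.py` often resembles:

```python
from zenml.pipelines import pipeline
from zenml.steps import step

@step
def load_data() -> int:
    # hypothetical step: produce some value
    return 42

@step
def process(value: int) -> int:
    # hypothetical step: consume the previous step's output
    return value + 1

@pipeline
def my_pipeline(load_step, process_step):
    data = load_step()
    process_step(data)

if __name__ == "__main__":
    my_pipeline(load_step=load_data(), process_step=process()).run()
```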
`AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`, as well as an optional `AWS_SESSION_TOKEN`. If you don't specify a schema at the point of registration, ZenML will set the schema as `ArbitrarySecretSchema`, a kind of default schema where things that aren't attached to a grouping can be stored.
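To make the default-schema behavior concrete, here is a self-contained sketch (plain Python with hypothetical names, not ZenML's API) of registration falling back to an "arbitrary" schema when none is given:

```python
ARBITRARY_SECRET_SCHEMA = "arbitrary"  # ungrouped key-value secrets

_registry = {}

def register_secret(name, values, schema=None):
    """Register a secret, defaulting to the arbitrary schema."""
    _registry[name] = {"schema": schema or ARBITRARY_SECRET_SCHEMA,
                       "values": values}
    return _registry[name]

aws = register_secret(
    "aws_creds",
    {"AWS_ACCESS_KEY_ID": "...", "AWS_SECRET_ACCESS_KEY": "..."},
    schema="aws",
)
misc = register_secret("some_token", {"token": "..."})
print(aws["schema"])   # -> aws
print(misc["schema"])  # -> arbitrary
```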
An `access_key_id` and a `secret_access_key`, which it (usually) stores in your `~/.aws/credentials` file.

It is based on the `BaseSettings` class, which means that there are multiple ways to use it.
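The surrounding context is lost in this fragment, but "multiple ways to use it" typically means that the same setting can be supplied from more than one source. Below is a self-contained sketch (plain Python, hypothetical names, not the actual `BaseSettings` implementation) of constructor-argument-over-environment-variable precedence:

```python
import os

class Settings:
    """Resolve a value from kwargs first, then the environment, then a default."""
    def __init__(self, **kwargs):
        self._overrides = kwargs

    def get(self, key, default=None):
        if key in self._overrides:
            return self._overrides[key]
        return os.environ.get(key.upper(), default)

os.environ["LOG_LEVEL"] = "info"
print(Settings().get("log_level"))                   # -> info
print(Settings(log_level="debug").get("log_level"))  # -> debug
```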