TFRecords or saved model pickles, depending on what the step produces. The serialization and deserialization logic of artifacts is defined by Materializers.
`init` command serves to get you started, and then you can provision the infrastructure that you wish to work with using the
`stack register` command with the relevant arguments passed in.
`BaseMaterializer` class. We care about this because steps are not just isolated pieces of work; they are linked together, and the outputs of one step might well be the inputs of the next.
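To make that hand-off concrete, here is a minimal, self-contained sketch of the materializer idea: an object that owns both serialization to and deserialization from an artifact location. This illustrates the concept only; the class and method names are assumptions, not ZenML's actual `BaseMaterializer` API.

```python
import os
import pickle
import tempfile


class PickleMaterializer:
    """Illustrative materializer: persists a step's output so the next
    step can load it back. (Concept sketch, not the ZenML API.)"""

    def __init__(self, artifact_uri: str):
        self.artifact_uri = artifact_uri  # where the artifact lives

    def save(self, obj) -> None:
        # Serialize the step's output into the artifact store.
        with open(self.artifact_uri, "wb") as f:
            pickle.dump(obj, f)

    def load(self):
        # Deserialize the artifact as the next step's input.
        with open(self.artifact_uri, "rb") as f:
            return pickle.load(f)


# One step writes the artifact; the next step reads the same artifact.
uri = os.path.join(tempfile.mkdtemp(), "output.pkl")
PickleMaterializer(uri).save({"accuracy": 0.93})
restored = PickleMaterializer(uri).load()
print(restored)  # {'accuracy': 0.93}
```

The key design point is that neither step needs to know how the other serializes data; the materializer is the single place where that logic lives.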
`fileio` utilities to do the disk operations without needing to be concerned with whether we're operating on a local or cloud machine.
`run`, a pipeline is compiled and passed directly to the orchestrator, to be run in the orchestrator environment.
`.zen` folder in your project root where various information about your local configuration is stored, e.g., the active Stack that you are using to run pipelines.
`run.py` and located at the root of a ZenML repository, which contains the code to actually create a pipeline run. The code usually looks like this:
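Since the snippet itself is not included here, the following is a structural sketch of what such a `run.py` typically wires together. Note the assumptions: in a real project the `pipeline` and `step` decorators come from the `zenml` package, and the step and pipeline names below are made up; the decorators are stubbed so the sketch runs standalone.

```python
# Stand-ins for ZenML's decorators so this sketch is self-contained.
# In a real run.py you would import @step and @pipeline from zenml.
def step(fn):
    return fn


def pipeline(fn):
    # ZenML returns a pipeline object exposing .run(); this stub
    # simply executes the wiring function directly.
    class _Pipeline:
        def run(self):
            return fn()

    return _Pipeline()


@step
def importer() -> list:
    # Hypothetical data-loading step.
    return [1, 2, 3]


@step
def trainer(data: list) -> float:
    # Hypothetical "training": average the data.
    return sum(data) / len(data)


@pipeline
def my_pipeline():
    # Wire step outputs into step inputs.
    data = importer()
    return trainer(data)


if __name__ == "__main__":
    # The typical job of run.py: instantiate the pipeline and run it.
    print(my_pipeline.run())
```

The shape is what matters: steps are declared, composed inside a pipeline definition, and then the pipeline is executed from this single entry point.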
`AWS_SECRET_ACCESS_KEY`, as well as an optional
`AWS_SESSION_TOKEN`. If you don't specify a schema at the point of registration, ZenML will set the schema to
`ArbitrarySecretSchema`, a default schema for storing secrets that aren't attached to any particular grouping.
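As an illustration of what a schema buys you, here is a self-contained sketch of schema-style validation for an AWS-shaped secret: required and optional keys are declared up front, so a malformed secret fails at construction time. The class name and field layout are illustrative assumptions, not ZenML's real implementation.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class AWSSecretSchemaSketch:
    """Illustrative schema: two required AWS keys plus one optional
    one. (Concept sketch, not ZenML's actual AWS secret schema.)"""

    aws_access_key_id: str
    aws_secret_access_key: str
    aws_session_token: Optional[str] = None  # optional, per the text above


# Constructing a secret without the optional session token is fine;
# omitting a required field would raise a TypeError instead.
secret = AWSSecretSchemaSketch(
    aws_access_key_id="AKIA-EXAMPLE",
    aws_secret_access_key="example-secret",
)
print(secret.aws_session_token)  # None: the optional field was omitted
```

A schema-less (arbitrary) secret, by contrast, would accept any set of key-value pairs without this up-front validation.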
`secret_access_key`, which it (usually) stores in your
`BaseSettings` class, which means that there are multiple ways to use it.
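To illustrate what "multiple ways" can look like for a settings object, here is a generic sketch (not ZenML's actual `BaseSettings`; the class, field, and environment-variable names are invented for illustration): a value can be passed explicitly at construction time, or picked up from the environment as a fallback.

```python
import os


class SettingsSketch:
    """Illustrative settings object: explicit kwargs win, otherwise
    values fall back to environment variables. (Concept sketch only.)"""

    def __init__(self, region: str = None):
        # Priority: explicit argument > environment variable > default.
        self.region = region or os.environ.get("MY_APP_REGION", "us-east-1")


# Way 1: pass the value explicitly.
s1 = SettingsSketch(region="eu-west-1")

# Way 2: rely on the environment (or the built-in default).
os.environ["MY_APP_REGION"] = "ap-south-1"
s2 = SettingsSketch()

print(s1.region, s2.region)  # eu-west-1 ap-south-1
```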