run, a pipeline is compiled and passed directly to the orchestrator, to be executed in the orchestrator environment.
`TrainingPipeline`) within ZenML are designed to have easy interfaces for adding pre-decided steps, with the order also pre-decided. Other sorts of pipelines can also be created from scratch.
`Step` interface, but there will be other, more customized interfaces (layered in a hierarchy) for specialized implementations: for example, broad steps like `@split` and so on. Conceptually, a `Step` is a discrete and independent part of a pipeline that is responsible for one particular aspect of data manipulation inside a ZenML pipeline.
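To make the concept concrete, here is a minimal sketch of a step as a plain Python function: a self-contained unit that takes an input artifact and returns an output artifact. The function name and signature are purely illustrative, not the ZenML `Step` API.

```python
# Conceptual sketch only: a "step" as a plain function that performs
# one discrete piece of data manipulation. This is not the ZenML API.
def normalize_step(values):
    """Scale a list of numbers into the [0, 1] range."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid division by zero for constant input
    return [(v - lo) / span for v in values]

print(normalize_step([2, 4, 6]))  # → [0.0, 0.5, 1.0]
```

Because the step depends only on its inputs, a pipeline can cache, reorder, or re-run it independently of the other steps.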
standard steps found in `zenml.core.steps.*` for users to get started. For example, a `SplitStep` is responsible for splitting the data into various splits such as `eval` for downstream steps to then use. However, in essence, virtually any Python function can be a ZenML step as well.
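The idea behind a split step can be sketched in plain Python: partition a list of records into named splits that downstream steps consume by name. The function name, the split names, and the fraction parameter here are assumptions for illustration, not the actual `SplitStep` interface.

```python
import random

# Hedged sketch of what a split step does conceptually,
# not the real zenml.core.steps SplitStep API.
def split_step(records, eval_fraction=0.25, seed=42):
    rng = random.Random(seed)  # fixed seed keeps the split reproducible
    shuffled = records[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - eval_fraction))
    # Downstream steps would pick up the split they need by key.
    return {"train": shuffled[:cut], "eval": shuffled[cut:]}

splits = split_step(list(range(8)))
print(len(splits["train"]), len(splits["eval"]))  # → 6 2
```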
TFRecords or saved model pickles, depending on what the step produces. The serialization and deserialization logic of artifacts is defined by Materializers.
`BaseMaterializer` class. We care about this because steps are not just isolated pieces of work; they are linked together, and the outputs of one step might well be the inputs of the next.
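The materializer idea can be sketched with a small class that knows how to save an object to a storage URI and load it back, so that one step's output artifact becomes the next step's input. This uses `pickle` purely for illustration; the class name and methods are assumptions, and the real `BaseMaterializer` interface differs.

```python
import os
import pickle
import tempfile

# Illustrative materializer-style save/load logic, not the
# actual BaseMaterializer API.
class PickleMaterializer:
    def __init__(self, uri):
        self.uri = uri  # where the artifact lives in the artifact store

    def save(self, obj):
        with open(self.uri, "wb") as f:
            pickle.dump(obj, f)

    def load(self):
        with open(self.uri, "rb") as f:
            return pickle.load(f)

# Step A writes its output artifact; step B reads it back.
path = os.path.join(tempfile.mkdtemp(), "artifact.pkl")
PickleMaterializer(path).save({"accuracy": 0.93})
print(PickleMaterializer(path).load())  # → {'accuracy': 0.93}
```

Keeping the (de)serialization logic in one class means every step agrees on how a given artifact type is written and read.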
`fileio` utilities to do the disk operations without needing to be concerned with whether we're operating on a local or cloud machine.
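The benefit of such utilities is that calling code never branches on where a file lives. A toy sketch of that idea, assuming made-up `write_text`/`read_text` helpers (not the real `fileio` API) and an in-memory dict standing in for a cloud bucket:

```python
import os
import tempfile

_MEMORY = {}  # stands in for a remote object store in this sketch

def write_text(uri, text):
    # Same call whether the target is local disk or "remote" storage.
    if uri.startswith("mem://"):
        _MEMORY[uri] = text
    else:
        with open(uri, "w") as f:
            f.write(text)

def read_text(uri):
    if uri.startswith("mem://"):
        return _MEMORY[uri]
    with open(uri) as f:
        return f.read()

write_text("mem://bucket/config.txt", "hello")
local = os.path.join(tempfile.mkdtemp(), "config.txt")
write_text(local, "hello")
print(read_text("mem://bucket/config.txt") == read_text(local))  # → True
```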
`BaseSettings` class, which means that there are multiple ways to use it.
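One way to picture "multiple ways to use it" is a settings object whose values can be supplied explicitly as keyword arguments or fall back to environment variables. The sketch below uses a plain dataclass; the class name, field names, and env-var convention are all assumptions for illustration, not ZenML's actual settings model.

```python
import os
from dataclasses import dataclass, field

# Hypothetical BaseSettings-style class, for illustration only.
@dataclass
class PipelineSettings:
    # Falls back to an environment variable when not passed explicitly.
    artifact_store: str = field(
        default_factory=lambda: os.environ.get("ARTIFACT_STORE", "/tmp/artifacts")
    )
    enable_cache: bool = True

# 1) Explicit keyword arguments:
s1 = PipelineSettings(artifact_store="s3://my-bucket", enable_cache=False)

# 2) Environment-variable fallback:
os.environ["ARTIFACT_STORE"] = "gs://other-bucket"
s2 = PipelineSettings()

print(s1.artifact_store, s2.artifact_store)  # → s3://my-bucket gs://other-bucket
```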