Post-execution workflow

After executing a pipeline, the user needs to be able to fetch it from history and inspect its runs, steps, and outputs. This page captures these workflows at a high level.

Component Hierarchy

In the context of a post-execution workflow, there is an implied hierarchy of some basic ZenML components:
repository -> pipelines -> runs -> steps -> outputs

# where -> implies a 1-many relationship.
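
As a quick sketch of how these levels chain together (each call is covered in the sections below; the pipeline name "my_pipeline" is hypothetical):

from zenml.core.repo import Repository

# walk the hierarchy from the repository down to a single output value
repo = Repository()
pipeline = repo.get_pipeline(pipeline_name="my_pipeline")  # hypothetical name
run = pipeline.runs[-1]     # latest run of the pipeline
step = run.steps[0]         # first step of that run
value = step.output.read()  # value of the step's single output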

Repository

The highest-level Repository object is where to start.
from zenml.core.repo import Repository

repo = Repository()

Pipelines

# get all pipelines from all stacks
pipelines = repo.get_pipelines()

# or get one pipeline by name and/or stack key
pipeline = repo.get_pipeline(pipeline_name=..., stack_key=...)
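
For example, to see which pipelines exist in the repository (this assumes each returned pipeline object exposes a name attribute):

# list all pipelines by name
for p in repo.get_pipelines():
    print(p.name)  # `name` is assumed to be available on the pipeline object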

Runs

runs = pipeline.runs  # all runs of a pipeline, chronologically ordered
run = runs[-1]  # latest run
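
A small example of inspecting the run history (this assumes each run object exposes a name attribute):

# runs are ordered oldest to newest, so the last entry is the latest run
print(f"{len(runs)} runs so far")
print(f"latest run: {runs[-1].name}")  # `name` is assumed to be available on the run object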

Steps

# at this point we switch from the `get_` paradigm to properties
steps = run.steps  # all steps of the pipeline run
step = steps[0]
print(step.name)
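
Using only the properties shown above, you can also list all steps of a run or pick one out by name (the step name "trainer" below is hypothetical):

# print the names of all steps of the run in order
for step in run.steps:
    print(step.name)

# or pick a specific step by scanning the list
trainer_step = next(s for s in run.steps if s.name == "trainer")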

Outputs

# the outputs of a step
# if there are multiple outputs, they are accessible by name
output = step.outputs["output_name"]

# if there is only one output, use the `.output` property instead
output = step.output

# read() returns the value via the materializer originally used in the pipeline
output.read()
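
Since the outputs of a step are accessible by name, you can also iterate over all of them (this assumes `step.outputs` behaves like a dictionary mapping output names to artifacts):

# inspect every output of a step
for name, artifact in step.outputs.items():
    print(name, artifact.read())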

Materializing outputs (or inputs)

Once an output artifact is acquired from history, it can be read into memory with an explicitly chosen materializer (or passed to a Visualizer, as shown further below).
# PandasMaterializer is ZenML's pandas materializer; the exact import path depends on your ZenML version
df = output.read(materializer_class=PandasMaterializer)
df.head()

Retrieving a model

# KerasModelMaterializer is assumed to come from ZenML's Keras/TensorFlow integration
model = output.read(materializer_class=KerasModelMaterializer)
model  # a keras.Model instance
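
Once read back, the object is a regular Keras model and can be used accordingly (`x_test` below is a hypothetical test set):

model.summary()                      # standard Keras model summary
predictions = model.predict(x_test)  # run inference on a hypothetical test set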

Visuals

Seeing statistics

from zenml.integrations.facets.visualizers.facet_statistics_visualizer import (
    FacetStatisticsVisualizer,
)

FacetStatisticsVisualizer().visualize(output)
It produces the following visualization:
[Figure: statistics for the Boston housing dataset]