`BaseStepConfig`. When such a config object is passed to a step, it is not treated like other artifacts. Instead, it gets passed into the step when the pipeline is instantiated.

The `gamma` parameter is set to 0.001. However, when the pipeline is instantiated, you can override the default like this:
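The following self-contained sketch illustrates the mechanism in plain Python: the dataclass stands in for the real Pydantic-based `BaseStepConfig`, and the step is reduced to a bare function so the snippet runs on its own. The `SVCTrainerStepConfig` and `svc_trainer` names follow the surrounding example; everything else is an assumption for illustration only.

```python
from dataclasses import dataclass


@dataclass
class SVCTrainerStepConfig:
    """Stand-in for a BaseStepConfig subclass."""
    gamma: float = 0.001  # the default mentioned in the text


def svc_trainer(config: SVCTrainerStepConfig = SVCTrainerStepConfig()) -> float:
    # The config is handed to the step when the pipeline is instantiated,
    # not passed between steps like an artifact.
    return config.gamma


print(svc_trainer())                                         # default: 0.001
print(svc_trainer(config=SVCTrainerStepConfig(gamma=0.01)))  # overridden: 0.01
```

In ZenML itself, the override would happen at pipeline instantiation time, by passing a config object to the step when the pipeline is assembled.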
`BaseStepConfig` is implemented as a Pydantic `BaseModel`. Therefore, any type that Pydantic supports is also supported as an attribute type in the `BaseStepConfig`.

You can also supply these values from a YAML file instead: call `pipeline.with_config()` before calling `pipeline.run()`, passing the path of the config file to the `with_config()` method, e.g.:
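A sketch of what that call chain could look like, reusing the `first_pipeline`, `digits_data_loader`, and `svc_trainer` names assumed from the running example; treat this as a sketch against the ZenML API described here rather than a runnable snippet:

```python
# Assemble the pipeline as before, but leave the config at its defaults;
# with_config() then loads the overrides from the YAML file.
pipeline_instance = first_pipeline(
    step_1=digits_data_loader(),
    step_2=svc_trainer(),
)
pipeline_instance.with_config("path_to_config.yaml").run()
```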
The file at `path_to_config.yaml` needs to have the following structure:
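One plausible shape for that file, assuming the trainer is wired into the pipeline as `step_2` and that step overrides live under a `parameters` key (both names are assumptions, not taken from this text):

```yaml
steps:
  step_2:
    parameters:
      gamma: 0.01
```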
Note that `svc_trainer()` still has to be defined with a `config: SVCTrainerStepConfig` argument. The only difference here is that we provide `gamma` via a config file before running the pipeline, instead of explicitly passing an `SVCTrainerStepConfig` object during step creation.

Alternatively, you can run a pipeline with the `zenml pipeline run` command and its `-c` argument:
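For example (the `<PATH_TO_CONFIG_YAML>` placeholder is ours; substitute both placeholders with real paths):

```shell
zenml pipeline run <PATH_TO_PIPELINE_PYTHON_FILE> -c <PATH_TO_CONFIG_YAML>
```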
`<PATH_TO_PIPELINE_PYTHON_FILE>` should point to the Python file where your pipeline function or class is defined. Your steps can also be in that file, but they do not need to be. If your steps are defined in separate code files, you can instead specify that in the YAML, as we will see below.

In addition to `<PATH_TO_PIPELINE_PYTHON_FILE>`, the YAML file now also needs a `name` field that identifies which pipeline in `<PATH_TO_PIPELINE_PYTHON_FILE>` to run. If you defined your pipeline with the `@pipeline` decorator, this name is the name of the decorated function. If you used the Class Based API (which you will learn about in the next section), it will be the name of the class.
For example, if you want to run the pipeline `my_pipeline_a` in `pipelines/my_pipelines.py`, then you would:

- set `name: my_pipeline_a` in the YAML, and
- pass `pipelines/my_pipelines.py` as `<PATH_TO_PIPELINE_PYTHON_FILE>`.
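Concretely, under these assumptions (the config file name `run_config.yaml` is hypothetical):

```shell
# run_config.yaml contains the line:  name: my_pipeline_a
zenml pipeline run pipelines/my_pipelines.py -c run_config.yaml
```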
Steps that live in separate code files are referenced via the `source` field: e.g., if you have a step `my_step_1` in `steps/my_steps.py` that you want to use as `step_1` of your pipeline `my_pipeline_a`, then you would define that in your YAML like this:
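One plausible layout for that YAML, consistent with the `name`, `source`, and `file: ...` fields mentioned in this section (the exact schema is an assumption):

```yaml
name: my_pipeline_a
steps:
  step_1:
    source:
      file: steps/my_steps.py
      name: my_step_1
```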
If the step is defined in the same file as the pipeline, you can omit the `file: ...` line.

Additionally, the `materializers` field of a step can be used to specify custom materializers for your step outputs and inputs.
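A sketch of how that might look; the key names under `materializers` (the output name and the materializer reference) are assumptions, so consult the materializer documentation for the real schema:

```yaml
steps:
  step_1:
    materializers:
      output: my_custom_materializer
```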