Hyperparameter tuning
Running a hyperparameter tuning trial with ZenML.
A basic iteration over a number of hyperparameter values can be achieved with ZenML using a simple pipeline.
This is an implementation of a basic grid search (across a single dimension) that allows a different learning rate to be used for each invocation of the same train_step. Once that step has run for all the learning rates, the select_model_step finds which hyperparameters gave the best results.
The main challenge of this implementation is that it is currently not possible to pass a variable number of artifacts into a step programmatically, so the select_model_step needs to query all artifacts produced by the previous steps via the ZenML Client instead.