Logging
Learn how to control and customize logging behavior in ZenML pipelines.
By default, ZenML uses a logging handler to capture two types of logs:
Pipeline run logs: Logs collected from your ZenML client while triggering and waiting for a pipeline to run. These logs cover everything that happens client-side: building and pushing container images, triggering the pipeline, waiting for it to start, and waiting for it to finish. These logs are now stored in the artifact store, making them accessible even after the client session ends.
Step logs: Logs collected from the execution of individual steps. These logs only cover what happens during the execution of a single step and originate mostly from the user-provided step code and the libraries it calls.
For step logs, users are free to use the default Python `logging` module or `print` statements, and ZenML's logging handler will capture these logs and store them.
```python
import logging

from zenml import step

@step
def my_step() -> None:
    logging.warning("`Hello`")  # You can use the regular `logging` module.
    print("World.")  # You can utilize `print` statements as well.
```

All these logs are stored within the respective artifact store of your stack. You can visualize the pipeline run logs and step logs in the dashboard as follows:
- Local ZenML server (`zenml login --local`): Both local and remote artifact store logs may be accessible.
- Deployed ZenML server: Local artifact store logs won't be accessible; remote artifact store logs require service connector configuration (see the remote storage guide).
In order for logs to be visible in the dashboard with a deployed ZenML server, you must configure both a remote artifact store and the appropriate service connector to access it. Without this configuration, your logs won't be accessible through the dashboard.


Logging Configuration
Environment Variables and Remote Execution
For all logging configurations below, note:
- Setting environment variables on your local machine only affects local pipeline runs.
- For remote pipeline runs, you must set these variables in the pipeline's execution environment using Docker settings.
Enabling or Disabling Logs Storage
You can control log storage for both pipeline runs and steps:
Step Logs
To disable storing step logs in your artifact store:
- Using the `enable_step_logs` parameter with the `@step` decorator.
- Setting the `ZENML_DISABLE_STEP_LOGS_STORAGE=true` environment variable in the execution environment. This environment variable takes precedence over the parameter mentioned above.
Pipeline Run Logs
To disable storing client-side pipeline run logs in your artifact store:
- Using the `enable_pipeline_logs` parameter with the `@pipeline` decorator.
- Using the runtime configuration.
- Setting the `ZENML_DISABLE_PIPELINE_LOGS_STORAGE=true` environment variable. The environment variable takes precedence over parameters set in the decorator or runtime configuration.
Setting Logging Verbosity
Change the default logging level (INFO) by setting the `ZENML_LOGGING_VERBOSITY` environment variable.
Options: INFO, WARN, ERROR, CRITICAL, DEBUG
For remote pipeline runs, set the variable in the pipeline's execution environment via Docker settings.
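A sketch of setting it locally, assuming the variable name `ZENML_LOGGING_VERBOSITY`:

```python
import os

# Assumed variable name: ZENML_LOGGING_VERBOSITY.
# Must be set before the pipeline run starts in order to take effect.
os.environ["ZENML_LOGGING_VERBOSITY"] = "DEBUG"
```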
Setting Logging Format
Change the default logging format by setting the `ZENML_LOGGING_FORMAT` environment variable. The format must use the %-string formatting style; see the Python `logging` documentation for the available `LogRecord` attributes.
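A sketch, assuming the variable name `ZENML_LOGGING_FORMAT`; since the value is a standard %-style format string, it can be sanity-checked with the stdlib `logging` module:

```python
import logging
import os

# Assumed variable name: ZENML_LOGGING_FORMAT.
os.environ["ZENML_LOGGING_FORMAT"] = "%(asctime)s %(levelname)s %(message)s"

# The same %-style string is valid for a standard logging Formatter.
formatter = logging.Formatter(os.environ["ZENML_LOGGING_FORMAT"])
```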
Disabling Rich Traceback Output
ZenML uses rich for enhanced traceback display. Disable it by setting the `ZENML_ENABLE_RICH_TRACEBACK` environment variable to `false`.
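A minimal sketch (the variable name is an assumption in this example):

```python
import os

# Assumed variable name: ZENML_ENABLE_RICH_TRACEBACK.
# "false" falls back to plain Python tracebacks.
os.environ["ZENML_ENABLE_RICH_TRACEBACK"] = "false"
```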
Disabling Colorful Logging
Disable colorful logging by setting the `ZENML_LOGGING_COLORS_DISABLED` environment variable to `true`.
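A minimal sketch (the variable name is an assumption in this example); for remote runs, set the same variable in the execution environment via Docker settings:

```python
import os

# Assumed variable name: ZENML_LOGGING_COLORS_DISABLED.
os.environ["ZENML_LOGGING_COLORS_DISABLED"] = "true"
```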
Disabling Step Names in Logs
By default, ZenML adds step name prefixes to console logs. These prefixes only appear in console output, not in stored logs. Disable them by setting the `ZENML_DISABLE_STEP_NAMES_IN_LOGS` environment variable to `true`.
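A minimal sketch (the variable name is an assumption in this example):

```python
import os

# Assumed variable name: ZENML_DISABLE_STEP_NAMES_IN_LOGS.
os.environ["ZENML_DISABLE_STEP_NAMES_IN_LOGS"] = "true"
```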
Best Practices for Logging
Use appropriate log levels:
- DEBUG: Detailed diagnostic information
- INFO: Confirmation that things work as expected
- WARNING: Something unexpected happened
- ERROR: A more serious problem occurred
- CRITICAL: A serious error that may prevent continued execution
Include contextual information in logs
Log at decision points to track execution flow
Avoid logging sensitive information
Use structured logging when appropriate
Configure appropriate verbosity for different environments
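The practices above can be sketched with the standard `logging` module (the function and names here are illustrative, not part of ZenML):

```python
import logging

logger = logging.getLogger(__name__)

def process_batch(batch_id: str, num_records: int) -> int:
    # Contextual information: include identifiers in every message.
    logger.info("Processing batch %s with %d records", batch_id, num_records)
    if num_records == 0:
        # Decision point: record why execution took this branch.
        logger.warning("Batch %s is empty; skipping", batch_id)
        return 0
    # DEBUG detail is cheap to emit and easy to filter out in production.
    logger.debug("Batch %s validated", batch_id)
    return num_records
```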