Log Stores
Storing and retrieving logs from your ML pipelines.
The log store is a stack component responsible for collecting, storing, and retrieving logs generated during pipeline and step execution. It captures everything from standard logging output to print statements and any messages written to stdout/stderr, making it easy to debug and monitor your ML workflows.
How it works
ZenML's log capture system is designed to be comprehensive and non-intrusive. Here's what happens under the hood:
stdout/stderr wrapping: ZenML wraps the standard output and error streams to capture all printed messages and any output directed to these streams.
Root logger handler: A custom handler is added to Python's root logger to capture all log messages with proper metadata from loggers that propagate to the root.
Log routing: All captured messages are routed through a `LoggingContext` to the active log store in your stack.
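The wrapping and root-handler mechanics can be sketched in plain Python. This is an illustrative sketch only; the class and variable names here are not ZenML's internal ones:

```python
import io
import logging
import sys

class CapturingStream:
    """Duplicates writes to the original stream and a capture buffer."""

    def __init__(self, original, buffer):
        self.original = original
        self.buffer = buffer

    def write(self, text):
        self.original.write(text)   # output still reaches the console
        self.buffer.write(text)     # ...and is captured for the log store
        return len(text)

    def flush(self):
        self.original.flush()

captured = io.StringIO()

# 1. Wrap stdout so print() and third-party writes are captured.
sys.stdout = CapturingStream(sys.stdout, captured)

# 2. Attach a handler to the root logger; any logger that propagates
#    (the default) will reach it, regardless of which module created it.
root = logging.getLogger()
root.addHandler(logging.StreamHandler(captured))
root.setLevel(logging.INFO)

print("hello from print")
logging.getLogger("my.module").info("hello from logging")

sys.stdout = sys.stdout.original  # restore the real stream
# captured.getvalue() now contains both the printed line and the log message
```

ZenML applies the same two hooks (plus stderr) around step execution, which is why output from libraries you don't control still ends up in the log store.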
This approach ensures that you don't miss any output from your pipeline steps, including:
Standard Python `logging` messages
`print()` statements
Output from third-party libraries
Messages from subprocesses that write to stdout/stderr
When to use it
The Log Store is automatically used in every ZenML stack. If you don't explicitly configure a log store, ZenML will use an Artifact Log Store by default, which stores logs in your artifact store.
You should consider configuring a dedicated log store when:
You want to use a centralized logging backend like Datadog, Jaeger, Grafana Tempo, Honeycomb, Lightstep, or Dash0 for log aggregation and analysis
You need advanced log querying capabilities beyond what file-based storage provides
You're running pipelines at scale and need better log management
You want to integrate with your organization's existing observability infrastructure
How to use it
By default, if no log store is explicitly configured in your stack, ZenML automatically creates an Artifact Log Store that uses your artifact store for log storage. This means logging works out of the box without any additional configuration.
To use a different log store, you need to register it and add it to your stack:
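For example, to switch to the Datadog flavor described below. The component name, flags, and configuration keys here are assumptions modeled on ZenML's usual stack-component CLI conventions; run `zenml log-store register --help` to confirm the options your version supports:

```shell
# Hypothetical names and flags, following ZenML's <component> register pattern
zenml log-store register datadog_logs \
    --flavor=datadog \
    --api_key=<YOUR_DD_API_KEY> \
    --site=<YOUR_DD_SITE>

# Add the new log store to an existing stack (flag name is an assumption)
zenml stack update my_stack --log_store=datadog_logs
```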
Once configured, logs are automatically captured during pipeline execution.
Viewing Logs
You can view logs through several methods:
ZenML Dashboard: Navigate to a pipeline run and view step logs directly in the UI.
Programmatically: You can fetch logs directly using the log store:
External platforms: For log stores like Datadog, you can also view logs directly in the platform's native interface.
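A sketch of fetching logs programmatically is below. The `Client` and `get_pipeline_run` calls are real ZenML APIs, but the `log_store` accessor and `fetch_logs` method shown here are assumptions; consult the ZenML SDK docs for the exact fetch interface in your version:

```python
from zenml.client import Client

client = Client()
run = client.get_pipeline_run("<PIPELINE_RUN_ID>")

# Hypothetical: look up the active stack's log store and query it for the
# logs of a specific step run. Method names are assumptions, not the
# documented API.
log_store = client.active_stack.log_store
logs = log_store.fetch_logs(step_run_id=run.steps["trainer"].id)
print(logs)
```

Note that fetch support varies by flavor: the table below marks the generic OTEL flavor as not supporting log fetching, while the artifact and Datadog flavors do.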
Log Store Flavors
ZenML provides several log store flavors out of the box:
artifact
built-in
Default log store that writes logs to your artifact store. Zero configuration required.
otel
built-in
Generic OpenTelemetry log store for any OTEL-compatible backend. Does not support log fetching.
datadog
built-in
Exports logs to Datadog's log management platform with full fetch support.
If you would like to see the available flavors of log stores, you can use the command:
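The command presumably follows ZenML's standard `flavor list` pattern used by other stack components (an assumption; verify against your installed CLI):

```shell
zenml log-store flavor list
```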