Dataflow provides an Execution details tab in its web-based monitoring user interface. This tool can help you optimize performance for your jobs and diagnose why your job might be slow or stuck. This document is for any Dataflow user who needs to inspect the execution details of their Dataflow jobs.
This page provides a high-level summary of what you can use this feature for and the user interface layout. For troubleshooting details, read Using the Execution details tab.
To use execution details effectively, you need to understand how the following key concepts apply to Dataflow jobs:
- Fusion optimization: The process by which Dataflow fuses multiple steps or transforms in a user-submitted pipeline to optimize its execution. For more information, read Fusion optimization.
- Stages: The units of execution that Dataflow creates by fusing steps in a pipeline.
- Critical paths: The sequence of stages of a pipeline that contributed to the overall job runtime. This sequence excludes the following:
  - Branches of the pipeline that finished earlier than the overall job.
  - Inputs that did not delay downstream processing.
- Workers: Compute Engine VM instances running a Dataflow job.
- Work items: The units of work, each corresponding to a bundle selected by Dataflow.
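To make fusion concrete, the following is a minimal plain-Python sketch (not Beam's or Dataflow's actual implementation; the step functions are hypothetical) of the idea: two element-wise steps are composed into one fused step, so each element flows through both without an intermediate collection being materialized in between.

```python
# Hypothetical element-wise steps; in a real pipeline these would be
# transforms such as ParDo or Map.
def parse(line):
    return line.split(",")

def count_fields(fields):
    return len(fields)

def fuse(*steps):
    """Compose steps into a single function applied once per element,
    mimicking how fusion avoids materializing intermediate results."""
    def fused(element):
        for step in steps:
            element = step(element)
        return element
    return fused

fused_step = fuse(parse, count_fields)
print([fused_step(line) for line in ["a,b", "a,b,c"]])  # [2, 3]
```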
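The critical-path concept can also be sketched in code. The following illustrative example (stage names, timings, and the walk-back heuristic are all hypothetical, not Dataflow's actual algorithm) starts from the last-finishing stage and repeatedly follows the dependency that finished latest, so branches that finished early are excluded:

```python
# Hypothetical stage timings (seconds since job start) and dependencies.
stages = {
    "Read":   {"start": 0,  "end": 20,  "deps": []},
    "ParseA": {"start": 20, "end": 35,  "deps": ["Read"]},  # fast branch
    "ParseB": {"start": 20, "end": 80,  "deps": ["Read"]},  # slow branch
    "Join":   {"start": 80, "end": 95,  "deps": ["ParseA", "ParseB"]},
    "Write":  {"start": 95, "end": 110, "deps": ["Join"]},
}

def critical_path(stages):
    """Walk back from the last-finishing stage, at each step following
    the dependency that finished latest (the one that gated progress)."""
    current = max(stages, key=lambda name: stages[name]["end"])
    path = [current]
    while stages[current]["deps"]:
        current = max(stages[current]["deps"],
                      key=lambda name: stages[name]["end"])
        path.append(current)
    return list(reversed(path))

print(critical_path(stages))  # ['Read', 'ParseB', 'Join', 'Write']
```

ParseA finished earlier than the overall job and never delayed Join, so it does not appear on the path.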
When to use execution details
The following are common scenarios for using execution details when running Dataflow jobs:
- Your pipeline is stuck and you want to troubleshoot the issue.
- Your pipeline is slow and you want to target pipeline optimization.
- Nothing needs to be fixed, but you want to see the execution details of your pipeline to understand your job.
Enabling execution details
The Stage Workflow view is automatically enabled for all batch and streaming jobs. For batch jobs, you can display additional monitoring information in the Execution details tab by passing the following parameters when you launch your Dataflow job:
These parameters enable the Stage progress and Worker progress views.
This feature does not add CPU, network, or other resource usage on your VMs. Execution details are collected by Dataflow's backend monitoring system, which does not affect the performance of the job.
Once you launch your job, you can view the Execution details tab using the Dataflow monitoring UI. For more information, read Accessing the Dataflow monitoring interface.
How the Execution details tab is used
The Execution details tab includes four views: Stage progress, the Stage info panel (within Stage progress), Stage workflow, and Worker progress. This section walks you through each view and provides examples of successful and unsuccessful Dataflow jobs.
The Stage progress view shows the execution stages of the job, arranged by their start and end times. The duration of each stage is represented by a bar. For example, you can visually identify the longest-running stages of a pipeline by finding the longest bars.
Below each of the bars, you can find a sparkline that shows the progress of the stage over time. To highlight the stages that contributed to the overall runtime of the job, click the Critical path toggle.
The Stage info panel displays a list of steps associated with a stage, ranked by descending wall time. To open this panel, hover over one of the bars and click View details.
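The ordering the Stage info panel applies can be sketched as a simple descending sort on wall time. The step names and timings below are hypothetical, for illustration only:

```python
# Hypothetical per-step wall times (seconds) for one stage; the Stage
# info panel lists steps in descending wall-time order like this.
steps = [
    {"name": "MapToRecord",     "wall_time_sec": 85},
    {"name": "GroupByKey/Read", "wall_time_sec": 1250},
    {"name": "ExtractFields",   "wall_time_sec": 340},
]

ranked = sorted(steps, key=lambda s: s["wall_time_sec"], reverse=True)
for step in ranked:
    print(f'{step["name"]}: {step["wall_time_sec"]}s')
```

The step at the top of the list is usually the first candidate for optimization, because it accounts for the most wall time within the stage.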
Stage workflow shows the execution stages of the job, represented as a workflow graph. To show only the stages that directly contributed to the overall runtime of the job, click the Critical path toggle.
Worker progress shows the workers for a particular stage. Each bar maps to a work item scheduled to a worker. Below each worker, a sparkline tracks that worker's CPU utilization, making it easier to spot underutilization issues.
Due to the density of this visualization, you must filter this view by pre-selecting a stage. First, identify a stage in the Stage progress view. Hover over that stage and click View workers to enter the Worker progress view.
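The CPU sparklines make underutilization visible at a glance. The same check can be expressed over sampled utilization values; the worker names, samples, and threshold below are hypothetical, chosen only to illustrate the idea:

```python
# Hypothetical CPU utilization samples (fraction of one vCPU) per worker.
cpu_samples = {
    "worker-1": [0.92, 0.88, 0.95, 0.90],
    "worker-2": [0.15, 0.10, 0.12, 0.08],  # mostly idle: possible straggler
    "worker-3": [0.85, 0.80, 0.90, 0.87],
}

THRESHOLD = 0.30  # arbitrary cutoff for "underutilized"

def underutilized(samples, threshold=THRESHOLD):
    """Return workers whose average CPU utilization is below the cutoff."""
    return [worker for worker, values in samples.items()
            if sum(values) / len(values) < threshold]

print(underutilized(cpu_samples))  # ['worker-2']
```

A mostly idle worker in an otherwise busy stage often indicates skewed work distribution, which is exactly the pattern the Worker progress view helps you spot.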
- Learn more about Using the Execution details tab for troubleshooting.
- Read about the different components of Dataflow's web-based monitoring user interface.