Viewing Airflow logs

This page describes how to access and view the Apache Airflow logs for Cloud Composer.

Log types

Cloud Composer includes the following Airflow logs:

Logs in Cloud Storage

When you create an environment, Cloud Composer creates a Cloud Storage bucket and associates it with your environment. Cloud Composer stores the logs for individual DAG tasks in the logs folder in that bucket. For the bucket name, see Determining the storage bucket name.

Log folder directory structure

The logs folder includes a folder for each workflow that has run on the environment. Each workflow folder includes folders for its DAGs and sub-DAGs, and each of those folders contains the log files for each task. The task log filename indicates when the task started.

The following example shows the logs directory structure for an environment.

   └───logs
       │
       └───dag_1
       |   │
       |   └───task_1
       |   |   │   datefile_1
       |   |   │   datefile_2
       |   |   │   ...
       |   |
       |   └───task_2
       |       │   datefile_1
       |       │   datefile_2
       |       │   ...
       |
       └───dag_2
           │   ...
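Each log object path therefore encodes a DAG name, a task name, and a dated log file. As a sketch of how those components line up (the paths below are hypothetical examples, not objects from a real bucket), they can be pulled apart with shell parameter expansion:

```shell
#!/usr/bin/env bash
# Hypothetical object paths in the form logs/DAG/TASK/DATEFILE.
paths=(
  "logs/dag_1/task_1/2021-05-01T00:00:00"
  "logs/dag_1/task_2/2021-05-02T00:00:00"
)

parsed=()
for path in "${paths[@]}"; do
  rest="${path#logs/}"      # drop the leading logs/ prefix
  dag="${rest%%/*}"         # first segment: DAG name
  rest="${rest#*/}"
  task="${rest%%/*}"        # second segment: task name
  datefile="${rest#*/}"     # remainder: file named for the task's start time
  parsed+=("dag=${dag} task=${task} start=${datefile}")
done

printf '%s\n' "${parsed[@]}"
# → dag=dag_1 task=task_1 start=2021-05-01T00:00:00
# → dag=dag_1 task=task_2 start=2021-05-02T00:00:00
```

The same pattern applies to real object names listed from your environment's bucket.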

Log retention

To prevent data loss, the logs saved in Cloud Storage remain in storage after you delete your environment. You must manually delete the logs from Cloud Storage.
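One way to do that manual cleanup is a recursive gsutil delete. A minimal sketch, assuming a hypothetical bucket name; the command is built and printed as a dry run, so nothing is deleted until you run the printed command yourself:

```shell
#!/usr/bin/env bash
# Hypothetical bucket name; substitute your environment's bucket.
BUCKET="us-central1-example-bucket"

# Dry run: build and print the recursive deletion command.
# Run the printed command to actually delete the logs.
cmd="gsutil -m rm -r gs://${BUCKET}/logs"
echo "${cmd}"
```

The -m flag runs the deletion in parallel, which matters when a long-lived environment has accumulated many task log files.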

Before you begin

The following permission is required to access Airflow logs in the Cloud Storage bucket for the Cloud Composer environment: storage.objectAdmin. For more information, see Cloud Composer Access control.
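One way to grant that permission on the bucket is gsutil iam ch. A sketch with hypothetical principal and bucket names, again built and printed as a dry run rather than executed:

```shell
#!/usr/bin/env bash
# Hypothetical principal and bucket; substitute your own values.
MEMBER="user:jane@example.com"
BUCKET="us-central1-example-bucket"

# Dry run: build and print the grant command.
# Run the printed command in an authenticated session to apply it.
cmd="gsutil iam ch ${MEMBER}:objectAdmin gs://${BUCKET}"
echo "${cmd}"
```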

Viewing task logs in Cloud Storage

To view the log files for DAG tasks:

  1. To list the log files, enter the following command, replacing BUCKET with the name of your environment's bucket:

    gsutil ls -r gs://BUCKET/logs

  2. (Optional) To copy a single log or a subfolder, enter the following command, replacing the VARIABLES with appropriate values:

    gsutil cp -r gs://BUCKET/logs/PATH_TO_COPY LOCAL_DESTINATION

Viewing streaming logs in the Cloud Console

Cloud Composer produces the following logs:

  • airflow: The uncategorized logs that Airflow pods generate.
  • airflow-database-init-job: The logs that the Airflow database initialization job generates.
  • airflow-scheduler: The logs the Airflow scheduler generates.
  • dag-processor-manager: The logs of the DAG processor manager (the part of the scheduler that processes DAG files).
  • airflow-webserver: The logs the Airflow web interface generates.
  • airflow-worker: The logs generated as part of workflow and DAG execution.
  • The logs that Admin Activity generates.
  • composer-agent: The logs generated as part of create and update environment operations.
  • gcs-syncd: The logs generated by the file syncing processes.
  • build-log-worker-scheduler: The logs from the local build of the Airflow worker image (during upgrades and Python package installation).
  • build-log-webserver: The logs from the build of the Airflow webserver image (during upgrades and Python package installation).
  • airflow-monitoring: The logs that Airflow monitoring generates.
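These streaming logs can also be queried from the command line with gcloud logging read. A sketch, assuming a hypothetical filter on one of the log names listed above; the query is built and printed as a dry run, to be executed in an authenticated gcloud session:

```shell
#!/usr/bin/env bash
# Hypothetical log name; substitute any of the log types listed above.
LOG_TYPE="airflow-worker"

# Dry run: build and print a query for the most recent entries of that log.
# Run the printed command with gcloud authenticated against your project.
cmd="gcloud logging read logName:${LOG_TYPE} --limit=10"
echo "${cmd}"
```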

To view the streaming log files:

  1. Go to the Google Cloud's operations suite Logs Viewer in the Cloud Console.

  2. Select the logs you want to see.

    You can filter by properties such as log file and level, predefined label, task name, workflow, and execution date. For more information about selecting and filtering logs, see Viewing Logs. To learn about exporting logs, see Exporting with the Logs Viewer.

What's next