This page describes how to access and view the Apache Airflow logs for Cloud Composer.
Cloud Composer includes the following Airflow logs:
- Airflow logs: These logs are associated with single DAG tasks. You can view
the task logs in the Cloud Storage
logs folder associated with the Cloud Composer environment. You can also view the logs in the Airflow web interface.
- Streaming logs: These logs are a superset of the logs in Airflow. You can view the streaming logs in the Logs Viewer in the Google Cloud Platform Console. You can also view the streaming logs in Stackdriver. For more information about Stackdriver, see Monitoring Cloud Composer Environments.
Logs in Cloud Storage
When you create an environment, Cloud Composer creates a
Cloud Storage bucket and associates the bucket with your environment.
Cloud Composer stores the logs for single DAG tasks in the
logs folder in
the bucket. To determine which bucket is associated with your environment,
see Adding and Updating DAGs.
Log folder directory structure
The logs folder includes a folder for each workflow that has run
on the environment. Each workflow folder includes a folder for its DAGs and sub-DAGs.
Each folder contains the log files for each task. The task filename indicates
when the task started.
The following example shows the logs directory structure for an environment.
us-central1-my-environment-60839224-bucket
└───dags
|   │
|   |   dag_1
|   |   dag_2
|   |   ...
|
└───logs
    │
    └───workflow_1
    |   │
    |   └───dag_1
    |   |   │ datefile_1
    |   |   │ datefile_2
    |   |   │ ...
    |   |
    |   └───sub_dag_1
    |       │ datefile_1
    |       │ datefile_2
    |       │ ...
    |
    └───workflow_2
        │ ...
To prevent data loss, the logs saved in Cloud Storage remain in storage after you delete your environment. You must manually delete the logs from Cloud Storage.
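For example, the logs folder can be deleted from the bucket with gsutil once the logs are no longer needed. This is a sketch: the bucket name below is illustrative, taken from the example directory listing on this page; substitute your environment's actual bucket.

```shell
# Recursively delete the logs folder from the environment's bucket.
# -m parallelizes the deletion of many small log files.
# The bucket name is illustrative; replace it with your own.
gsutil -m rm -r gs://us-central1-my-environment-60839224-bucket/logs
```

Deleting the folder is irreversible, so consider copying any logs you still need to local storage first.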
Viewing task logs in Cloud Storage
To view the log files for DAG tasks:
To list the log files, enter the following command, replacing the VARIABLES with appropriate values:
gsutil ls -r gs://BUCKET/logs
(Optional) To copy a single log or a subfolder, enter the following command, replacing the VARIABLES with appropriate values:
gsutil cp -r gs://BUCKET/logs/PATH_TO_LOG_FILE LOCAL_FILE_OR_DIRECTORY
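As a concrete sketch of the two commands above, assuming the illustrative bucket name and workflow_1/dag_1 path from the example directory structure on this page:

```shell
# List every log file in the environment's logs folder.
gsutil ls -r gs://us-central1-my-environment-60839224-bucket/logs

# Copy the logs for a single DAG to a local directory.
gsutil cp -r gs://us-central1-my-environment-60839224-bucket/logs/workflow_1/dag_1 ./dag_1-logs
```

The -r flag makes both commands recurse into subfolders, so the copy also picks up any sub-DAG log folders nested under the DAG folder.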
Viewing streaming logs in the GCP Console
Cloud Composer produces the following logs:
- airflow-scheduler: The logs generated by the Airflow scheduler.
- airflow-worker: The logs generated as part of workflow and DAG execution.
To view the streaming log files:
Go to the Logs Viewer in the GCP Console.
Select the logs you want to see.
You can filter by properties such as log file and level, predefined label, task name, workflow, and execution date. For more information about selecting and filtering logs, see Viewing Logs. To learn about exporting logs, see Exporting with the Logs Viewer.
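The streaming logs can also be read from the command line with gcloud. The following is a sketch, not a definitive filter: the environment name and the exact filter fields are assumptions based on the airflow-scheduler and airflow-worker log names listed above.

```shell
# Read recent airflow-worker entries for a Composer environment.
# The environment name, location, and project are illustrative placeholders.
gcloud logging read \
  'resource.type="cloud_composer_environment"
   logName:"airflow-worker"' \
  --limit=20 \
  --format=json
```

Narrowing the filter (for example by severity or timestamp) reduces the volume returned, which matters because worker logs grow quickly on busy environments.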