On September 15, 2026, all Cloud Composer 1 versions and versions 2.0.x of Cloud Composer 2 will reach their planned end of life. You will not be able to use environments with these versions. We recommend planning migration to Cloud Composer 3. Cloud Composer 2 versions 2.1.x and later are still supported and are not impacted by this change.
Cloud Composer synchronizes specific folders in your environment's bucket to the Airflow components that run in your environment. See Data stored in Cloud Storage for more information. This page describes issues that can disrupt the synchronization process and explains how to troubleshoot them.
Common Issues
The following sections describe symptoms and potential fixes for some common file synchronization issues.
Handling a large number of DAGs and plugins in dags and plugins folders
The contents of the /dags and /plugins folders are synchronized from
your environment's bucket to the local file systems of Airflow workers and
schedulers.
The more data that is stored in these folders, the longer the
synchronization takes. To address such situations:

- Limit the number of files in the /dags and /plugins folders. Store only the
  minimum set of required files.

- Increase the disk space available to Airflow schedulers and workers.

- Increase the CPU and memory of Airflow schedulers and workers, so
  that the sync operation is performed faster.

- If you have a very large number of DAGs, divide them into batches, compress
  the batches into zip archives, and deploy these archives to the /dags folder.
  This approach speeds up the DAG syncing process. Airflow components
  extract the zip archives before processing DAGs.

- Generating DAGs programmatically can also limit
  the number of DAG files stored in the /dags folder.
  See the Programmatic DAGs section in the DAGs Troubleshooting page to avoid
  problems with scheduling and executing programmatically generated DAGs.
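For example, the zip-archive approach described above can be sketched with Python's standard library. The paths, function name, and batch size below are illustrative, not part of Cloud Composer itself; the resulting archives would then be uploaded to the environment's /dags folder, for example with a gcloud storage cp command:

```python
import zipfile
from pathlib import Path


def batch_dags_into_zips(dag_dir: str, out_dir: str, batch_size: int = 50) -> list:
    """Compress DAG .py files into zip archives of at most batch_size files each."""
    dag_files = sorted(Path(dag_dir).glob("*.py"))
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    archives = []
    for i in range(0, len(dag_files), batch_size):
        archive_path = out / f"dags_batch_{i // batch_size}.zip"
        with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
            for dag_file in dag_files[i : i + batch_size]:
                # Store each file at the archive root so Airflow finds it
                # after extracting the archive.
                zf.write(dag_file, arcname=dag_file.name)
        archives.append(str(archive_path))
    return archives
```

Each generated archive could then be uploaded with, for example, gcloud storage cp dags_batch_0.zip gs://BUCKET_NAME/dags/ (BUCKET_NAME stands for your environment's bucket).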
Anti-patterns impacting DAGs and plugins syncing to schedulers, workers and web servers
Cloud Composer synchronizes the contents of the /dags and /plugins
folders to schedulers and workers. Certain objects in these
folders might prevent the synchronization from working correctly or slow it down.

- The /dags folder is synchronized to schedulers and workers.
  This folder is not synchronized to the web server.

- The /plugins folder is synchronized to schedulers, workers, and web servers.
You might encounter the following issues:
- You uploaded gzip-compressed files that use compression transcoding to the
  /dags and /plugins folders. This usually happens if you use the
  --gzip-local-all flag in a gcloud storage cp command to upload data to the
  bucket.

  Solution: Delete the objects that use compression transcoding and re-upload
  them to the bucket.

- One of the objects is named '.'. Such an object is not synchronized to
  schedulers and workers, and it might stop the synchronization entirely.

  Solution: Rename the object.

- A folder and a DAG Python file have the same name, for example a.py.
  In this case, the DAG file is not properly synchronized to Airflow
  components.

  Solution: Remove the folder that has the same name as the DAG Python file.

- One of the objects in the /dags or /plugins folders has a / symbol
  at the end of its name. Such objects can interfere with the
  synchronization process because the / symbol means that an object is a
  folder, not a file.

  Solution: Remove the / symbol from the name of the problematic object.

- Unnecessary files are stored in the /dags and /plugins folders.
  Sometimes the DAGs and plugins that you implement come with additional
  files, such as tests for these components. These files are synchronized to
  Airflow components and increase the time needed to copy them to schedulers,
  workers, and web servers.

  Solution: Don't store any additional, unnecessary files in the /dags and
  /plugins folders.
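Some of the anti-patterns above can be caught by inspecting object names before or after uploading. The sketch below assumes you already have a flat list of object names, for example obtained by listing the bucket with the google-cloud-storage client; the function name and categories are illustrative:

```python
def find_problematic_objects(object_names: list) -> dict:
    """Flag object names known to interfere with /dags and /plugins syncing."""
    problems = {"dot_named": [], "trailing_slash": []}
    for name in object_names:
        if name.endswith("/"):
            # A trailing '/' makes the object look like a folder, not a file.
            problems["trailing_slash"].append(name)
        elif name.rsplit("/", 1)[-1] == ".":
            # An object named '.' is not synchronized and can stall the sync.
            problems["dot_named"].append(name)
    return problems
```

Running such a check as part of a deployment pipeline lets you reject problematic uploads before they reach the environment's bucket.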
Done [Errno 21] Is a directory: '/home/airflow/gcs/dags/...' error is generated by schedulers and workers
This problem happens because objects can have overlapping namespaces in
Cloud Storage, while schedulers and workers use conventional Linux file
systems. For example, it is possible to add both a folder and an object with
the same name to an environment's bucket. When the bucket is synced to the
environment's schedulers and workers, this error is generated, which can lead
to task failures.
To fix this problem, make sure that there are no overlapping namespaces in the
environment's bucket. For example, if both /dags/misc (a file) and
/dags/misc/example_file.txt (another file) are in a bucket, an error is
generated by the scheduler.
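Assuming you can list the bucket's object names (for example, with the google-cloud-storage client), a check like the following can detect such overlaps before they cause sync errors; the helper name is illustrative:

```python
def find_namespace_overlaps(object_names: list) -> list:
    """Return (file, conflicting_object) pairs where a file name is also a folder prefix."""
    overlaps = []
    files = [n for n in object_names if not n.endswith("/")]
    for file_name in files:
        prefix = file_name + "/"
        for other in object_names:
            if other != file_name and other.startswith(prefix):
                # file_name exists both as a file and as a folder containing
                # `other`: this cannot be represented on the Linux file system
                # of a scheduler or worker.
                overlaps.append((file_name, other))
    return overlaps
```

With the example above, the pair ('dags/misc', 'dags/misc/example_file.txt') would be reported, so you can rename or remove one of the objects before the sync fails.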
What's next

Troubleshooting DAG Processor issues
Troubleshooting Airflow scheduler issues
Troubleshooting DAGs