Clean up the Airflow database

On September 15, 2026, all Cloud Composer 1 and Cloud Composer 2 version 2.0.x environments will reach their planned end of life, and you won't be able to use them. We recommend that you plan your migration to Cloud Composer 3.
This page explains how to maintain the Airflow database in your environment.
Automatic database cleanup with a database retention policy
You can configure automatic database cleanup for your Cloud Composer 3 environment by setting a database retention policy. After you set up this policy, records older than a certain period are automatically removed from the Airflow database daily. For more information, see Configure database retention policy.
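With gcloud, this policy is typically set through an update flag on the environment. The following is a minimal sketch, assuming the --airflow-database-retention-days flag from the beta gcloud track and placeholder environment values; verify the exact, current syntax against the linked page:

    # Keep roughly 30 days of Airflow metadata; older records are trimmed daily.
    # ENVIRONMENT_NAME and LOCATION are placeholders.
    gcloud beta composer environments update ENVIRONMENT_NAME \
        --location LOCATION \
        --airflow-database-retention-days 30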
Deprecated cleanup procedures
Before the database retention policy became available in Cloud Composer, we recommended a different approach to automating database cleanup: a database cleanup DAG. This approach is deprecated in Cloud Composer 3. The DAG does redundant work, and you can reduce resource consumption by removing it and replacing it with a database retention policy.
Limits for database size
Over time, the Airflow database in your environment stores more and more data. This data includes information and logs related to past DAG runs, tasks, and other Airflow operations.
- If the Airflow database size is more than 20 GB, you can't upgrade your environment to a later version.
- If the Airflow database size is more than 20 GB, you can't create snapshots.
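To see how close you are to these limits, you can query the database size from code that runs inside your environment, for example in a one-off DAG task. This is a minimal sketch, not taken from this page, and it assumes the PostgreSQL backend that Cloud Composer uses:

    # Minimal sketch: print the Airflow metadata database size.
    # Assumes a PostgreSQL backend; run it where Airflow is configured,
    # for example inside a PythonOperator task.
    from airflow.settings import Session
    from sqlalchemy import text

    session = Session()
    size = session.execute(
        text("SELECT pg_size_pretty(pg_database_size(current_database()))")
    ).scalar()
    print(f"Airflow database size: {size}")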
Maintain database performance
Airflow database performance issues can lead to overall DAG execution issues. Observe database CPU and memory usage statistics. If CPU and memory utilization approaches the limits, the database is overloaded and requires scaling.
The amount of resources available to the Airflow database is controlled by the environment size property of your environment. To scale the database up, change the environment size to a larger tier, as in the sketch below. Increasing the environment size increases the costs of your environment.
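A minimal sketch of such a change with gcloud, using placeholder values; the environment size accepts tiers such as small, medium, and large:

    # Move the environment (and its database) to a larger tier.
    # ENVIRONMENT_NAME and LOCATION are placeholders.
    gcloud composer environments update ENVIRONMENT_NAME \
        --location LOCATION \
        --environment-size medium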
If you use the XCom mechanism to transfer files, make sure that you use it according to Airflow's guidelines. Transferring big files or a large number of files through XCom degrades the Airflow database's performance and can lead to failures when loading snapshots or upgrading your environment. Consider using alternatives such as Cloud Storage to transfer large volumes of data.
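As an illustration of that pattern, the following DAG sketch stores the payload in Cloud Storage and passes only the object's URI through XCom, so the database holds a short string instead of the data itself. The bucket name, object path, and DAG id are placeholders, not values from this page:

    # Minimal sketch: hand off large data through Cloud Storage, not XCom.
    import json

    import pendulum
    from airflow.decorators import dag, task
    from google.cloud import storage

    BUCKET = "your-bucket"  # placeholder

    @dag(schedule=None, start_date=pendulum.datetime(2025, 1, 1), catchup=False)
    def xcom_via_gcs():
        @task
        def produce() -> str:
            # Write the large payload to Cloud Storage.
            blob = storage.Client().bucket(BUCKET).blob("handoff/data.json")
            blob.upload_from_string(json.dumps({"rows": list(range(100_000))}))
            # Only this short URI is stored in the Airflow database as an XCom.
            return f"gs://{BUCKET}/handoff/data.json"

        @task
        def consume(uri: str):
            # Resolve the URI and read the payload back from Cloud Storage.
            bucket_name, path = uri.removeprefix("gs://").split("/", 1)
            data = storage.Client().bucket(bucket_name).blob(path).download_as_text()
            print(len(json.loads(data)["rows"]))

        consume(produce())

    xcom_via_gcs()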
[[["Mudah dipahami","easyToUnderstand","thumb-up"],["Memecahkan masalah saya","solvedMyProblem","thumb-up"],["Lainnya","otherUp","thumb-up"]],[["Sulit dipahami","hardToUnderstand","thumb-down"],["Informasi atau kode contoh salah","incorrectInformationOrSampleCode","thumb-down"],["Informasi/contoh yang saya butuhkan tidak ada","missingTheInformationSamplesINeed","thumb-down"],["Masalah terjemahan","translationIssue","thumb-down"],["Lainnya","otherDown","thumb-down"]],["Terakhir diperbarui pada 2025-08-29 UTC."],[[["\u003cp\u003eThis page outlines how to manually clean up the Airflow database in Cloud Composer 3 environments, as well as automatic alternatives.\u003c/p\u003e\n"],["\u003cp\u003eCloud Composer offers a database retention policy that automatically removes records older than a specified period, and it is preferred over the older database cleanup DAG.\u003c/p\u003e\n"],["\u003cp\u003eExceeding a 20 GB database size in Airflow prevents environment upgrades and snapshot creation, making regular cleanup essential.\u003c/p\u003e\n"],["\u003cp\u003eThe \u003ccode\u003egcloud composer environments run\u003c/code\u003e command can be used to manually trim the database, removing entries older than the specified retention period.\u003c/p\u003e\n"],["\u003cp\u003eDatabase performance issues can cause DAG execution problems, and scaling up the environment size can address these issues, while also using proper Xcom practices to avoid issues with the database.\u003c/p\u003e\n"]]],[],null,["# Clean up the Airflow database\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\n**Cloud Composer 3** \\| [Cloud Composer 2](/composer/docs/composer-2/cleanup-airflow-database \"View this page for Cloud Composer 2\") \\| [Cloud Composer 1](/composer/docs/composer-1/cleanup-airflow-database \"View this page for Cloud Composer 1\")\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\nThis page explains how to maintain the Airflow database in your environment.\n\nAutomatic database cleanup with a database retention policy\n-----------------------------------------------------------\n\nYou can configure automatic database cleanup for your Cloud Composer 3\nenvironment by setting a database retention policy. After you set up this\npolicy, records older than a certain period are automatically removed from the\nAirflow database daily. For more information, see\n[Configure database retention policy](/composer/docs/composer-3/configure-db-retention).\n\nDeprecated cleanup procedures\n-----------------------------\n\n| **Deprecated:** The **maintenance DAG approach is obsolete** in Cloud Composer 3. If you still use the database maintenance DAG, we recommend to remove or pause it and configure a [database retention policy](/composer/docs/composer-3/configure-db-retention) instead.\n\nBefore the database retention policy became available in\nCloud Composer, we recommended a different approach for automating\nthe database cleanup, through a [database cleanup DAG](/composer/docs/composer-2/cleanup-airflow-database). This\napproach is obsolete in Cloud Composer 3. This DAG does redundant work and you\ncan reduce the resource consumption by removing it and replacing it with a\n[database retention policy](/composer/docs/composer-3/configure-db-retention).\n\nLimits for database size\n------------------------\n\nAs the time goes, the Airflow database of your environment stores more and\nmore data. 
This data includes information and logs related to past DAG runs,\ntasks, and other Airflow operations.\n\n- If the Airflow database size is more than\n 20 GB,\n then you can't upgrade your environment to a later version.\n\n- If the Airflow database size is more than 20 GB,\n it is not possible to create snapshots.\n\nMaintain database performance\n-----------------------------\n\n- Airflow database performance issues can lead to overall DAG execution\n issues.\n [Observe Database CPU and memory usage](/composer/docs/composer-3/use-monitoring-dashboard#db-statistics)\n statistics. If CPU and memory utilization approaches the limits, then the\n database is overloaded and requires scaling.\n\n The amount of resources available to the Airflow database is controlled by\n the environment size property of your environment. To scale the database up\n [change the environment size](/composer/docs/composer-3/scale-environments) to a larger tier. Increasing the\n environment size increases the costs of your environment.\n\n- If you use the XCom mechanism to transfer files, make sure that you\n [use it according to Airflow's guidelines](https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/xcoms.html#object-storage-xcom-backend).\n Transferring big files or a large number of files using XCom impacts\n Airflow database's performance and can lead to failures when loading\n snapshots or upgrading your environment. Consider using alternatives such\n as Cloud Storage to transfer large volumes of data.\n\nRemove entries for unused DAGs\n------------------------------\n\nYou can remove database entries for unused DAGs by\n[removing DAGs from the Airflow UI](/composer/docs/composer-3/manage-dags#delete-md).\n\nWhat's next\n-----------\n\n- [Configure database retention policy](/composer/docs/composer-3/configure-db-retention)\n- [Access Airflow command-line interface](/composer/docs/composer-3/access-airflow-cli)"]]