On September 15, 2026, all Cloud Composer 1 and Cloud Composer 2 version 2.0.x environments will reach their planned end of life, and you will not be able to use them. We recommend that you plan a migration to Cloud Composer 3.
This page explains how to enable support for Deferrable Operators in your environment and how to use deferrable Google Cloud operators in your DAGs.
About Deferrable Operators in Cloud Composer
If you have at least one triggerer instance (or at least two in highly resilient environments), you can use Deferrable Operators and Triggers in your DAGs.
For deferrable operators, Airflow splits task execution into the following stages:
1. Start the operation. In this stage, the task occupies an Airflow worker slot. The task performs an operation that delegates the job to a different service. For example, running a BigQuery job can take from a few seconds to several hours. After creating the job, the operation passes the work identifier (the BigQuery job ID) to an Airflow trigger.
2. The trigger monitors the job until it finishes. In this stage, a worker slot is not occupied. The Airflow triggerer has an asynchronous architecture and is capable of handling hundreds of such jobs. When the trigger detects that the job is finished, it sends an event that triggers the last stage.
3. In the last stage, an Airflow worker executes a callback. This callback can, for example, mark the task as successful, or execute another operation and set the job to be monitored by the triggerer again.
The triggerer is stateless and therefore resilient to interruptions or restarts. Because of this, long-running jobs are resilient to pod restarts, unless the restart happens during the last stage, which is expected to be short. A minimal code sketch of this operator-and-trigger hand-off follows.
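
The sketch below illustrates this pattern with the generic Airflow deferral API (BaseOperator.defer, BaseTrigger, TriggerEvent). The operator, trigger, and job-polling logic are hypothetical placeholders used only to show the hand-off between the three stages; they are not part of any Google Cloud provider package.

    import asyncio
    from typing import Any, AsyncIterator, Dict, Tuple

    from airflow.models.baseoperator import BaseOperator
    from airflow.triggers.base import BaseTrigger, TriggerEvent


    class WaitForJobTrigger(BaseTrigger):
        """Hypothetical trigger that polls an external service until a job finishes."""

        def __init__(self, job_id: str, poll_interval: float = 30.0):
            super().__init__()
            self.job_id = job_id
            self.poll_interval = poll_interval

        def serialize(self) -> Tuple[str, Dict[str, Any]]:
            # Triggers must be serializable so the triggerer process can re-create them.
            return (
                "my_plugin.triggers.WaitForJobTrigger",  # illustrative module path
                {"job_id": self.job_id, "poll_interval": self.poll_interval},
            )

        async def run(self) -> AsyncIterator[TriggerEvent]:
            # Stage 2: runs inside the triggerer; no worker slot is occupied here.
            while not await self._job_is_done():
                await asyncio.sleep(self.poll_interval)
            yield TriggerEvent({"job_id": self.job_id, "status": "done"})

        async def _job_is_done(self) -> bool:
            # A real trigger would call the external service's API asynchronously here.
            return True


    class SubmitAndWaitOperator(BaseOperator):
        """Hypothetical operator that submits a job and defers until it completes."""

        def execute(self, context):
            # Stage 1: delegate the work to another service and keep its identifier.
            job_id = self._submit_job()
            # Free the worker slot and hand monitoring over to the triggerer.
            self.defer(
                trigger=WaitForJobTrigger(job_id=job_id),
                method_name="execute_complete",
            )

        def execute_complete(self, context, event=None):
            # Stage 3: a worker resumes here after the trigger fires its event.
            self.log.info("Job %s finished with status %s", event["job_id"], event["status"])

        def _submit_job(self) -> str:
            # A real operator would call the external service and return its job ID.
            return "job-123"
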
Before you begin
Deferrable Operators and Sensors are available in Cloud Composer 2 environments and require the following:
Cloud Composer version 2.0.31 and later versions
Airflow 2.2.5, 2.3.3, and later versions
Enable support for deferrable operators
An environment component called the Airflow triggerer asynchronously monitors all deferred tasks in your environment. After a deferred operation from such a task is completed, the triggerer passes the task to an Airflow worker.
To use deferrable mode in your DAGs, you need at least one triggerer instance in your environment (or at least two in highly resilient environments). You can configure triggerers when you create an environment, or adjust the number of triggerers and their performance parameters for an existing environment.
Google Cloud operators that support deferrable mode
Only some Airflow operators have been extended to support the deferrable model. The following list is a reference for the operators in the airflow.providers.google.operators.cloud package that support deferrable mode. The column with the minimum required package version represents the earliest package version in which that operator supports deferrable mode. To check whether a particular operator supports deferrable mode, see its provider package documentation; a quick local check is also shown below.
Cloud Composer operators
Operator name | Required apache-airflow-providers-google version
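
If you want to verify a specific operator in your own environment, one quick local check is to inspect whether its constructor accepts the deferrable argument. The sketch below assumes the apache-airflow-providers-google package is installed and uses DataprocSubmitJobOperator as an example; the authoritative reference is still the provider package documentation.

    import inspect

    from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

    # Prints True if the operator's constructor accepts a `deferrable` argument.
    print("deferrable" in inspect.signature(DataprocSubmitJobOperator.__init__).parameters)
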
Use deferrable operators in your DAGs
A common convention for all Google Cloud operators is to enable deferrable mode with the deferrable boolean parameter. If a Google Cloud operator does not have this parameter, it cannot run in deferrable mode. Other operators can have a different convention; for example, some community operators have a separate class with the Async suffix in the name.
The following example DAG uses the DataprocSubmitJobOperator operator in deferrable mode:
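
In the snippet below, PROJECT_ID, PYSPARK_CLUSTER_NAME, and REGION are placeholders for your own project, Dataproc cluster, and region; the import and DAG scaffolding are added here so the example is self-contained.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

    PYSPARK_JOB = {
        "reference": {"project_id": "PROJECT_ID"},
        "placement": {"cluster_name": "PYSPARK_CLUSTER_NAME"},
        "pyspark_job": {
            "main_python_file_uri": "gs://dataproc-examples/pyspark/hello-world/hello-world.py"
        },
    }

    with DAG(
        dag_id="dataproc_deferrable_example",
        start_date=datetime(2024, 1, 1),
        schedule_interval=None,
        catchup=False,
    ):
        DataprocSubmitJobOperator(
            task_id="dataproc-deferrable-example",
            job=PYSPARK_JOB,
            region="REGION",  # placeholder: the region of your Dataproc cluster
            deferrable=True,
        )
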
View triggerer logs
The triggerer generates logs that are available together with the logs of other environment components. For more information about viewing your environment logs, see View logs.
Monitor the triggerer
For more information about monitoring the triggerer component, see Airflow metrics.
In addition to monitoring the triggerer, you can check the number of deferred tasks in the Unfinished Task metrics on the Monitoring dashboard of your environment.
[[["Mudah dipahami","easyToUnderstand","thumb-up"],["Memecahkan masalah saya","solvedMyProblem","thumb-up"],["Lainnya","otherUp","thumb-up"]],[["Sulit dipahami","hardToUnderstand","thumb-down"],["Informasi atau kode contoh salah","incorrectInformationOrSampleCode","thumb-down"],["Informasi/contoh yang saya butuhkan tidak ada","missingTheInformationSamplesINeed","thumb-down"],["Masalah terjemahan","translationIssue","thumb-down"],["Lainnya","otherDown","thumb-down"]],["Terakhir diperbarui pada 2025-08-29 UTC."],[[["\u003cp\u003eDeferrable Operators in Cloud Composer 2 allow for splitting task execution into stages, freeing up worker slots during long-running operations by using Airflow triggerers to monitor jobs.\u003c/p\u003e\n"],["\u003cp\u003eTo use deferrable operators, environments require at least one Airflow triggerer instance, which can be configured during environment creation or by adjusting an existing environment.\u003c/p\u003e\n"],["\u003cp\u003eDeferrable mode is enabled by using the boolean \u003ccode\u003edeferrable\u003c/code\u003e parameter in supported Google Cloud operators, with specific operators and required package versions listed for Cloud Composer, BigQuery, Cloud Build, Cloud SQL, Dataflow, Cloud Data Fusion, Dataproc, Google Kubernetes Engine, and AI Platform.\u003c/p\u003e\n"],["\u003cp\u003eThe Airflow triggerer is stateless, making long-running jobs resilient to interruptions and restarts, and you can monitor triggerer performance using Airflow metrics and view triggerer logs alongside other environment component logs.\u003c/p\u003e\n"],["\u003cp\u003eSupport for deferrable operators requires Cloud Composer 2.0.31 or later and Airflow 2.2.5, 2.3.3, or later versions, and triggerers are not synchronized with the dags or plugins folders.\u003c/p\u003e\n"]]],[],null,["# Use deferrable operators in Airflow DAGs\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\n[Cloud Composer 3](/composer/docs/composer-3/use-deferrable-operators \"View this page for Cloud Composer 3\") \\| **Cloud Composer 2** \\| Cloud Composer 1\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\nThis page explains how to enable support for Deferrable Operators in your\nenvironment and use deferrable Google Cloud operators in your DAGs.\n\nAbout Deferrable Operators in Cloud Composer\n--------------------------------------------\n\nIf you have at least one triggerer instance (or at least two in\n[highly resilient environments](/composer/docs/composer-2/set-up-highly-resilient-environments)), you can use\n[Deferrable Operators and Triggers](https://airflow.apache.org/docs/apache-airflow/stable/concepts/deferring.html) in your DAGs.\n\nFor deferrable operators, Airflow splits task execution into the following stages:\n\n1. Start the operation. In this stage, the task occupies an Airflow worker\n slot. The task performs an operation that delegates the job to a\n different service.\n\n For example, running a BigQuery job can take from a few\n seconds to several hours. After creating the job, the operation\n passes the work identifier (BigQuery job ID) to an\n Airflow trigger.\n2. The trigger monitors the job until it finishes. In this stage, a\n worker slot is not occupied. The Airflow triggerer has asynchronous\n architecture and is capable of handling hundreds of such jobs. When the\n trigger detects that the job is finished, it sends an event that triggers\n the last stage.\n\n3. 
In the last stage, an Airflow worker executes a callback. This callback, for\n example, can mark the task as successful, or execute another operation and\n set the job to be monitored by the triggerer again.\n\nThe triggerer is stateless and therefore resilient to interruptions or\nrestarts. Because of this, long-running jobs are resilient to pod restarts,\nunless the restart happens during the last stage, which is expected to be short.\n\nBefore you begin\n----------------\n\n- In Cloud Composer 2, Deferrable Operators and Sensors require the following:\n\n - Cloud Composer version 2.0.31 and later versions\n - Airflow 2.2.5, 2.3.3, and later versions\n\n| **Caution:** Terraform provider support for Airflow triggerers is in Preview. Use the `google-beta` Terraform provider when changing scale and performance parameters of your environment, even if these changes are not related to the triggerer. For example, when changing parameters for Airflow workers.\n\nEnable support for deferrable operators\n---------------------------------------\n\nAn environment component called *Airflow triggerer* asynchronously monitors all\ndeferred tasks in your environment. After a deferred operation from such a task\nis completed, triggerer passes the task to an Airflow worker.\n\nYou need at least one triggerer instance in your environment (or at least two\nin highly resilient environments) to use deferrable mode in your DAGs.\nYou can configure the triggerers\n[when you create an environment](/composer/docs/composer-2/create-environments#scale-and-performance) or\n[adjust the number of triggerers and performance parameters for an existing environment](/composer/docs/composer-2/scale-environments#triggerer-parameters).\n\n### Google Cloud operators that support deferrable mode\n\nOnly some Airflow operators have been extended to support the deferrable model.\nThe following list is a reference for the operators in the\n[`airflow.providers.google.operators.cloud`](https://airflow.apache.org/docs/apache-airflow-providers-google/stable/operators/cloud/#)\npackage that support the deferrable mode.\nThe column with the minimum required `airflow.providers.google.operators.cloud`\npackage version represents the earliest package version where that operator\nsupports deferrable mode.\n| **Important:** The following tables **don't list every operator that supports deferrable mode** . Many other operators from `airflow.providers.google.operators.cloud` and other provider packages are supported by Airflow and Cloud Composer. These operators are not described on this page. To check if a particular operator supports deferrable mode, see its [Provider package documentation](https://airflow.apache.org/docs/#providers-packages-docs-apache-airflow-providers-index-html) provided by Airflow.\n| **Note:** BigQuery operators in defferable mode **fail if the\nlocation is not set to US** . This happens because of [a bug](https://github.com/apache/airflow/issues/29307) in a dependency of Airflow, not in BigQuery operators. 
\n\n#### Cloud Composer operators\n\n#### BigQuery operators\n\n#### BigQuery Data Transfer Service operators\n\n#### Cloud Build operators\n\n#### Cloud SQL operators\n\n#### Dataflow operators\n\n#### Cloud Data Fusion operators\n\n#### Dataproc operators\n\n#### Google Kubernetes Engine operators\n\n#### AI Platform operators\n\nUse deferrable operators in your DAGs\n-------------------------------------\n\nA common convention for all Google Cloud operators is to enable the\ndeferrable mode with the `deferrable` boolean parameter. If a Google Cloud\noperator does not have this parameter, then it cannot run in the deferrable\nmode. Other operators can have a different convention. For example, some\ncommunity operators have a separate class with the `Async` suffix in the\nname.\n| **Important:** The `dags/` and `/plugins` folders from your environment bucket are not synchronized to the triggerer. You can use triggers that are installed with [PyPI packages](/composer/docs/composer-2/install-python-dependencies), or included in a preinstalled provider package.\n\nThe following example DAG uses `DataprocSubmitJobOperator` operator in the\ndeferrable mode: \n\n PYSPARK_JOB = {\n \"reference\": { \"project_id\": \"\u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e\" },\n \"placement\": { \"cluster_name\": \"\u003cvar translate=\"no\"\u003ePYSPARK_CLUSTER_NAME\u003c/var\u003e\" },\n \"pyspark_job\": {\n \"main_python_file_uri\": \"gs://dataproc-examples/pyspark/hello-world/hello-world.py\"\n },\n }\n\n DataprocSubmitJobOperator(\n task_id=\"dataproc-deferrable-example\",\n job=PYSPARK_JOB,\n deferrable=True,\n )\n\nView triggerer logs\n-------------------\n\nThe triggerer generates logs that are available together with logs of other\nenvironment components. For more information about viewing your environment\nlogs, see [View logs](/composer/docs/composer-2/view-logs#streaming).\n\nMonitor triggerer\n-----------------\n\nFor more information about monitoring the triggerer component, see\n[Airflow metrics](/composer/docs/composer-2/monitor-environments#airflow-metrics).\n\nIn addition to monitoring the triggerer, you can check the number of deferred\ntasks in the **Unfinished Task** metrics on the Monitoring dashboard of your\nenvironment.\n\nWhat's next\n-----------\n\n- [Troubleshooting Airflow triggerer issues](/composer/docs/composer-2/troubleshooting-triggerer)\n- [Airflow triggerer metrics](/composer/docs/composer-2/monitor-environments#airflow-metrics)\n- [Airflow triggerer logs](/composer/docs/composer-2/view-logs#streaming)"]]