Use deferrable operators in Airflow DAGs

On September 15, 2026, all Cloud Composer 1 versions and Cloud Composer 2 versions 2.0.x will reach their planned end of life, and you will no longer be able to use environments with these versions. We recommend planning a migration to Cloud Composer 3. Cloud Composer 2 versions 2.1.x and later remain supported and are not affected by this change.

This page explains how to enable support for deferrable operators in your environment and how to use deferrable Google Cloud operators in your DAGs.

About Deferrable Operators in Cloud Composer

If you have at least one triggerer instance (or at least two in highly resilient environments), you can use Deferrable Operators and Triggers in your DAGs.
For deferrable operators, Airflow splits task execution into the following stages:

1. Start the operation. In this stage, the task occupies an Airflow worker
   slot. The task performs an operation that delegates the job to a
   different service.

   For example, running a BigQuery job can take from a few seconds to
   several hours. After creating the job, the operation passes the work
   identifier (the BigQuery job ID) to an Airflow trigger.

2. The trigger monitors the job until it finishes. In this stage, a worker
   slot is not occupied. The Airflow triggerer has an asynchronous
   architecture and is capable of handling hundreds of such jobs. When the
   trigger detects that the job is finished, it sends an event that triggers
   the last stage.

3. In the last stage, an Airflow worker executes a callback. For example,
   this callback can mark the task as successful, or execute another
   operation and set the job to be monitored by the triggerer again.

The triggerer is stateless and therefore resilient to interruptions or
restarts. Because of this, long-running jobs are resilient to pod restarts,
unless the restart happens during the last stage, which is expected to be short.
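The division of labor between these stages can be illustrated with a small, self-contained sketch: a single asyncio event loop (standing in for the Airflow triggerer) concurrently polls many long-running jobs, so no worker slot is held while a job runs. This is a conceptual illustration only, not Airflow's implementation; `poll_job` and the job IDs are hypothetical.

```python
import asyncio
import random

async def poll_job(job_id: str) -> str:
    """Stand-in for an Airflow trigger: asynchronously wait for an external job."""
    for _ in range(random.randint(1, 5)):
        # Yield control to the event loop; no worker slot is held while waiting.
        await asyncio.sleep(0)
    # Analogous to firing a TriggerEvent when the job is detected as finished.
    return f"{job_id} finished"

async def triggerer(job_ids: list[str]) -> list[str]:
    # One event loop monitors all deferred jobs concurrently,
    # just as one triggerer can watch hundreds of deferred tasks.
    return await asyncio.gather(*(poll_job(j) for j in job_ids))

results = asyncio.run(triggerer([f"job-{i}" for i in range(100)]))
```

A single process handles all 100 "jobs" here; in Airflow, worker slots are needed only for the short first and last stages on either side of this monitoring loop.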
Before you begin

Caution: Terraform provider support for Airflow triggerers is in Preview. Use the google-beta Terraform provider when changing scale and performance parameters of your environment, even if these changes are not related to the triggerer (for example, when changing parameters for Airflow workers).
Enable support for deferrable operators
An environment component called Airflow triggerer asynchronously monitors all
deferred tasks in your environment. After a deferred operation from such a task
is completed, the triggerer passes the task to an Airflow worker.

You need at least one triggerer instance in your environment (or at least two
in highly resilient environments) to use deferrable mode in your DAGs. You can
configure triggerers when you create an environment, or adjust the number of
triggerers and their performance parameters for an existing environment.
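As a sketch of how an existing environment's triggerers can be configured from the command line, the following uses the gcloud composer CLI. The environment name, region, and exact flag names are assumptions; verify them with `gcloud composer environments update --help` for your gcloud version.

```shell
# Hypothetical example: give an existing environment one triggerer instance.
# Flag names (--triggerer-count, --triggerer-cpu, --triggerer-memory) and the
# memory unit (GB) should be verified against your installed gcloud version.
gcloud composer environments update example-environment \
    --location us-central1 \
    --triggerer-count 1 \
    --triggerer-cpu 0.5 \
    --triggerer-memory 1
```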
Google Cloud operators that support deferrable mode
Only some Airflow operators have been extended to support the deferrable model.
The following list is a reference for the operators in the
airflow.providers.google.operators.cloud
package that support deferrable mode.
The column with the minimum required airflow.providers.google.operators.cloud
package version represents the earliest package version where that operator
supports deferrable mode.

These lists don't include every operator that supports deferrable mode. Many
other operators from airflow.providers.google.operators.cloud and from other
provider packages also support it. Deferrable Google Cloud operators are
available for Cloud Composer, BigQuery, BigQuery Data Transfer Service,
Cloud Build, Cloud SQL, Dataflow, Cloud Data Fusion, Dataproc,
Google Kubernetes Engine, and AI Platform. To check whether a particular
operator supports deferrable mode, see its provider package documentation.

Note: BigQuery operators in deferrable mode fail if the location is not set
to US. This happens because of a bug in a dependency of Airflow, not in the
BigQuery operators.
Use deferrable operators in your DAGs

A common convention for all Google Cloud operators is to enable
deferrable mode with the deferrable boolean parameter. If a Google Cloud
operator doesn't have this parameter, then it can't run in deferrable
mode. Other operators can follow a different convention; for example, some
community operators provide a separate class with the Async suffix in the
name.

The dags/ and plugins/ folders from your environment's bucket are not
synchronized to the triggerer. You can use triggers that are installed with
PyPI packages, or that are included in a preinstalled provider package.
The following example DAG uses DataprocSubmitJobOperator operator in the
deferrable mode:
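In this snippet, PROJECT_ID and PYSPARK_CLUSTER_NAME are placeholders for your project ID and Dataproc cluster name; the import line is added so the fragment is self-contained:

```python
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

PYSPARK_JOB = {
    "reference": {"project_id": "PROJECT_ID"},
    "placement": {"cluster_name": "PYSPARK_CLUSTER_NAME"},
    "pyspark_job": {
        "main_python_file_uri": "gs://dataproc-examples/pyspark/hello-world/hello-world.py"
    },
}

DataprocSubmitJobOperator(
    task_id="dataproc-deferrable-example",
    job=PYSPARK_JOB,
    deferrable=True,
)
```

With deferrable=True, the operator submits the Dataproc job and then defers; the triggerer monitors the job instead of an occupied worker slot.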
View triggerer logs

The triggerer generates logs that are available together with the logs of other
environment components. For more information about viewing your environment
logs, see View logs.
Monitor triggerer
For more information about monitoring the triggerer component, see
Airflow metrics.
In addition to monitoring the triggerer, you can check the number of deferred
tasks in the Unfinished Task metrics on the Monitoring dashboard of your
environment.
What's next

- Troubleshooting Airflow triggerer issues
- Airflow triggerer metrics
- Airflow triggerer logs

Last updated 2025-08-25 UTC.