On September 15, 2026, all Cloud Composer 1 and Cloud Composer 2 version 2.0.x environments will reach their planned end of life and you will no longer be able to use them. We recommend planning your migration to Cloud Composer 3.
This page explains how to enable support for deferrable operators in your environment and use deferrable Google Cloud operators in your DAGs.
About deferrable operators in Cloud Composer
If you have at least one triggerer instance (or at least two in highly resilient environments), you can use Deferrable Operators and Triggers in your DAGs.
For deferrable operators, Airflow splits task execution into the following stages (a minimal code sketch of this pattern follows the list):
Start the operation. In this stage, the task occupies an Airflow worker slot. The task performs an operation that delegates the job to a different service.
For example, running a BigQuery job can take from a few seconds to several hours. After creating the job, the operation passes the work identifier (the BigQuery job ID) to an Airflow trigger.
The trigger monitors the job until it finishes. In this stage, a worker slot is not occupied. The Airflow triggerer has an asynchronous architecture and is capable of handling hundreds of such jobs. When the trigger detects that the job is finished, it sends an event that triggers the last stage.
In the last stage, an Airflow worker executes a callback. For example, this callback can mark the task as successful, or execute another operation and set the job to be monitored by the triggerer again.
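These stages correspond to Airflow's deferral API. The following minimal sketch uses a hypothetical ExampleDeferrableOperator (not part of any provider package) to show the pattern: execute() runs in a worker slot and calls self.defer(), the triggerer then waits on the trigger (here a simple TimeDeltaTrigger stands in for a real job-monitoring trigger), and execute_complete() runs on a worker once the trigger fires an event:

    from datetime import timedelta

    from airflow.models.baseoperator import BaseOperator
    from airflow.triggers.temporal import TimeDeltaTrigger


    class ExampleDeferrableOperator(BaseOperator):
        """Hypothetical operator that waits in the triggerer instead of a worker slot."""

        def execute(self, context):
            # Stage 1: runs in a worker slot. Instead of blocking, hand off to a trigger.
            self.defer(
                trigger=TimeDeltaTrigger(timedelta(minutes=10)),  # Stage 2: monitored by the triggerer
                method_name="execute_complete",
            )

        def execute_complete(self, context, event=None):
            # Stage 3: a worker resumes here after the trigger fires its event.
            self.log.info("Trigger fired, marking the task as successful")
            return event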
The triggerer is stateless and therefore resilient to interruptions or restarts. Because of this, long-running jobs are resilient to pod restarts, unless the restart happens during the last stage, which is expected to be short.
Before you begin
Deferrable operators and sensors are available in Cloud Composer 2 environments and require the following:
Cloud Composer version 2.0.31 and later versions
Airflow 2.2.5, 2.3.3, and later versions
Enable support for deferrable operators
An environment component called the Airflow triggerer asynchronously monitors all deferred tasks in your environment. After a deferred operation from such a task is completed, the triggerer passes the task to an Airflow worker.
Google Cloud operators that support deferrable mode
Only some Airflow operators have been extended to support the deferrable model. The following list is a reference for the operators in the airflow.providers.google.operators.cloud package that support deferrable mode. The column with the minimum required airflow.providers.google.operators.cloud package version represents the earliest package version in which that operator supports deferrable mode.
Cloud Composer operators
Operator name | Required apache-airflow-providers-google version
Use deferrable operators in your DAGs
A common convention for all Google Cloud operators is to enable deferrable mode with the deferrable boolean parameter. If a Google Cloud operator does not have this parameter, it cannot run in deferrable mode. Other operators can follow a different convention. For example, some community operators have a separate class with the Async suffix in the name.
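As one illustration of that naming convention, core Airflow provides TimeSensorAsync as the deferrable counterpart of TimeSensor. A minimal sketch, assuming Airflow 2.2 or later:

    from datetime import datetime, time

    from airflow import DAG
    from airflow.sensors.time_sensor import TimeSensorAsync

    with DAG(
        dag_id="time_sensor_async_example",
        start_date=datetime(2024, 1, 1),
        schedule_interval=None,
    ) as dag:
        # Waits for the target time in the triggerer instead of occupying a worker slot.
        wait_until_ten = TimeSensorAsync(
            task_id="wait_until_ten",
            target_time=time(hour=10),
        )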
The following example DAG uses the DataprocSubmitJobOperator operator in deferrable mode:
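    from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

    # Replace PROJECT_ID and PYSPARK_CLUSTER_NAME with your own values.
    PYSPARK_JOB = {
        "reference": {"project_id": "PROJECT_ID"},
        "placement": {"cluster_name": "PYSPARK_CLUSTER_NAME"},
        "pyspark_job": {
            "main_python_file_uri": "gs://dataproc-examples/pyspark/hello-world/hello-world.py"
        },
    }

    DataprocSubmitJobOperator(
        task_id="dataproc-deferrable-example",
        job=PYSPARK_JOB,
        deferrable=True,
    )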
View triggerer logs
The triggerer generates logs that are available together with the logs of other environment components. For more information about viewing your environment logs, see View logs.
Monitor the triggerer
For more information about monitoring the triggerer component, see Airflow metrics.
In addition to monitoring the triggerer, you can check the number of deferred tasks in the Unfinished Task metrics on the Monitoring dashboard of your environment.
[[["Facile da capire","easyToUnderstand","thumb-up"],["Il problema è stato risolto","solvedMyProblem","thumb-up"],["Altra","otherUp","thumb-up"]],[["Difficile da capire","hardToUnderstand","thumb-down"],["Informazioni o codice di esempio errati","incorrectInformationOrSampleCode","thumb-down"],["Mancano le informazioni o gli esempi di cui ho bisogno","missingTheInformationSamplesINeed","thumb-down"],["Problema di traduzione","translationIssue","thumb-down"],["Altra","otherDown","thumb-down"]],["Ultimo aggiornamento 2025-08-29 UTC."],[[["\u003cp\u003eDeferrable Operators in Cloud Composer 2 allow for splitting task execution into stages, freeing up worker slots during long-running operations by using Airflow triggerers to monitor jobs.\u003c/p\u003e\n"],["\u003cp\u003eTo use deferrable operators, environments require at least one Airflow triggerer instance, which can be configured during environment creation or by adjusting an existing environment.\u003c/p\u003e\n"],["\u003cp\u003eDeferrable mode is enabled by using the boolean \u003ccode\u003edeferrable\u003c/code\u003e parameter in supported Google Cloud operators, with specific operators and required package versions listed for Cloud Composer, BigQuery, Cloud Build, Cloud SQL, Dataflow, Cloud Data Fusion, Dataproc, Google Kubernetes Engine, and AI Platform.\u003c/p\u003e\n"],["\u003cp\u003eThe Airflow triggerer is stateless, making long-running jobs resilient to interruptions and restarts, and you can monitor triggerer performance using Airflow metrics and view triggerer logs alongside other environment component logs.\u003c/p\u003e\n"],["\u003cp\u003eSupport for deferrable operators requires Cloud Composer 2.0.31 or later and Airflow 2.2.5, 2.3.3, or later versions, and triggerers are not synchronized with the dags or plugins folders.\u003c/p\u003e\n"]]],[],null,["# Use deferrable operators in Airflow DAGs\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\n[Cloud Composer 3](/composer/docs/composer-3/use-deferrable-operators \"View this page for Cloud Composer 3\") \\| **Cloud Composer 2** \\| Cloud Composer 1\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\nThis page explains how to enable support for Deferrable Operators in your\nenvironment and use deferrable Google Cloud operators in your DAGs.\n\nAbout Deferrable Operators in Cloud Composer\n--------------------------------------------\n\nIf you have at least one triggerer instance (or at least two in\n[highly resilient environments](/composer/docs/composer-2/set-up-highly-resilient-environments)), you can use\n[Deferrable Operators and Triggers](https://airflow.apache.org/docs/apache-airflow/stable/concepts/deferring.html) in your DAGs.\n\nFor deferrable operators, Airflow splits task execution into the following stages:\n\n1. Start the operation. In this stage, the task occupies an Airflow worker\n slot. The task performs an operation that delegates the job to a\n different service.\n\n For example, running a BigQuery job can take from a few\n seconds to several hours. After creating the job, the operation\n passes the work identifier (BigQuery job ID) to an\n Airflow trigger.\n2. The trigger monitors the job until it finishes. In this stage, a\n worker slot is not occupied. The Airflow triggerer has asynchronous\n architecture and is capable of handling hundreds of such jobs. When the\n trigger detects that the job is finished, it sends an event that triggers\n the last stage.\n\n3. 
In the last stage, an Airflow worker executes a callback. This callback, for\n example, can mark the task as successful, or execute another operation and\n set the job to be monitored by the triggerer again.\n\nThe triggerer is stateless and therefore resilient to interruptions or\nrestarts. Because of this, long-running jobs are resilient to pod restarts,\nunless the restart happens during the last stage, which is expected to be short.\n\nBefore you begin\n----------------\n\n- In Cloud Composer 2, Deferrable Operators and Sensors require the following:\n\n - Cloud Composer version 2.0.31 and later versions\n - Airflow 2.2.5, 2.3.3, and later versions\n\n| **Caution:** Terraform provider support for Airflow triggerers is in Preview. Use the `google-beta` Terraform provider when changing scale and performance parameters of your environment, even if these changes are not related to the triggerer. For example, when changing parameters for Airflow workers.\n\nEnable support for deferrable operators\n---------------------------------------\n\nAn environment component called *Airflow triggerer* asynchronously monitors all\ndeferred tasks in your environment. After a deferred operation from such a task\nis completed, triggerer passes the task to an Airflow worker.\n\nYou need at least one triggerer instance in your environment (or at least two\nin highly resilient environments) to use deferrable mode in your DAGs.\nYou can configure the triggerers\n[when you create an environment](/composer/docs/composer-2/create-environments#scale-and-performance) or\n[adjust the number of triggerers and performance parameters for an existing environment](/composer/docs/composer-2/scale-environments#triggerer-parameters).\n\n### Google Cloud operators that support deferrable mode\n\nOnly some Airflow operators have been extended to support the deferrable model.\nThe following list is a reference for the operators in the\n[`airflow.providers.google.operators.cloud`](https://airflow.apache.org/docs/apache-airflow-providers-google/stable/operators/cloud/#)\npackage that support the deferrable mode.\nThe column with the minimum required `airflow.providers.google.operators.cloud`\npackage version represents the earliest package version where that operator\nsupports deferrable mode.\n| **Important:** The following tables **don't list every operator that supports deferrable mode** . Many other operators from `airflow.providers.google.operators.cloud` and other provider packages are supported by Airflow and Cloud Composer. These operators are not described on this page. To check if a particular operator supports deferrable mode, see its [Provider package documentation](https://airflow.apache.org/docs/#providers-packages-docs-apache-airflow-providers-index-html) provided by Airflow.\n| **Note:** BigQuery operators in defferable mode **fail if the\nlocation is not set to US** . This happens because of [a bug](https://github.com/apache/airflow/issues/29307) in a dependency of Airflow, not in BigQuery operators. 
\n\n#### Cloud Composer operators\n\n#### BigQuery operators\n\n#### BigQuery Data Transfer Service operators\n\n#### Cloud Build operators\n\n#### Cloud SQL operators\n\n#### Dataflow operators\n\n#### Cloud Data Fusion operators\n\n#### Dataproc operators\n\n#### Google Kubernetes Engine operators\n\n#### AI Platform operators\n\nUse deferrable operators in your DAGs\n-------------------------------------\n\nA common convention for all Google Cloud operators is to enable the\ndeferrable mode with the `deferrable` boolean parameter. If a Google Cloud\noperator does not have this parameter, then it cannot run in the deferrable\nmode. Other operators can have a different convention. For example, some\ncommunity operators have a separate class with the `Async` suffix in the\nname.\n| **Important:** The `dags/` and `/plugins` folders from your environment bucket are not synchronized to the triggerer. You can use triggers that are installed with [PyPI packages](/composer/docs/composer-2/install-python-dependencies), or included in a preinstalled provider package.\n\nThe following example DAG uses `DataprocSubmitJobOperator` operator in the\ndeferrable mode: \n\n PYSPARK_JOB = {\n \"reference\": { \"project_id\": \"\u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e\" },\n \"placement\": { \"cluster_name\": \"\u003cvar translate=\"no\"\u003ePYSPARK_CLUSTER_NAME\u003c/var\u003e\" },\n \"pyspark_job\": {\n \"main_python_file_uri\": \"gs://dataproc-examples/pyspark/hello-world/hello-world.py\"\n },\n }\n\n DataprocSubmitJobOperator(\n task_id=\"dataproc-deferrable-example\",\n job=PYSPARK_JOB,\n deferrable=True,\n )\n\nView triggerer logs\n-------------------\n\nThe triggerer generates logs that are available together with logs of other\nenvironment components. For more information about viewing your environment\nlogs, see [View logs](/composer/docs/composer-2/view-logs#streaming).\n\nMonitor triggerer\n-----------------\n\nFor more information about monitoring the triggerer component, see\n[Airflow metrics](/composer/docs/composer-2/monitor-environments#airflow-metrics).\n\nIn addition to monitoring the triggerer, you can check the number of deferred\ntasks in the **Unfinished Task** metrics on the Monitoring dashboard of your\nenvironment.\n\nWhat's next\n-----------\n\n- [Troubleshooting Airflow triggerer issues](/composer/docs/composer-2/troubleshooting-triggerer)\n- [Airflow triggerer metrics](/composer/docs/composer-2/monitor-environments#airflow-metrics)\n- [Airflow triggerer logs](/composer/docs/composer-2/view-logs#streaming)"]]