On September 15, 2026, all Cloud Composer 1 and Cloud Composer 2 version 2.0.x environments will reach their planned end of life and you will no longer be able to use them. We recommend that you plan your migration to Cloud Composer 3.
This page describes how you can group tasks in your Airflow pipelines using the following design patterns:

- Grouping tasks in the DAG graph.
- Triggering child DAGs from a parent DAG.
- Grouping tasks with the TaskGroup operator.

Important: Airflow provides SubDAGs as a way to address repeating tasks. Although SubDAGs are a common design pattern for grouping tasks together, they often cause performance and functional issues and are deprecated in Airflow. We recommend that you avoid using SubDAGs to group tasks in your workflows and use one of the alternative approaches described on this page instead.
Group tasks in the DAG graph
To group tasks in certain phases of your workflow, you can use relationships between the tasks in your DAG file.

Consider the following example:

Figure 1. Tasks can be grouped together in an Airflow DAG (click to enlarge)

In this workflow, tasks op-1 and op-2 run together after the initial task start. You can achieve this by grouping the tasks with the statement start >> [task_1, task_2].

The following example shows a complete implementation of this DAG:
from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.dummy import DummyOperator
from airflow.utils.dates import days_ago

DAG_NAME = "all_tasks_in_one_dag"

args = {"owner": "airflow", "start_date": days_ago(1), "schedule_interval": "@once"}

with DAG(dag_id=DAG_NAME, default_args=args) as dag:
    start = DummyOperator(task_id="start")

    task_1 = BashOperator(task_id="op-1", bash_command=":", dag=dag)
    task_2 = BashOperator(task_id="op-2", bash_command=":", dag=dag)

    some_other_task = DummyOperator(task_id="some-other-task")

    task_3 = BashOperator(task_id="op-3", bash_command=":", dag=dag)
    task_4 = BashOperator(task_id="op-4", bash_command=":", dag=dag)

    end = DummyOperator(task_id="end")

    start >> [task_1, task_2] >> some_other_task >> [task_3, task_4] >> end
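As a side note, the same fan-out and fan-in structure can also be expressed with Airflow's chain() helper instead of the bitshift operators. The following is a minimal, equivalent sketch (not part of the original example), reusing the task objects defined above:

from airflow.models.baseoperator import chain

# Equivalent to: start >> [task_1, task_2] >> some_other_task >> [task_3, task_4] >> end
chain(start, [task_1, task_2], some_other_task, [task_3, task_4], end)

Note that when chain() receives two adjacent lists it pairs their elements one-to-one rather than fanning out, so the two forms are only interchangeable when, as here, every list is adjacent to single tasks.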
Trigger child DAGs from a parent DAG

You can trigger one DAG from another DAG with the TriggerDagRunOperator operator.

Consider the following example:

Figure 2. DAGs can be triggered from within a DAG with the TriggerDagRunOperator operator (click to enlarge)
In this workflow, the dag_1 and dag_2 blocks represent a series of tasks that are grouped together in a separate DAG in the Cloud Composer environment.

Implementing this workflow requires two separate DAG files. The controlling DAG file looks like the following:
from airflow import DAG
from airflow.operators.dummy import DummyOperator
from airflow.operators.trigger_dagrun import TriggerDagRunOperator
from airflow.utils.dates import days_ago

with DAG(
    dag_id="controller_dag_to_trigger_other_dags",
    default_args={"owner": "airflow"},
    start_date=days_ago(1),
    schedule_interval="@once",
) as dag:
    start = DummyOperator(task_id="start")

    trigger_1 = TriggerDagRunOperator(
        task_id="dag_1",
        trigger_dag_id="dag-to-trigger",  # Ensure this equals the dag_id of the DAG to trigger
        conf={"message": "Hello World"},
    )
    trigger_2 = TriggerDagRunOperator(
        task_id="dag_2",
        trigger_dag_id="dag-to-trigger",  # Ensure this equals the dag_id of the DAG to trigger
        conf={"message": "Hello World"},
    )

    some_other_task = DummyOperator(task_id="some-other-task")

    end = DummyOperator(task_id="end")

    start >> trigger_1 >> some_other_task >> trigger_2 >> end
Note: The value of trigger_dag_id in TriggerDagRunOperator must match the dag_id value of the DAG that you want to trigger.

The implementation of the child DAG, which is triggered by the controlling DAG, looks like the following:
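from airflow import DAG
from airflow.operators.dummy import DummyOperator
from airflow.utils.dates import days_ago

DAG_NAME = "dag-to-trigger"

args = {"owner": "airflow", "start_date": days_ago(1), "schedule_interval": "None"}

with DAG(dag_id=DAG_NAME, default_args=args) as dag:
    dag_task = DummyOperator(task_id="dag-task")

You must upload both DAG files (/composer/docs/composer-2/manage-dags#add) to your Cloud Composer environment for the DAG to work.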
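The controlling DAG passes a conf payload ({"message": "Hello World"}) to each run it triggers. As a minimal sketch of how a child DAG could consume that payload (the dag_id and the read_conf task below are illustrative, not part of the original example), a PythonOperator can read it from the dag_run object in the task context:

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.utils.dates import days_ago


def print_message(**context):
    # dag_run.conf holds the dictionary passed by TriggerDagRunOperator;
    # it is empty when the run was triggered without a payload.
    message = (context["dag_run"].conf or {}).get("message", "no message")
    print(f"Received: {message}")


with DAG(
    dag_id="dag-to-trigger-with-conf",
    start_date=days_ago(1),
    schedule_interval=None,  # run only when triggered
) as dag:
    read_conf = PythonOperator(task_id="read-conf", python_callable=print_message)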
Group tasks with the TaskGroup operator

You can use the TaskGroup operator to group tasks in your DAG. Tasks defined within a TaskGroup block are still part of the main DAG.

Consider the following example:

Figure 3. Tasks can be visually grouped together in the UI with the TaskGroup operator (click to enlarge)

The tasks op-1 and op-2 are grouped together in a block with the ID taskgroup_1. An implementation of this workflow looks like the following code:
from airflow.models.dag import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.dummy import DummyOperator
from airflow.utils.dates import days_ago
from airflow.utils.task_group import TaskGroup

with DAG(dag_id="taskgroup_example", start_date=days_ago(1)) as dag:
    start = DummyOperator(task_id="start")

    with TaskGroup("taskgroup_1", tooltip="task group #1") as section_1:
        task_1 = BashOperator(task_id="op-1", bash_command=":")
        task_2 = BashOperator(task_id="op-2", bash_command=":")

    with TaskGroup("taskgroup_2", tooltip="task group #2") as section_2:
        task_3 = BashOperator(task_id="op-3", bash_command=":")
        task_4 = BashOperator(task_id="op-4", bash_command=":")

    some_other_task = DummyOperator(task_id="some-other-task")

    end = DummyOperator(task_id="end")

    start >> section_1 >> some_other_task >> section_2 >> end
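By default, a TaskGroup prefixes the IDs of the tasks it contains with its own group ID, so op-1 inside taskgroup_1 is addressed as taskgroup_1.op-1 in the UI and in cross-task references. You can disable this behavior with the TaskGroup parameter prefix_group_id=False.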
[[["Es fácil de entender","easyToUnderstand","thumb-up"],["Me ofreció una solución al problema","solvedMyProblem","thumb-up"],["Otro","otherUp","thumb-up"]],[["Es difícil de entender","hardToUnderstand","thumb-down"],["La información o el código de muestra no son correctos","incorrectInformationOrSampleCode","thumb-down"],["Me faltan las muestras o la información que necesito","missingTheInformationSamplesINeed","thumb-down"],["Problema de traducción","translationIssue","thumb-down"],["Otro","otherDown","thumb-down"]],["Última actualización: 2025-08-29 (UTC)."],[[["\u003cp\u003eThis document outlines methods for grouping tasks within Airflow pipelines, covering approaches like structuring relationships in the DAG graph, triggering child DAGs from a parent DAG, and utilizing the \u003ccode\u003eTaskGroup\u003c/code\u003e operator.\u003c/p\u003e\n"],["\u003cp\u003eGrouping tasks directly in the DAG graph is achieved by defining relationships between tasks, demonstrated with the syntax \u003ccode\u003estart >> [task_1, task_2]\u003c/code\u003e, which executes \u003ccode\u003etask_1\u003c/code\u003e and \u003ccode\u003etask_2\u003c/code\u003e concurrently after \u003ccode\u003estart\u003c/code\u003e.\u003c/p\u003e\n"],["\u003cp\u003eParent DAGs can trigger child DAGs using the \u003ccode\u003eTriggerDagRunOperator\u003c/code\u003e, requiring the \u003ccode\u003etrigger_dag_id\u003c/code\u003e to match the \u003ccode\u003edag_id\u003c/code\u003e of the child DAG.\u003c/p\u003e\n"],["\u003cp\u003eThe \u003ccode\u003eTaskGroup\u003c/code\u003e operator allows for grouping tasks within a DAG, which provides a visual organization in the Airflow UI and simplifies complex workflows.\u003c/p\u003e\n"],["\u003cp\u003eIt is recommended to avoid using SubDAGs for grouping tasks due to performance and functional issues; the document presents superior alternative methods for structuring workflows.\u003c/p\u003e\n"]]],[],null,["# Group tasks inside DAGs\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\n[Cloud Composer 3](/composer/docs/composer-3/group-tasks-inside-dags \"View this page for Cloud Composer 3\") \\| **Cloud Composer 2** \\| [Cloud Composer 1](/composer/docs/composer-1/group-tasks-inside-dags \"View this page for Cloud Composer 1\")\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\nThis page describes how you can group tasks in your Airflow pipelines\nusing the following design patterns:\n\n- Grouping tasks in the DAG graph.\n- Triggering children DAGs from a parent DAG.\n- Grouping tasks with the `TaskGroup` operator.\n\n| **Important:** Airflow provides [SubDAGs](https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/dags.html#subdags) to address repeating tasks. Despite being a common design pattern for grouping tasks together, SubDAGs often cause performance and functional issues, and is deprecated in Airflow. We recommend to **avoid using SubDAGs to group tasks together** in your workflow and prefer one of the alternative approaches described in this page.\n\nGroup tasks in the DAG graph\n----------------------------\n\nTo group tasks in certain phases of your pipeline, you can use relationships\nbetween the tasks in your DAG file.\n\nConsider the following example:\n[](/static/composer/docs/images/workflow-group-dags.png) **Figure 1.** Tasks can be grouped together in an Airflow DAG (click to enlarge)\n\nIn this workflow, tasks `op-1` and `op-2` run together after the initial\ntask `start`. 
You can achieve this by grouping tasks together with the statement\n`start \u003e\u003e [task_1, task_2]`.\n\nThe following example provides a complete implementation of this DAG:\n\n\n from airflow import DAG\n from airflow.operators.bash import BashOperator\n from airflow.operators.dummy import DummyOperator\n from airflow.utils.dates import days_ago\n\n DAG_NAME = \"all_tasks_in_one_dag\"\n\n args = {\"owner\": \"airflow\", \"start_date\": days_ago(1), \"schedule_interval\": \"@once\"}\n\n with DAG(dag_id=DAG_NAME, default_args=args) as dag:\n start = DummyOperator(task_id=\"start\")\n\n task_1 = BashOperator(task_id=\"op-1\", bash_command=\":\", dag=dag)\n\n task_2 = BashOperator(task_id=\"op-2\", bash_command=\":\", dag=dag)\n\n some_other_task = DummyOperator(task_id=\"some-other-task\")\n\n task_3 = BashOperator(task_id=\"op-3\", bash_command=\":\", dag=dag)\n\n task_4 = BashOperator(task_id=\"op-4\", bash_command=\":\", dag=dag)\n\n end = DummyOperator(task_id=\"end\")\n\n start \u003e\u003e [task_1, task_2] \u003e\u003e some_other_task \u003e\u003e [task_3, task_4] \u003e\u003e end\n\n\u003cbr /\u003e\n\n\nTrigger children DAGs from a parent DAG\n---------------------------------------\n\nYou can trigger one DAG from another DAG with the\n[`TriggerDagRunOperator` operator](https://airflow.apache.org/docs/apache-airflow/stable/_api/airflow/operators/trigger_dagrun/).\n\nConsider the following example:\n[](/static/composer/docs/images/workflow-trigger-dags.png) **Figure 2.** DAGs can be triggered from within a DAG with the TriggerDagRunOperator (click to enlarge)\n\nIn this workflow, the blocks `dag_1` and `dag_2` represent a series of tasks\nthat are grouped together in a separate DAG in the Cloud Composer\nenvironment.\n\nThe implementation of this workflow requires two separate DAG files.\nThe controlling DAG file looks like the following:\n\n\n from airflow import DAG\n from airflow.operators.dummy import DummyOperator\n from airflow.operators.trigger_dagrun import TriggerDagRunOperator\n from airflow.utils.dates import days_ago\n\n\n with DAG(\n dag_id=\"controller_dag_to_trigger_other_dags\",\n default_args={\"owner\": \"airflow\"},\n start_date=days_ago(1),\n schedule_interval=\"@once\",\n ) as dag:\n start = DummyOperator(task_id=\"start\")\n\n trigger_1 = TriggerDagRunOperator(\n task_id=\"dag_1\",\n trigger_dag_id=\"dag-to-trigger\", # Ensure this equals the dag_id of the DAG to trigger\n conf={\"message\": \"Hello World\"},\n )\n trigger_2 = TriggerDagRunOperator(\n task_id=\"dag_2\",\n trigger_dag_id=\"dag-to-trigger\", # Ensure this equals the dag_id of the DAG to trigger\n conf={\"message\": \"Hello World\"},\n )\n\n some_other_task = DummyOperator(task_id=\"some-other-task\")\n\n end = DummyOperator(task_id=\"end\")\n\n start \u003e\u003e trigger_1 \u003e\u003e some_other_task \u003e\u003e trigger_2 \u003e\u003e end\n\n\u003cbr /\u003e\n\n\n| **Note:** The value for `trigger_dag_id` inside `TriggerDagRunOperator` must match the `dag_id` value of the DAG you want to trigger.\n\nThe implementation of the child DAG, which is triggered by the controlling\nDAG, looks like the following:\n\n\n from airflow import DAG\n from airflow.operators.dummy import DummyOperator\n from airflow.utils.dates import days_ago\n\n DAG_NAME = \"dag-to-trigger\"\n\n args = {\"owner\": \"airflow\", \"start_date\": days_ago(1), \"schedule_interval\": \"None\"}\n\n with DAG(dag_id=DAG_NAME, default_args=args) as dag:\n dag_task = DummyOperator(task_id=\"dag-task\")\n\n\u003cbr /\u003e\n\n\nYou 
must [upload both DAG files](/composer/docs/composer-2/manage-dags#add)\nin your Cloud Composer environment for the DAG to work.\n\nGrouping tasks with the TaskGroup operator\n------------------------------------------\n\nYou can use the\n[`TaskGroup` operator](https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/dags.html#taskgroups) to group tasks\ntogether in your DAG. Tasks defined within a `TaskGroup` block are still part\nof the main DAG.\n\nConsider the following example:\n[](/static/composer/docs/images/workflow-taskgroup-dag.png) **Figure 3.** Tasks can be visually grouped together in the UI with the TaskGroup operator (click to enlarge)\n\nThe tasks `op-1` and `op-2` are grouped together in a block with ID\n`taskgroup_1`. An implementation of this workflow looks like the following code: \n\n from airflow.models.dag import DAG\n from airflow.operators.bash import BashOperator\n from airflow.operators.dummy import DummyOperator\n from airflow.utils.dates import days_ago\n from airflow.utils.task_group import TaskGroup\n\n with DAG(dag_id=\"taskgroup_example\", start_date=days_ago(1)) as dag:\n start = DummyOperator(task_id=\"start\")\n\n with TaskGroup(\"taskgroup_1\", tooltip=\"task group #1\") as section_1:\n task_1 = BashOperator(task_id=\"op-1\", bash_command=\":\")\n task_2 = BashOperator(task_id=\"op-2\", bash_command=\":\")\n\n with TaskGroup(\"taskgroup_2\", tooltip=\"task group #2\") as section_2:\n task_3 = BashOperator(task_id=\"op-3\", bash_command=\":\")\n task_4 = BashOperator(task_id=\"op-4\", bash_command=\":\")\n\n some_other_task = DummyOperator(task_id=\"some-other-task\")\n\n end = DummyOperator(task_id=\"end\")\n\n start \u003e\u003e section_1 \u003e\u003e some_other_task \u003e\u003e section_2 \u003e\u003e end\n\nWhat's next\n-----------\n\n- [Write DAGs](/composer/docs/composer-2/write-dags)\n- [Test DAGs](/composer/docs/composer-2/test-dags)"]]