Trigger DAGs



This page describes different ways to trigger DAGs in Cloud Composer environments.

Airflow provides the following ways to trigger a DAG:

  • Trigger on a schedule. When you create a DAG, you specify a schedule for it. Airflow triggers the DAG automatically based on the specified scheduling parameters.

  • Trigger manually. You can trigger a DAG manually from the Airflow UI, or by running an Airflow CLI command from gcloud.

  • Trigger in response to events. The standard way to trigger a DAG in response to events is to use a sensor.

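As a sketch of the event-driven approach, the following DAG waits for a file to appear before running a downstream task. It assumes Airflow 2; the file path, DAG ID, and task names are illustrative, and the DAG must be deployed to a running Airflow environment to execute:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.sensors.filesystem import FileSensor

with DAG(
    dag_id='example_dag_sensor',
    start_date=datetime(2021, 4, 5),
    schedule_interval='@daily',
) as dag:

    # Poke every 60 seconds until the file exists
    # (the path below is illustrative)
    wait_for_file = FileSensor(
        task_id='wait_for_file',
        filepath='/home/airflow/gcs/data/input.csv',
        poke_interval=60,
    )

    process = BashOperator(
        task_id='process',
        bash_command='echo "file arrived"',
    )

    wait_for_file >> process
```

Deferrable operators and dataset-driven scheduling are other event-based options in newer Airflow versions, but a sensor is the standard pattern.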

Trigger a DAG on a schedule

To trigger a DAG on a schedule:

  1. Specify the start_date and schedule_interval parameters in the DAG file, as described later in this section.
  2. Upload the DAG file to your environment.

Specify scheduling parameters

When you define a DAG, in the schedule_interval parameter, you specify how often you want to run the DAG. In the start_date parameter, you specify when you want Airflow to start scheduling your DAG. Tasks in your DAG can have individual start dates, or you can specify a single start date for all tasks. Based on the minimum start date for tasks in your DAG and on the schedule interval, Airflow schedules DAG runs.

Scheduling works in the following way. After the start_date passes, Airflow waits for the next occurrence of the schedule defined by schedule_interval. It then schedules the first DAG run to happen at the end of that schedule interval. For example, if a DAG is scheduled to run every hour (schedule_interval is 1 hour) and the start date is at 12:00 today, the first DAG run happens at 13:00 today.

The following example shows a DAG that runs every hour starting from 15:00 on April 5, 2021. With the parameters used in the example, Airflow schedules the first DAG run to happen at 16:00 on April 5, 2021.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id='example_dag_schedule',
    # At 15:00 on 5 April, 2021
    start_date=datetime(2021, 4, 5, 15, 0),
    # At minute 0 of every hour
    schedule_interval='0 * * * *') as dag:

    # Output the current date and time
    t1 = BashOperator(
        task_id='date',
        bash_command='date')

For more information about the scheduling parameters, see DAG Runs in the Airflow documentation.

More scheduling parameter examples

The following examples illustrate how scheduling works with different combinations of parameters:

  • If start_date is datetime(2021, 4, 4, 16, 25) and schedule_interval is 30 16 * * *, then the first DAG run happens at 16:30 on 5 April, 2021.
  • If start_date is datetime(2021, 4, 4, 16, 35) and schedule_interval is 30 16 * * *, then the first DAG run happens at 16:30 on 6 April, 2021. Because the start date falls after the 16:30 scheduled time on 4 April, 2021, no DAG run happens on 5 April, 2021. Instead, the first schedule interval starts at 16:30 on 5 April, 2021 and ends at 16:30 on 6 April, 2021, when Airflow runs the DAG.
  • If start_date is datetime(2021, 4, 4), and the schedule_interval is @daily, then the first DAG run is scheduled for 00:00 on April 5, 2021.
  • If start_date is datetime(2021, 4, 4, 16, 30), and the schedule_interval is 0 * * * *, then the first DAG run is scheduled for 18:00 on April 4, 2021. After the start date passes, the nearest scheduled occurrence is 17:00, so the first schedule interval runs from 17:00 to 18:00. Airflow schedules the DAG run at the end of that interval, at 18:00.
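The rule in these examples can be checked with a short standard-library sketch. The helper functions below (next_occurrence, first_dag_run) are illustrative, not part of the Airflow API, and cover only the two cron shapes used above: "at minute M of every hour" and "daily at H:M":

```python
from datetime import datetime, timedelta

def next_occurrence(after, minute, hour=None):
    """Earliest cron occurrence at or after `after`.
    hour=None means 'at :minute of every hour' (e.g. '0 * * * *');
    otherwise 'every day at hour:minute' (e.g. '30 16 * * *')."""
    candidate = after.replace(second=0, microsecond=0)
    while (candidate < after
           or candidate.minute != minute
           or (hour is not None and candidate.hour != hour)):
        candidate += timedelta(minutes=1)
    return candidate

def first_dag_run(start_date, minute, hour=None):
    """Airflow runs a DAG at the *end* of the first full schedule
    interval, i.e. one occurrence past the interval start."""
    interval_start = next_occurrence(start_date, minute, hour)
    return next_occurrence(interval_start + timedelta(minutes=1), minute, hour)

# start_date 16:25, schedule '30 16 * * *' -> first run 16:30 next day
print(first_dag_run(datetime(2021, 4, 4, 16, 25), minute=30, hour=16))
# start_date 16:30, schedule '0 * * * *' -> first run at 18:00
print(first_dag_run(datetime(2021, 4, 4, 16, 30), minute=0))
```

Running the sketch reproduces the dates in the bullet list above.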

Trigger a DAG manually

When you trigger a DAG manually, Airflow performs a DAG run. For example, if you have a DAG that already runs on a schedule, and you trigger this DAG manually, then Airflow executes your DAG once, independently from the actual schedule specified for the DAG.

Console

DAG UI is supported in Cloud Composer 2.0.1 and later versions.

To trigger a DAG from Google Cloud console:

  1. In the Google Cloud console, go to the Environments page.

    Go to Environments

  2. Select an environment to view its details.

  3. On the Environment details page, go to the DAGs tab.

  4. Click the name of a DAG.

  5. On the DAG details page, click Trigger DAG. A new DAG run is created.

Airflow UI

To trigger a DAG from the Airflow web interface:

  1. In the Google Cloud console, go to the Environments page.

    Go to Environments

  2. In the Airflow webserver column, follow the Airflow link for your environment.

  3. Log in with the Google account that has the appropriate permissions.

  4. In the Airflow web interface, on the DAGs page, in the Links column for your DAG, click the Trigger Dag button.

  5. (Optional) Specify the DAG run configuration.

  6. Click Trigger.

gcloud

Run the dags trigger Airflow CLI command:

  gcloud composer environments run ENVIRONMENT_NAME \
    --location LOCATION \
    dags trigger -- DAG_ID

Replace:

  • ENVIRONMENT_NAME with the name of the environment.
  • LOCATION with the region where the environment is located.
  • DAG_ID with the name of the DAG.
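For example, to trigger a hypothetical DAG named composer_sample_dag and pass a run configuration, append arguments after the DAG ID (the --conf flag is a standard Airflow CLI option for dags trigger; the environment name and region below are illustrative):

```shell
# Trigger the DAG and pass a JSON run configuration
gcloud composer environments run example-environment \
  --location us-central1 \
  dags trigger -- composer_sample_dag --conf '{"source":"manual"}'
```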

For more information about running Airflow CLI commands in Cloud Composer environments, see Running Airflow CLI commands.

For more information about the available Airflow CLI commands, see the gcloud composer environments run command reference.

What's next