Airflow Command-line Interface

Apache Airflow includes a command-line interface (CLI) that you can use to manage Airflow environments. The CLI is useful for tasks such as managing workflows, changing the Airflow environment, and obtaining log information.

Cloud Composer simplifies Airflow CLI commands through the Google Cloud SDK. After you install the SDK, you can use the gcloud command-line tool to run the gcloud composer environments run command, which executes Airflow CLI sub-commands in your environment.

Before you begin

To run Airflow CLI commands with the gcloud command-line tool in a Cloud Composer environment, you need the following permissions:

  • composer.environments.get
  • container.clusters.get
  • container.clusters.list
  • container.clusters.getCredentials

For more information, see Cloud Composer Access Control.

Running Airflow CLI commands

You can run Airflow CLI commands on a Cloud Composer environment by using the following command:

gcloud composer environments run ENVIRONMENT_NAME \
    --location LOCATION SUBCOMMAND

Most gcloud composer commands require a location. You can specify the location by using the --location flag or by setting the default location. Arguments that appear after the -- delimiter are passed through to the Airflow sub-command.
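For example, to set the default location so that later commands can omit the --location flag (us-central1 is just an example value):

```shell
# Set a default Composer location for the gcloud tool.
# Subsequent gcloud composer commands can then omit --location.
gcloud config set composer/location us-central1
```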

For example, to trigger a DAG named sample_quickstart with the run ID 5077 in your Cloud Composer environment:

gcloud composer environments run test-environment \
    --location us-central1 trigger_dag -- sample_quickstart \
    --run_id=5077

For example, to check for syntax errors in DAGs in a test/ directory:

gcloud composer environments run test-environment \
     --location us-central1 \
     list_dags -- -sd /home/airflow/gcs/data/test

Running commands on a private IP environment

To run Airflow CLI commands on a private IP environment, you must run them on a machine that can access the GKE cluster's master endpoint. Your options may vary depending on your private cluster configuration.

If public endpoint access is disabled in your cluster, you must run Airflow commands from a VM in the same VPC network. If such a VM does not already exist, create one in your VPC network.

If public endpoint access and master authorized networks are enabled, you can also run Airflow commands from a machine with a public IP address that has been added to master authorized networks. To enable access from your machine, adjust the configuration of the GKE cluster:

  1. Find the name of the GKE cluster that runs your Composer environment by using this command:

    gcloud beta composer environments describe ENVIRONMENT_NAME \
        --location LOCATION \
        --format="value(config.gkeCluster)"
    
  2. Find the public IP of the machine where you want to run Airflow CLI commands on your environment. If you're using Cloud Shell, use dig to find the external IP address of your Cloud Shell:

    dig +short myip.opendns.com @resolver1.opendns.com
    
  3. Add the external address of your machine to your cluster's list of master authorized networks:

    gcloud container clusters update GKE_CLUSTER \
        --enable-master-authorized-networks \
        --master-authorized-networks EXISTING_AUTH_NETS,SHELL_IP/32
    

    Where:

    • EXISTING_AUTH_NETS is your existing list of master authorized networks. You can find your master authorized networks by running the following command:

      gcloud container clusters describe GKE_CLUSTER \
          --format "flattened(masterAuthorizedNetworksConfig.cidrBlocks[])"
      
    • SHELL_IP is the external IP address of the machine (for example, your Cloud Shell) that you found in the previous step.

You should now be able to run Airflow CLI commands on your private IP environment.
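As a quick connectivity check, you can run the version sub-command, which takes no arguments (ENVIRONMENT_NAME and LOCATION are placeholders for your own values):

```shell
# Smoke test: prints the Airflow version if the CLI is reachable.
gcloud composer environments run ENVIRONMENT_NAME \
    --location LOCATION version
```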

Supported Airflow commands

Cloud Composer supports the following commands for environments running Airflow 1.9.0 and newer:

  • backfill
  • clear
  • connections
  • dag_state
  • kerberos
  • list_dags
  • list_tasks
  • pause
  • pool
  • render
  • run
  • task_failed_deps
  • task_state
  • test
  • trigger_dag
  • unpause
  • variables
  • version
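For example, the variables sub-command can read and write Airflow variables. A sketch, where KEY and VALUE are placeholders and the arguments after -- are passed to the Airflow CLI:

```shell
# Set an Airflow variable in the environment.
gcloud composer environments run test-environment \
    --location us-central1 variables -- --set KEY VALUE

# Read the variable back; its value appears in the command output.
gcloud composer environments run test-environment \
    --location us-central1 variables -- --get KEY
```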

Airflow 1.10.1 adds support for the following command:

  • delete_dag

Airflow 1.10.2 adds support for the following commands:

  • list_dag_runs
  • next_execution
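For example, list_dag_runs takes a DAG ID and prints that DAG's runs. A sketch using the sample_quickstart DAG from the earlier example:

```shell
# List all runs of the sample_quickstart DAG (requires an
# environment running Airflow 1.10.2 or newer).
gcloud composer environments run test-environment \
    --location us-central1 list_dag_runs -- sample_quickstart
```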