Migrate environments to Cloud Composer 2 (from Airflow 2) using snapshots


This page explains how to transfer DAGs, data, and configuration from your existing Cloud Composer 1, Airflow 2 environments to Cloud Composer 2, Airflow 2.

This migration guide uses the Snapshots feature.

Other migration guides

From | To | Method | Guide
Cloud Composer 1, Airflow 2 | Cloud Composer 2, Airflow 2 | Side-by-side, using snapshots | This guide (snapshots)
Cloud Composer 1, Airflow 1 | Cloud Composer 2, Airflow 2 | Side-by-side, using snapshots | Migration guide (snapshots)
Cloud Composer 1, Airflow 2 | Cloud Composer 2, Airflow 2 | Side-by-side, manual transfer | Manual migration guide
Cloud Composer 1, Airflow 1 | Cloud Composer 2, Airflow 2 | Side-by-side, manual transfer | Manual migration guide
Airflow 1 | Airflow 2 | Side-by-side, manual transfer | Manual migration guide

Before you begin

  • Snapshots are supported in Cloud Composer 2 version 2.0.9 and later. Cloud Composer 1 supports saving environment snapshots in version 1.18.5 and later.

  • Cloud Composer supports side-by-side migration from Cloud Composer 1 to Cloud Composer 2. It is not possible to upgrade from Cloud Composer 1 to Cloud Composer 2 in-place.

  • Check the list of differences between Cloud Composer 1 and Cloud Composer 2.

  • Snapshots are supported only for Airflow databases up to 20 GB in size. If your environment's database is larger than 20 GB, reduce the size of the Airflow database before creating a snapshot.

  • The total number of objects in the /dags, /plugins and /data folders in the environment's bucket must be less than 100,000 to create snapshots.
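    One way to check the object counts before creating a snapshot is to list each folder with gsutil. This is a sketch; the bucket URI below is the example bucket used throughout this guide, so substitute your own environment's bucket:

    ```shell
    # Count objects in the /dags, /plugins, and /data folders of the
    # environment's bucket. Replace the bucket URI with your own.
    BUCKET="gs://us-central1-example-916807e1-bucket"

    for folder in dags plugins data; do
      # `gsutil ls -r` lists objects recursively; `wc -l` counts the lines.
      count=$(gsutil ls -r "${BUCKET}/${folder}/**" 2>/dev/null | wc -l)
      echo "${folder}: ${count} objects"
    done
    ```

    The three counts added together must stay below 100,000.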

  • If you use the XCom mechanism to transfer files, make sure that you use it according to Airflow's guidelines. Transferring big files or a large number of files through XCom impacts the Airflow database's performance and can lead to failures when loading snapshots or upgrading your environment. Consider alternatives such as Cloud Storage for transferring large volumes of data.

Step 1: Pause DAGs in your Cloud Composer 1 environment

To avoid duplicate DAG runs, pause all DAGs in your Cloud Composer 1 environment before saving its snapshot.

You can use any of the following options:

  • In the Airflow web interface, go to DAGs and pause all DAGs manually.

  • Use the composer_dags script to pause all DAGs:

    python3 composer_dags.py --environment COMPOSER_1_ENV \
      --project PROJECT_ID \
      --location COMPOSER_1_LOCATION \
      --operation pause
    

    Replace:

    • COMPOSER_1_ENV with the name of your Cloud Composer 1 environment.
    • PROJECT_ID with your project ID.
    • COMPOSER_1_LOCATION with the region where the environment is located.
  • (Airflow versions 2.9.1 and later) If there are quota errors while pausing a large number of DAGs, you can use the following Airflow CLI command to pause all DAGs at once:

    gcloud composer environments run COMPOSER_1_ENV dags pause \
      --project PROJECT_ID \
      --location COMPOSER_1_LOCATION \
      -- -y --treat-dag-id-as-regex ".*"
    
  • (Airflow versions earlier than 2.9.1) If there are quota errors while pausing a large number of DAGs, it's possible to pause DAGs using the Airflow REST API. Also see Trying the API in the Airflow documentation.
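Regardless of the option you choose, you can verify that the DAGs are actually paused by listing them with the Airflow CLI through gcloud; the `paused` column in the output shows the state of each DAG. A sketch using the placeholder names from this guide:

```shell
# List all DAGs in the Cloud Composer 1 environment. The output includes
# a `paused` column; every DAG should show `True` before you save the
# snapshot. Replace the placeholders with your values.
gcloud composer environments run COMPOSER_1_ENV dags list \
  --project PROJECT_ID \
  --location COMPOSER_1_LOCATION
```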

Step 2: Save the snapshot of your Cloud Composer 1 environment

Console

Create a snapshot of your environment:

  1. In Google Cloud console, go to the Environments page.

    Go to Environments

  2. In the list of environments, click the name of your Cloud Composer 1 environment. The Environment details page opens.

  3. Click Create snapshot.

  4. In the Create snapshot dialog, click Submit. In this guide, you save the snapshot in the Cloud Composer 1 environment's bucket, but you can select a different location if you want.

  5. Wait until Cloud Composer creates the snapshot.

gcloud

  1. Get your Cloud Composer 1 environment's bucket URI:

    1. Run the following command:

      gcloud composer environments describe COMPOSER_1_ENV \
          --location COMPOSER_1_LOCATION \
          --format="value(config.dagGcsPrefix)"
      

      Replace:

      • COMPOSER_1_ENV with the name of your Cloud Composer 1 environment.
      • COMPOSER_1_LOCATION with the region where the environment is located.
    2. In the output, remove the /dags part at the end. The result is the URI of your Cloud Composer 1 environment's bucket.

      For example, change gs://us-central1-example-916807e1-bucket/dags to gs://us-central1-example-916807e1-bucket.

  2. Create a snapshot of your Cloud Composer 1 environment:

    gcloud composer environments snapshots save \
      COMPOSER_1_ENV \
      --location COMPOSER_1_LOCATION \
      --snapshot-location "COMPOSER_1_SNAPSHOTS_FOLDER"
    

    Replace:

    • COMPOSER_1_ENV with the name of your Cloud Composer 1 environment.
    • COMPOSER_1_LOCATION with the region where the Cloud Composer 1 environment is located.
    • COMPOSER_1_SNAPSHOTS_FOLDER with the URI of your Cloud Composer 1 environment's bucket. In this guide, you save the snapshot in the Cloud Composer 1 environment's bucket, but you can select a different location if you want. If you specify a custom location, the service accounts of both environments must have read and write permissions for it.
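The two gcloud steps above can be combined into a short shell sketch that derives the bucket URI from the DAGs prefix and then saves the snapshot into that bucket (the sed expression simply strips the trailing /dags):

```shell
# Derive the bucket URI from the environment's DAGs prefix, then save a
# snapshot into that bucket. Replace the placeholders with your values.
DAGS_PREFIX=$(gcloud composer environments describe COMPOSER_1_ENV \
    --location COMPOSER_1_LOCATION \
    --format="value(config.dagGcsPrefix)")

# Strip the trailing /dags, e.g. gs://us-central1-example-916807e1-bucket/dags
# becomes gs://us-central1-example-916807e1-bucket.
BUCKET_URI=$(echo "${DAGS_PREFIX}" | sed 's|/dags$||')

gcloud composer environments snapshots save COMPOSER_1_ENV \
  --location COMPOSER_1_LOCATION \
  --snapshot-location "${BUCKET_URI}"
```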

Step 3: Create a Cloud Composer 2 environment

Create a Cloud Composer 2 environment. You can start with an environment preset that matches your expected resource demands, and later scale and optimize your environment further.

You do not need to specify configuration overrides and environment variables, since you replace them later when you load the snapshot of your Cloud Composer 1 environment.
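For example, a minimal gcloud sketch that creates a Cloud Composer 2 environment with a preset size. The `composer-2-airflow-2` version alias and the `small` preset are assumptions for illustration; pick the image version and environment size that match your expected resource demands:

```shell
# Create a Cloud Composer 2, Airflow 2 environment using a preset size.
# Replace the placeholders with your values; the image version alias and
# environment size shown here are example choices.
gcloud composer environments create COMPOSER_2_ENV \
  --location COMPOSER_2_LOCATION \
  --image-version composer-2-airflow-2 \
  --environment-size small
```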

Step 4: Load the snapshot to your Cloud Composer 2 environment

Console

To load the snapshot to your Cloud Composer 2 environment:

  1. In Google Cloud console, go to the Environments page.

    Go to Environments

  2. In the list of environments, click the name of your Cloud Composer 2 environment. The Environment details page opens.

  3. Click Load snapshot.

  4. In the Load snapshot dialog, click Browse.

  5. Select the folder with the snapshot. If you use the default location for this guide, this folder is located in your Cloud Composer 1 environment bucket in the /snapshots folder, and its name is the timestamp of the snapshot save operation. For example, us-central1-example-916807e1-bucket/snapshots_example-project_us-central1_example-environment/2022-01-05T18-59-00.

  6. Click Load and wait until Cloud Composer loads the snapshot.

gcloud

Load the snapshot of your Cloud Composer 1 environment to your Cloud Composer 2 environment:

gcloud composer environments snapshots load \
  COMPOSER_2_ENV \
  --location COMPOSER_2_LOCATION \
  --snapshot-path "SNAPSHOT_PATH"

Replace:

  • COMPOSER_2_ENV with the name of your Cloud Composer 2 environment.
  • COMPOSER_2_LOCATION with the region where the Cloud Composer 2 environment is located.
  • SNAPSHOT_PATH with the URI of your Cloud Composer 1 environment's bucket, followed by the path to the snapshot. For example, gs://us-central1-example-916807e1-bucket/snapshots/example-project_us-central1_example-environment_2022-01-05T18-59-00.

Step 5: Unpause DAGs in the Cloud Composer 2 environment

You can use any of the following options:

  • In the Airflow web interface, go to DAGs and unpause all DAGs manually one by one.

  • Use the composer_dags script to unpause all DAGs:

    python3 composer_dags.py --environment COMPOSER_2_ENV \
      --project PROJECT_ID \
      --location COMPOSER_2_LOCATION \
      --operation unpause
    

    Replace:

    • COMPOSER_2_ENV with the name of your Cloud Composer 2 environment.
    • PROJECT_ID with your project ID.
    • COMPOSER_2_LOCATION with the region where the environment is located.
  • (Airflow versions 2.9.1 and later) If there are quota errors while unpausing a large number of DAGs, you can use the following Airflow CLI command to unpause all DAGs at once:

    gcloud composer environments run COMPOSER_2_ENV dags unpause \
      --project PROJECT_ID \
      --location COMPOSER_2_LOCATION \
      -- -y --treat-dag-id-as-regex ".*"
    
  • (Airflow versions earlier than 2.9.1) If there are quota errors while unpausing a large number of DAGs, it's possible to unpause DAGs using the Airflow REST API. Also see Trying the API in the Airflow documentation.

Step 6: Check for DAG errors

  1. In the Airflow web interface, go to DAGs and check for reported DAG syntax errors.

  2. Check that DAG runs are scheduled at the correct time.

  3. Wait for DAG runs to happen in the Cloud Composer 2 environment and check whether they are successful. If a DAG run succeeds, do not unpause that DAG in the Cloud Composer 1 environment; otherwise, a DAG run for the same time and date also happens in your Cloud Composer 1 environment.

  4. If a specific DAG run fails, troubleshoot the DAG until it runs successfully in Cloud Composer 2.

Step 7: Monitor your Cloud Composer 2 environment

After you transfer all DAGs and configuration to the Cloud Composer 2 environment, monitor it for potential issues, failed DAG runs, and overall environment health.

If the Cloud Composer 2 environment runs without problems for a sufficient period of time, consider deleting the Cloud Composer 1 environment.
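When you decide to retire the Cloud Composer 1 environment, you can delete it with gcloud. A sketch; note that deleting an environment does not delete its Cloud Storage bucket, which in this guide also holds the snapshot, so clean up or keep that bucket separately:

```shell
# Delete the Cloud Composer 1 environment once the migration is verified.
# Replace the placeholders with your values. The environment's bucket
# (which holds the snapshot in this guide) is not deleted with it.
gcloud composer environments delete COMPOSER_1_ENV \
  --location COMPOSER_1_LOCATION
```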

What's next