Update environments


This page explains how to update an environment.

About update operations

When you change parameters of your environment, such as scaling and performance parameters, or install custom PyPI packages, your environment updates.

After this operation is completed, changes become available in your environment.

For a single Cloud Composer environment, you can start only one update operation at a time. You must wait for an update operation to complete before starting another environment operation.
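
Before starting a new operation, you can check whether another operation is still in progress with the gcloud CLI. A minimal sketch, assuming your environments are in us-central1 (a placeholder):

gcloud composer operations list \
  --locations us-central1

The output lists recent operations together with their state, so you can wait until no operation is still running.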

Triggerer CPU limits

Cloud Composer version 2.4.4 introduces a different performance scaling approach for the Airflow triggerer component that applies to all versions of Cloud Composer 2.

Before version 2.4.4, Cloud Composer environments could use a maximum of 1 or 2 triggerers. After the change, you can have up to 10 triggerers per environment, but each triggerer is limited to a maximum of 1 vCPU.

Environment update operations fail if your environment is configured with more than 1 vCPU per triggerer. Adjust your triggerer configuration to meet the 1 vCPU limit before performing updates on other components.
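
A minimal sketch of such an adjustment, assuming an environment named example-environment in us-central1 (both placeholders) and assuming your gcloud CLI version supports the triggerer flags:

gcloud composer environments update example-environment \
  --location us-central1 \
  --triggerer-count 2 \
  --triggerer-cpu 1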


How updates affect running Airflow tasks

When you run an update operation, such as installing custom PyPI packages, all Airflow schedulers and workers in your environment restart, and all currently running tasks are terminated. After the update operation is completed, Airflow schedules these tasks for a retry, depending on how you configured retries for your DAGs.
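
If you want interrupted tasks to be retried automatically, make sure your DAGs or your Airflow configuration define retries. For example, a sketch that sets a default retry count through an Airflow configuration override (the environment name, location, and value are placeholders):

gcloud composer environments update example-environment \
  --location us-central1 \
  --update-airflow-configs core-default_task_retries=2

This maps to the [core] default_task_retries Airflow configuration option; retries set directly in a DAG's default_args take precedence over this default.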

Updating with Terraform

Run terraform plan before terraform apply to check whether Terraform will create a new environment instead of updating the existing one.
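
A minimal sketch of this check, run in the directory that contains your Terraform configuration:

terraform plan

# In the plan output, "~ update in-place" means the environment
# is updated, while "-/+ destroy and then create replacement"
# means Terraform recreates it.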

Before you begin

  • Check that your account, the service account of your environment, and the Cloud Composer Service Agent account in your project have the required permissions.

  • The gcloud composer environments update command waits until the operation is finished before it terminates. You can use the --async flag to avoid waiting for the operation to complete, as shown in the example below.
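
For example, a sketch that starts an update asynchronously and then checks on it later (the environment name, location, variable, and operation ID are placeholders):

gcloud composer environments update example-environment \
  --location us-central1 \
  --update-env-variables EXAMPLE_KEY=example_value \
  --async

gcloud composer operations describe OPERATION_ID \
  --location us-central1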

Update environments

For more information about updating your environment, see the documentation pages about specific update operations.
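
As an illustration, a minimal sketch of one such operation, installing a custom PyPI package (the environment name, location, package, and version are placeholders):

gcloud composer environments update example-environment \
  --location us-central1 \
  --update-pypi-package "scikit-learn<=1.3"

Keep in mind that this restarts Airflow components, as described earlier on this page.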

View environment details


Console

  1. In the Google Cloud console, go to the Environments page.

    Go to Environments

  2. In the list of environments, click the name of your environment. The Environment details page opens.


gcloud

Run the following gcloud command:

gcloud composer environments describe ENVIRONMENT_NAME \
  --location LOCATION


Replace the following:

  • ENVIRONMENT_NAME with the name of the environment.
  • LOCATION with the region where the environment is located.


API

Construct an environments.get API request.


GET https://composer.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/environments/ENVIRONMENT_NAME
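
For example, a sketch of the same request made with curl and an access token from the gcloud CLI (PROJECT_ID, LOCATION, and ENVIRONMENT_NAME are placeholders):

curl -X GET \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://composer.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/environments/ENVIRONMENT_NAME"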


Terraform

Run the terraform state show command for your environment's resource.

The name of your environment's Terraform resource might be different than the name of your environment.

terraform state show google_composer_environment.RESOURCE_NAME


Replace:

  • RESOURCE_NAME with the name of your environment's resource.
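
If you don't know the resource name, a sketch of one way to look it up from the Terraform state, assuming your configuration uses the google_composer_environment resource type:

terraform state list | grep google_composer_environment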

Rolling back update changes

In some rare situations, an update operation might be interrupted (for example, because of a timeout) and the requested changes might not be rolled back in all environment components (such as the Airflow webserver).

For example, an update operation might install or remove additional PyPI packages, define or redefine an Airflow or Cloud Composer environment variable, or change some Airflow-related parameters.

Such a situation might occur if an update operation is triggered when other operations are in progress, for example, the Cloud Composer cluster's autoscaling or a maintenance operation.

In such a situation, we recommend repeating the operation.

Duration of update or upgrade operations

Most update or upgrade operations require restarting Airflow components such as Airflow schedulers, workers, and web servers.

Once a component is restarted, it must be initialized. During initialization, Airflow schedulers and workers download the contents of the /dags and /plugins folders from the environment's bucket. The process of syncing files to Airflow schedulers and workers is not instantaneous and depends on the total size and number of all objects in these folders.

We recommend keeping only DAG files in the /dags folder and plugin files in the /plugins folder, and removing all other files. Too much data in the /dags and /plugins folders might slow down the initialization of Airflow components and, in some cases, might prevent initialization altogether.

We recommend keeping less than 30 MB of data in the /dags and /plugins folders, and definitely not exceeding 100 MB.
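
One way to check the current size of these folders is to measure them in the environment's bucket. A minimal sketch, assuming BUCKET_NAME is the bucket from your environment's dagGcsPrefix (shown in the describe output above):

gsutil du -sh gs://BUCKET_NAME/dags gs://BUCKET_NAME/plugins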

