[[["容易理解","easyToUnderstand","thumb-up"],["確實解決了我的問題","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["難以理解","hardToUnderstand","thumb-down"],["資訊或程式碼範例有誤","incorrectInformationOrSampleCode","thumb-down"],["缺少我需要的資訊/範例","missingTheInformationSamplesINeed","thumb-down"],["翻譯問題","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],["上次更新時間:2025-08-26 (世界標準時間)。"],[[["\u003cp\u003eUpdating Cloud Composer environments involves changing parameters like scaling, performance, or installing custom PyPI packages, with only one update operation allowed at a time per environment.\u003c/p\u003e\n"],["\u003cp\u003eChanges to certain settings, such as environment version upgrades, adding or changing PyPI packages, or modifying Airflow worker resources, will result in the termination of all running Airflow tasks, which will then be retried based on configured DAG settings.\u003c/p\u003e\n"],["\u003cp\u003eCloud Composer environments after version 2.4.4 can have up to 10 triggerers, but each is limited to a maximum of 1 vCPU, and environment updates will fail if this limit is exceeded, requiring configuration adjustments.\u003c/p\u003e\n"],["\u003cp\u003eTerraform users should use \u003ccode\u003eterraform plan\u003c/code\u003e to verify whether a parameter change will result in the environment being deleted and a new one created, rather than just being updated.\u003c/p\u003e\n"],["\u003cp\u003eDuring updates, Airflow components are restarted and reinitialized, therefore, to ensure smooth execution, the /dags and /plugins folders should be kept lean, preferably under 30 MB and definitively under 100 MB.\u003c/p\u003e\n"]]],[],null,["# Update Cloud Composer environments\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\n[Cloud Composer 3](/composer/docs/composer-3/update-environments \"View this page for Cloud Composer 3\") \\| **Cloud Composer 2** \\| [Cloud Composer 1](/composer/docs/composer-1/update-environments \"View this page for Cloud Composer 1\")\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\nThis page explains how an environment can be updated.\n\nAbout update operations\n-----------------------\n\nWhen you change parameters of your environment, such as specifying new scaling\nand performance parameters, or installing custom PyPI packages, your\nenvironment updates.\n\nAfter this operation is completed, changes become available in your\nenvironment.\n\nFor a single Cloud Composer environment, you can start only one\nupdate operation at a time. You must wait for an update operation to complete\nbefore starting another environment operation.\n| **Important:** Do not upgrade the GKE clusters of your Cloud Composer environments to newer GKE versions. Use the GKE version pre-configured during the environment creation and rely on auto-upgrades performed by the GKE service.\n\n### Triggerer CPU limits\n\nCloud Composer in version 2.4.4 introduces a different performance\nscaling approach for the [Airflow triggerer](/composer/docs/composer-2/environment-architecture) component that applies to\nall Cloud Composer 2 versions.\n\nBefore version 2.4.4, Cloud Composer environments could\nuse a maximum of 1 or 2 triggerers.\nAfter the change, you can have up to 10 triggerers per environment,\nbut each triggerer is limited to a maximum of 1 vCPU.\n\nEnvironment update operations fail if your environment is configured with more\nthan 1 vCPU per triggerer. 
For more information, see:

- [Configure triggerer resource allocation](/composer/docs/composer-2/scale-environments#workloads-configuration)
- [Adjust triggerer count](/composer/docs/composer-2/scale-environments#triggerer-count)
- [Troubleshoot environment upgrade - triggerer CPU exceeded](/composer/docs/composer-2/troubleshooting-updates-upgrades#triggerer-cpu-limit)

How updates affect running Airflow tasks
----------------------------------------

| **Caution:** Some update operations **terminate all running tasks**.

When you [run an update operation](#update-operations), Airflow schedulers and
workers in your environment might require a restart. In this case, all
currently running tasks are terminated. After the update operation is
completed, Airflow schedules these tasks for a retry, depending on how you
configure retries for your DAGs (see the sketch after the following lists).

| **Note:** Airflow workers can also be restarted as part of environment maintenance, during maintenance windows.

The following changes **cause** Airflow task termination:

- Upgrading your environment to a new version.
- Adding, changing, or deleting custom PyPI packages.
- Changing Cloud Composer environment variables.
- Adding, removing, or changing the values of Airflow configuration option
  overrides.
- Changing Airflow workers' CPU, memory, or storage.
- Reducing the maximum number of Airflow workers, if the new value is
  lower than the number of currently running workers. For example, if an
  environment currently runs three workers, and the maximum is reduced to two.
- Changing the environment's resilience mode.

The following changes **don't cause** Airflow task termination:

- Creating, updating, or deleting a DAG (not an update operation).
- Pausing or unpausing DAGs (not an update operation).
- Changing Airflow variables (not an update operation).
- Changing Airflow connections (not an update operation).
- Enabling or disabling Dataplex Universal Catalog Data Lineage integration.
- Changing the environment's size.
- Changing the number of schedulers.
- Changing Airflow schedulers' CPU, memory, or storage.
- Changing the number of triggerers.
- Changing Airflow triggerers' CPU, memory, or storage.
- Changing Airflow web server's CPU, memory, or storage.
- Increasing or decreasing the minimum number of workers.
- Reducing the maximum number of Airflow workers, if the new value is higher
  than or equal to the number of currently running workers. For example, if an
  environment currently runs two workers, and the maximum is reduced to three.
- Changing maintenance windows.
- Changing scheduled snapshots settings.
- Changing environment labels.
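Retries for terminated tasks are controlled by the `retries` setting of your
DAGs and tasks. If you prefer an environment-wide default over per-DAG
settings, one option is the `[core] default_task_retries` Airflow
configuration option. The following is a minimal sketch of setting it through
an Airflow configuration override; keep in mind that, as listed above,
changing configuration overrides is itself an update operation that terminates
running tasks.

    # Example only: retry every task up to 2 times by default.
    gcloud composer environments update ENVIRONMENT_NAME \
        --location LOCATION \
        --update-airflow-configs core-default_task_retries=2

Replace `ENVIRONMENT_NAME` and `LOCATION` as in the other commands on this
page; the retry count of `2` is only an example value.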
Updating with Terraform
-----------------------

| **Warning:** If you attempt to change a configuration parameter that cannot be updated, Terraform **deletes your environment and creates a new one** with the new parameter value.

Run `terraform plan` before `terraform apply` to see if Terraform creates a new
environment instead of updating it.

Before you begin
----------------

- Check that your account, the service account of your environment, and
  the Cloud Composer Service Agent account in your project have the
  required permissions:

  - Your account must have a role that
    [can trigger environment update operations](/composer/docs/composer-2/access-control#user-account).

  - The service account of your environment must have a role that
    [has enough permissions to perform update operations](/composer/docs/composer-2/access-control#service-account).

  - The Cloud Composer Service Agent account must have
    [permissions to create bindings](/composer/docs/composer-2/access-control#composer-sa) between
    your environment's service account and the Kubernetes service account of
    your environment's cluster.

- The `gcloud composer environments update` command waits until the operation
  is finished before it returns. You can use the `--async` flag to avoid
  waiting for the operation to complete.

Update environments
-------------------

For more information about updating your environment, see other documentation
pages about specific update operations. For example:

- [Override Airflow configuration options](/composer/docs/composer-2/override-airflow-configurations)
- [Set environment variables](/composer/docs/composer-2/set-environment-variables)
- [Install Python dependencies](/composer/docs/composer-2/install-python-dependencies)
- [Scale environments](/composer/docs/composer-2/scale-environments)
- [Update environments to high resilience](/composer/docs/composer-2/set-up-highly-resilient-environments)

View environment details
------------------------

### Console

1. In Google Cloud console, go to the **Environments** page.

   [Go to Environments](https://console.cloud.google.com/composer/environments)

2. In the list of environments, click the name of your environment.
   The **Environment details** page opens.

### gcloud

Run the following `gcloud` command:

    gcloud composer environments describe ENVIRONMENT_NAME \
        --location LOCATION

Replace:

- `ENVIRONMENT_NAME` with the name of the environment.
- `LOCATION` with the region where the environment is located.

### API

Construct an [`environments.get`](/composer/docs/reference/rest/v1/projects.locations.environments/get) API request.

Example:

    GET https://composer.googleapis.com/v1/projects/example-project/
    locations/us-central1/environments/example-environment

### Terraform

Run the `terraform state show` command for your environment's resource.

The name of your environment's Terraform resource might differ from the name
of your environment.

    terraform state show google_composer_environment.RESOURCE_NAME

Replace:

- `RESOURCE_NAME` with the name of your environment's resource.
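For example, before you start a new update operation, you might check only the
environment's state instead of reading the full `describe` output. The
following minimal sketch builds on the `gcloud` command shown above and uses
the standard `--format` flag to select a single field:

    # Print only the environment state, for example RUNNING or UPDATING.
    gcloud composer environments describe ENVIRONMENT_NAME \
        --location LOCATION \
        --format="value(state)"

An environment with an update operation in progress reports `UPDATING` instead
of `RUNNING`.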
### Rolling back update changes

In some rare situations, an update operation might be interrupted
(for example, because of a timeout) and the requested changes might not be
rolled back in all environment components (such as the Airflow web server).

For example, the interrupted operation might have been installing or removing
additional PyPI modules, redefining or defining a new Airflow or
Cloud Composer environment variable, or changing some Airflow-related
parameters.

Such a situation might occur if an update operation is triggered while other
operations are in progress, for example, autoscaling of the
Cloud Composer cluster or a maintenance operation.

In such a situation, we recommend that you repeat the operation.

### Duration of update or upgrade operations

The duration of update and upgrade operations is affected by the following
factors:

- Most update or upgrade operations require restarting Airflow components
  like Airflow schedulers, workers, and web servers. After a component is
  restarted, it must be initialized. During the initialization, Airflow
  schedulers and workers download the contents of the `/dags` and `/plugins`
  folders from the environment's bucket. The process of syncing files to
  Airflow schedulers and workers isn't instantaneous and depends on the total
  size and number of all objects in these folders.

  We recommend that you keep only DAG and plugin files in the `/dags` and
  `/plugins` folders (respectively) and remove all other files. Too much data
  in the `/dags` and `/plugins` folders might slow down the initialization of
  Airflow components and, in certain cases, might prevent the initialization
  entirely.

  We recommend that you keep less than 30 MB of data in the `/dags` and
  `/plugins` folders, and that you definitely not exceed 100 MB. You can check
  the current size of these folders with the sketch at the end of this page.
  For more information, also see
  [Handling large number of DAGs and plugins](/composer/docs/composer-2/troubleshooting-dags#large-number-of-dags).

- The size of the Airflow database might significantly increase the duration
  of upgrade operations. We recommend that you keep the Airflow database size
  under control by
  [maintaining the Airflow database](/composer/docs/composer-2/cleanup-airflow-database) of your environment.

What's next
-----------

- [Upgrade environments](/composer/docs/composer-2/upgrade-environments)
- [Override Airflow configuration options](/composer/docs/composer-2/override-airflow-configurations)
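As a follow-up to the guidance in the Duration of update or upgrade operations
section, the following minimal sketch shows one way to check how much data is
stored in the `/dags` and `/plugins` folders before you trigger an update. It
assumes that `gsutil` is available and that `BUCKET_NAME` is a placeholder for
your environment's bucket name, which is shown on the **Environment details**
page.

    # Print the total, human-readable size of the /dags and /plugins folders.
    gsutil du -sh gs://BUCKET_NAME/dags gs://BUCKET_NAME/plugins

If the reported sizes approach the 30 MB recommendation, consider moving files
that aren't DAGs or plugins out of these folders before you update the
environment.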