When you run an update operation, the Airflow schedulers and workers in your environment might require a restart. In this case, all currently running tasks are terminated. After the update operation is completed, Airflow schedules these tasks for a retry, depending on how you configure retries for your DAGs.
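For example, you can set a default retry count for tasks through an Airflow configuration override. The following is a minimal sketch, assuming a hypothetical environment named example-environment in us-central1; note that overriding an Airflow configuration option is itself an update operation that terminates running tasks (see the list below). Retries can also be set per DAG or per task in the DAG code itself.

```bash
# Hypothetical environment name and location; replace with your own values.
# Overriding an Airflow configuration option is itself an update operation,
# so tasks running at the time of the change are terminated and later retried.
gcloud composer environments update example-environment \
    --location us-central1 \
    --update-airflow-configs=core-default_task_retries=2
```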
The following changes cause Airflow task termination:

- Upgrading your environment to a new version.
- Adding, changing, or deleting custom PyPI packages.
- Changing Cloud Composer environment variables.
- Adding or removing Airflow configuration option overrides, or changing their values.
- Changing Airflow workers' CPU, memory, or storage (an example command is sketched after this list).
- Reducing the maximum number of Airflow workers, if the new value is lower than the number of currently running workers. For example, if an environment currently runs three workers and the maximum is reduced to two.
- Changing the environment's resilience mode.
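As an illustration of a disruptive change, the following hedged sketch updates the worker resources of a hypothetical Cloud Composer 2 environment; the environment name, location, and resource values are placeholders.

```bash
# Hypothetical environment name, location, and resource values.
# Changing worker CPU, memory, or storage restarts the Airflow workers,
# so any tasks running at that moment are terminated and later retried.
gcloud composer environments update example-environment \
    --location us-central1 \
    --worker-cpu=1 \
    --worker-memory=4GB \
    --worker-storage=2GB
```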
The following changes don't cause Airflow task termination:

- Creating, updating, or deleting a DAG (not an update operation).
- Pausing or unpausing DAGs (not an update operation).
- Changing Airflow variables (not an update operation).
- Changing Airflow connections (not an update operation).
- Enabling or disabling Dataplex Universal Catalog data lineage integration.
- Changing the environment's size.
- Changing the number of schedulers.
- Changing Airflow schedulers' CPU, memory, or storage.
- Changing the number of triggerers.
- Changing Airflow triggerers' CPU, memory, or storage.
- Changing the Airflow web server's CPU, memory, or storage.
- Increasing or decreasing the minimum number of workers.
- Reducing the maximum number of Airflow workers, if the new value is not lower than the number of currently running workers. For example, if an environment currently runs two workers and the maximum is reduced to three.
- Changing maintenance windows.
- Changing scheduled snapshot settings.
- Changing environment labels (an example command is sketched after this list).
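For contrast, the following is a minimal sketch of a non-disruptive change, assuming a hypothetical environment name and label values.

```bash
# Hypothetical environment name, location, and labels.
# Label changes don't restart Airflow components, so running tasks continue.
gcloud composer environments update example-environment \
    --location us-central1 \
    --update-labels=env=dev,team=data
```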
Updating with Terraform

If you attempt to change a configuration parameter that can't be updated in place, Terraform deletes your environment and creates a new one with the new parameter value. Run `terraform plan` before `terraform apply` to see whether Terraform creates a new environment instead of updating the existing one.
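A minimal sketch of that workflow, assuming the Terraform configuration for your environment lives in the current working directory:

```bash
# Review the proposed changes first. In the plan output, "update in-place"
# means the environment is updated; "must be replaced" means Terraform would
# delete the environment and create a new one.
terraform plan

# Apply only after confirming the plan doesn't replace the environment.
terraform apply
```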
Before you begin

Check that your account, the service account of your environment, and the Cloud Composer Service Agent account in your project have the required permissions.
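One way to spot-check this is to list the roles granted to an account in the project. The following is a hedged sketch; the project ID and account are placeholders, and the exact roles you need depend on the update operation you want to perform.

```bash
# Hypothetical project ID and user account; lists the roles granted to the
# account so you can verify it can trigger environment update operations.
gcloud projects get-iam-policy example-project \
    --flatten="bindings[].members" \
    --filter="bindings.members:user:you@example.com" \
    --format="table(bindings.role)"
```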
Rolling back update changes

In some rare situations, an update operation might be interrupted (for example, because of a timeout) and the requested changes might not be rolled back in all environment components (such as the Airflow web server).

For example, an update operation might install or remove additional PyPI modules, define or redefine an Airflow or Cloud Composer environment variable, or change some Airflow-related parameters.

Such a situation might occur when an update operation is triggered while other operations are in progress, for example, autoscaling of the Cloud Composer cluster or a maintenance operation.

In such a situation, it's recommended to repeat the operation.
Duration of update or upgrade operations

The duration of update and upgrade operations is affected by the following factors:

- Most update or upgrade operations require restarting Airflow components such as the Airflow schedulers, workers, and web server. After a component is restarted, it must be initialized. During initialization, the Airflow schedulers and workers download the contents of the /dags and /plugins folders from the environment's bucket. The process of syncing files to the Airflow schedulers and workers isn't instantaneous and depends on the total size and number of all objects in these folders.

  We recommend keeping only DAG and plugin files in the /dags and /plugins folders (respectively) and removing all other files. Too much data in the /dags and /plugins folders might slow down the initialization of Airflow components and, in some cases, make initialization impossible.

  We recommend keeping less than 30 MB of data in the /dags and /plugins folders, and definitely not exceeding 100 MB. For more information, see Handling a large number of DAGs and plugins. A command for checking the folder sizes is sketched after this list.

- The size of the Airflow database can significantly increase the duration of upgrade operations. We recommend keeping the Airflow database size under control by maintaining your environment's Airflow database.
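To see how much data is in these folders, you can check their sizes in the environment's bucket. A minimal sketch, assuming a hypothetical bucket name:

```bash
# Hypothetical bucket name; replace with your environment's bucket.
# Prints the total size of each folder so you can keep it below the
# recommended 30 MB (and well under 100 MB).
gsutil du -sh gs://example-environment-bucket/dags
gsutil du -sh gs://example-environment-bucket/plugins
```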