When Cloud Composer environments are updated, the majority of issues happen because of the following reasons:

- Service account permission problems
- PyPI dependency issues
- Size of the Airflow database

Insufficient permissions to update or upgrade an environment
If Cloud Composer can't update or upgrade an environment because of insufficient permissions, it outputs the following error message:

    ERROR: (gcloud.composer.environments.update) PERMISSION_DENIED: The caller does not have permission

Solution: Assign roles both to your account and to the service account of your environment, as described in Access control.
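As a sketch of what assigning these roles can look like, the following gcloud commands grant an example user the permissions needed to manage environments and grant the environment's service account the Composer Worker role. The project, user, and service account names are placeholders; substitute your own, and prefer the exact roles listed in the Access control page.

```shell
# Grant your own account permission to update and upgrade environments
# (project and member names below are hypothetical examples).
gcloud projects add-iam-policy-binding example-project \
    --member "user:admin@example.com" \
    --role "roles/composer.environmentAndStorageObjectAdmin"

# Grant the environment's service account the Composer Worker role,
# which it needs to perform the environment's operations.
gcloud projects add-iam-policy-binding example-project \
    --member "serviceAccount:composer-env@example-project.iam.gserviceaccount.com" \
    --role "roles/composer.worker"
```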
The service account of the environment has insufficient permissions

When creating a Cloud Composer environment, you specify a service account that performs most of the environment's operations. If this service account doesn't have enough permissions for the requested operation, Cloud Composer outputs the following error:
UPDATE operation on this environment failed 3 minutes ago with the
following error message:
Composer Backend timed out. Currently running tasks are [stage:
CP_COMPOSER_AGENT_RUNNING
description: "No agent response published."
response_timestamp {
seconds: 1618203503
nanos: 291000000
}
].
Solution: Assign roles both to your Google Account and to the service account of your environment, as described in Access control.
The size of the Airflow database is too big to perform the operation

An upgrade operation might not succeed because the Airflow database is too large for upgrade operations to succeed. If the size of the Airflow database is more than 20 GB, Cloud Composer outputs the following error:

    Airflow database uses more than 20 GB. Please clean the database before upgrading.

Solution: Perform an Airflow database cleanup, as described in Clean up the Airflow database.
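One way to trim old rows before upgrading is Airflow's `db clean` command, which newer Airflow versions support and which you can invoke through the environment. The environment name, location, and timestamp below are placeholders; a minimal sketch:

```shell
# Delete (or archive) Airflow metadata rows older than the given timestamp
# (example-environment, us-central1, and the date are hypothetical values;
# requires an Airflow version that supports `airflow db clean`).
gcloud composer environments run example-environment \
    --location us-central1 \
    db clean -- \
    --clean-before-timestamp "2024-01-01 00:00:00+00:00" \
    --skip-archive
```

Dropping `--skip-archive` keeps the cleaned rows in archive tables instead of deleting them outright.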
An upgrade to a new Cloud Composer version fails because of PyPI package conflicts

When you upgrade an environment with installed custom PyPI packages, you might encounter errors related to PyPI package conflicts. This can happen because the new Airflow build contains later versions of preinstalled packages, which can cause dependency conflicts with the PyPI packages that you installed in your environment.
Solution:

- To get detailed information about package conflicts, run an upgrade check.
- Loosen version constraints for installed custom PyPI packages. For example, instead of specifying a version as ==1.0.1, specify it as >=1.0.1.
- For more information about changing version requirements to resolve conflicting dependencies, see the pip documentation.
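The effect of loosening a pin can be sketched with a toy version matcher (standard library only; pip's real resolver handles far more, such as pre-releases and compound specifiers):

```python
def parse(version):
    """Split a dotted version string into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split("."))


def satisfies(version, spec):
    """Check a version against a single '==X' or '>=X' specifier."""
    if spec.startswith(">="):
        return parse(version) >= parse(spec[2:])
    if spec.startswith("=="):
        return parse(version) == parse(spec[2:])
    raise ValueError(f"unsupported specifier: {spec}")


# Suppose the new Airflow build preinstalls version 1.0.2 of a package:
print(satisfies("1.0.2", "==1.0.1"))  # strict pin conflicts -> False
print(satisfies("1.0.2", ">=1.0.1"))  # loosened constraint allows it -> True
```

The strict pin `==1.0.1` can never be satisfied once the build ships 1.0.2, while `>=1.0.1` accepts any later version.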
Inspect failed migration warnings

When upgrading Airflow to a later version, sometimes new constraints are applied to the Airflow database. If these constraints can't be applied, Airflow creates new tables to store the rows for which the constraints couldn't be applied. The Airflow UI displays a warning message until the moved data tables are renamed or dropped.

Solution:

You can use the following two DAGs to inspect the moved data and rename the tables.

The list_moved_tables_after_upgrade_dag DAG lists the rows that were moved from every table where constraints could not be applied. Inspect the data and decide whether you want to keep it. To keep it, you need to manually fix the data in the Airflow database, for example, by adding the rows back with the correct data.

If you don't need the data or if you already fixed it, you can run the rename_moved_tables_after_upgrade_dag DAG. This DAG renames the moved tables. The tables and their data are not deleted, so you can review the data at a later point.
    """
    When upgrading Airflow to a newer version,
    it might happen that some data cannot be migrated,
    often because of constraint changes in the metadata base.
    This file contains 2 DAGs:

    1. 'list_moved_tables_after_upgrade_dag'
      Prints the rows which failed to be migrated.
    2. 'rename_moved_tables_after_upgrade_dag'
      Renames the table which contains the failed migrations. This will remove the
      warning message from airflow.
    """

    import datetime
    import logging

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow.providers.postgres.hooks.postgres import PostgresHook
    from airflow.settings import AIRFLOW_MOVED_TABLE_PREFIX


    def get_moved_tables():
        hook = PostgresHook(postgres_conn_id="airflow_db")
        return hook.get_records(
            "SELECT schemaname, tablename FROM pg_catalog.pg_tables WHERE tablename"
            f" LIKE '{AIRFLOW_MOVED_TABLE_PREFIX}_%'"
        )


    def list_moved_records():
        tables = get_moved_tables()
        if not tables:
            logging.info("No moved tables found")
            return

        hook = PostgresHook(postgres_conn_id="airflow_db")
        for schema, table in tables:
            df = hook.get_pandas_df(f"SELECT * FROM {schema}.{table}")
            logging.info(df.to_markdown())


    def rename_moved_tables():
        tables = get_moved_tables()
        if not tables:
            return

        hook = PostgresHook(postgres_conn_id="airflow_db")
        for schema, table in tables:
            hook.run(f"ALTER TABLE {schema}.{table} RENAME TO _abandoned_{table}")


    with DAG(
        dag_id="list_moved_tables_after_upgrade_dag",
        start_date=datetime.datetime(2023, 1, 1),
        schedule_interval=None,
        catchup=False,
    ):
        t1 = PythonOperator(
            task_id="list_moved_records", python_callable=list_moved_records
        )

    with DAG(
        dag_id="rename_moved_tables_after_upgrade_dag",
        start_date=datetime.datetime(2023, 1, 1),
        schedule_interval=None,
        catchup=False,
    ) as dag:
        t1 = PythonOperator(
            task_id="rename_moved_tables", python_callable=rename_moved_tables
        )
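Once this file is deployed to the environment's DAG folder, both DAGs can be triggered manually, for example through gcloud. Run the inspection DAG first, review its task logs, and only then run the rename DAG. The environment name and location below are hypothetical placeholders:

```shell
# List the moved rows so you can decide whether to keep the data.
gcloud composer environments run example-environment \
    --location us-central1 \
    dags trigger -- list_moved_tables_after_upgrade_dag

# After reviewing the logs, rename the moved tables to clear the warning.
gcloud composer environments run example-environment \
    --location us-central1 \
    dags trigger -- rename_moved_tables_after_upgrade_dag
```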
What's next

- Updating environments
- Upgrading environments
- Troubleshooting environment creation

Last updated: 2025-08-26 (UTC)