# Troubleshooting Airflow triggerer issues

This page provides troubleshooting steps and information for common issues with the Airflow triggerer.
Blocking operations in triggers
-------------------------------

Asynchronous tasks might occasionally become blocked in triggerers. In most cases, the problem comes from insufficient triggerer resources or from issues with custom asynchronous operator code.

Triggerer logs surface warning messages that can help you identify the root cause of decreased triggerer performance. There are two significant warnings to look for.
1. Async thread blocked

   Triggerer's async thread was blocked for 1.2 seconds, likely due to the highly utilized environment.

   This warning signals performance issues caused by a high volume of asynchronous tasks.
   Solution: To address this issue, allocate more resources to the triggerers, reduce the number of deferred tasks that are executed at the same time, or increase the number of triggerers in your environment. Keep in mind that even though triggerers handle deferrable tasks, the workers are responsible for starting and eventually completing each task. If you adjust the number of triggerers, consider also scaling the number of your worker instances.
2. A specific task blocked the async thread

   WARNING - Executing <Task finished coro=<TriggerRunner.run_trigger() done, defined at /opt/***/***/jobs/my-custom-code.py:609> result=None> took 0.401 second

   This warning points to a specific piece of operator code executed by Cloud Composer. By design, triggers should rely on the asyncio library to run operations in the background. A custom trigger implementation can fail to properly adhere to asyncio contracts (for example, because of incorrect usage of the await and async keywords in Python code).

   Solution: Inspect the code reported by the warning and check whether the asynchronous operation is implemented correctly.
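The difference between code that honors the asyncio contract and code that silently breaks it can be sketched with plain asyncio. The names below are illustrative only and are not the actual Airflow BaseTrigger API:

```python
import asyncio
import time

# Illustrative sketch: a trigger coroutine must yield control with `await`.
# A blocking call such as time.sleep() stalls the triggerer's shared event
# loop for every trigger running on it.

async def bad_poll(events):
    time.sleep(0.3)           # BAD: blocks the whole event loop
    events.append("bad done")

async def good_poll(events):
    await asyncio.sleep(0.3)  # GOOD: suspends only this coroutine
    events.append("good done")

async def heartbeat(events):
    # Simulates other work sharing the loop (for example, triggerer heartbeats).
    for _ in range(3):
        await asyncio.sleep(0.05)
        events.append("beat")

async def run_scenario(poll):
    events = []
    await asyncio.gather(poll(events), heartbeat(events))
    return events

# The blocking version runs to completion before any heartbeat fires;
# the well-behaved version lets heartbeats interleave with the wait.
print(asyncio.run(run_scenario(bad_poll)))   # ['bad done', 'beat', 'beat', 'beat']
print(asyncio.run(run_scenario(good_poll)))  # ['beat', 'beat', 'beat', 'good done']
```

A blocked thread warning like the one above usually points at a call of the first kind: synchronous I/O or sleeps inside a coroutine that should be awaiting instead.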
Too many triggers
-----------------

The number of deferred tasks is visible in the task_count metric, which is also displayed on the Monitoring dashboard of your environment. Each trigger creates resources, such as connections to external systems, which consume memory.

Figure 1. Deferred tasks displayed on the Monitoring dashboard (click to enlarge)

Graphs of memory and CPU consumption indicate that insufficient resources cause restarts, because the liveness probe fails due to missing heartbeats:

Figure 2. Triggerer restarts because of insufficient resources (click to enlarge)

Solution: To address this issue, allocate more resources to the triggerers, reduce the number of deferred tasks that are executed at the same time, or increase the number of triggerers in your environment.
Crash of an Airflow worker during the callback execution
--------------------------------------------------------

After the trigger finishes its execution, control returns to an Airflow worker, which runs a callback method using an execution slot. This phase is controlled by the Celery Executor, so the corresponding configuration and resource limits apply (such as parallelism or worker_concurrency).

If the callback method fails in the Airflow worker, if the worker fails, or if the worker that runs the method restarts, then the task is marked as FAILED. In this case, the retry operation re-executes the entire task, not only the callback method.
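parallelism and worker_concurrency are standard Airflow settings. A minimal airflow.cfg fragment shows where they live; the values are examples only, and in Cloud Composer they are set as Airflow configuration overrides rather than by editing the file directly:

```ini
# Illustrative airflow.cfg fragment; values are examples, not recommendations.
[core]
# Upper bound on task instances running concurrently, including tasks
# resuming from the deferred state.
parallelism = 32

[celery]
# Task slots per Airflow worker; callbacks of resumed deferred tasks
# compete for these slots like any other task.
worker_concurrency = 16
```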
Infinite loop in a trigger
--------------------------

It is possible to implement a custom trigger operator in such a way that it entirely blocks the main triggerer loop, so that only the one broken trigger executes at a time. In this case, a warning is generated in the triggerer logs after the problematic trigger finishes.
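A well-behaved trigger loop awaits inside every iteration, which keeps the main triggerer loop responsive and even allows the trigger to be timed out or cancelled; a `while True:` body with no `await` would monopolize the event loop instead. A minimal sketch with plain asyncio (the names are illustrative, not the Airflow trigger API):

```python
import asyncio

# Hypothetical polling loop written in the style of a trigger's run() method.
# The `await` inside the loop is what returns control to the shared event
# loop; remove it, and this loop would starve every other trigger.

async def poll_forever(events):
    while True:
        events.append("poll")
        await asyncio.sleep(0.05)  # yield control on every iteration

async def main():
    events = []
    try:
        # Because poll_forever() yields, the loop can still enforce a timeout.
        await asyncio.wait_for(poll_forever(events), timeout=0.3)
    except asyncio.TimeoutError:
        events.append("cancelled")
    return events

print(asyncio.run(main()))  # several 'poll' entries followed by 'cancelled'
```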
Trigger class not found
-----------------------

Because the DAGs folder is not synchronized with the Airflow triggerer, inlined trigger code is missing when the trigger is executed. The error is generated in the logs of the failed task:

ImportError: Module "PACKAGE_NAME" does not define a "CLASS_NAME" attribute/
class

Solution: Import the missing code from PyPI.
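The triggerer resolves a deferred task's trigger by its dotted classpath, so the class must be importable in the triggerer's environment. A simplified sketch of that resolution, modeled on Airflow's import-string behavior (`load_trigger_class` and the example paths are hypothetical helpers for illustration):

```python
import importlib

def load_trigger_class(classpath):
    # Split "package.module.ClassName" into module path and class name,
    # import the module, then look the class up by attribute.
    module_path, _, class_name = classpath.rpartition(".")
    module = importlib.import_module(module_path)
    try:
        return getattr(module, class_name)
    except AttributeError:
        raise ImportError(
            f'Module "{module_path}" does not define a "{class_name}" attribute/class'
        )

# Resolving a class that is importable in this environment succeeds:
print(load_trigger_class("datetime.datetime"))

# A class that exists only inline in a DAG file is not importable here,
# which produces an ImportError of the shape shown above:
try:
    load_trigger_class("datetime.NoSuchTrigger")
except ImportError as exc:
    print(exc)
```

This is why trigger code defined inline in a DAG file fails: the DAGs folder is not available to the triggerer, so the classpath cannot be resolved there.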
Warning message about the triggerer in Airflow UI
-------------------------------------------------

In some cases, after the triggerer is disabled, you might see the following warning message in the Airflow UI:

The triggerer does not appear to be running. Last heartbeat was received
4 hours ago. Triggers will not run, and any deferred operator will remain
deferred until it times out or fails.

Airflow can show this message because incomplete triggers remain in the Airflow database. It usually means that the triggerer was disabled before all triggers in your environment were completed.

You can view all triggers that are running in the environment on the Browse > Triggers page in the Airflow UI (the Admin role is required).

Solutions:

- Enable the triggerer again and wait for the deferred tasks to complete.
- Access the Airflow database and delete the incomplete triggers manually.

Tasks remain in the deferred state after the triggerer is disabled
------------------------------------------------------------------

When the triggerer is disabled, tasks that are already in the deferred state remain in this state until the timeout is reached. This timeout can be infinite, depending on the Airflow and DAG configuration.

Use one of the following solutions:

- Manually mark the tasks as failed.
- Enable the triggerer so that the tasks can complete.

We recommend disabling the triggerer only if your environment does not run any deferred operators or tasks, and all deferred tasks are completed.
Last updated: 2025-08-26 (UTC)