[[["わかりやすい","easyToUnderstand","thumb-up"],["問題の解決に役立った","solvedMyProblem","thumb-up"],["その他","otherUp","thumb-up"]],[["わかりにくい","hardToUnderstand","thumb-down"],["情報またはサンプルコードが不正確","incorrectInformationOrSampleCode","thumb-down"],["必要な情報 / サンプルがない","missingTheInformationSamplesINeed","thumb-down"],["翻訳に関する問題","translationIssue","thumb-down"],["その他","otherDown","thumb-down"]],["最終更新日 2025-02-18 UTC。"],[[["\u003cp\u003eThis document provides troubleshooting steps for common issues with the Airflow triggerer in Cloud Composer 2, such as blocked asynchronous tasks and insufficient triggerer resources.\u003c/p\u003e\n"],["\u003cp\u003eWarning messages in triggerer logs, like "Async thread blocked" or "A specific task blocked the async thread," indicate performance issues and potential problems with custom asynchronous operator code, respectively.\u003c/p\u003e\n"],["\u003cp\u003eSolutions for addressing triggerer problems often involve allocating more resources, reducing the number of concurrent deferred tasks, or increasing the number of triggerers and worker instances in the environment.\u003c/p\u003e\n"],["\u003cp\u003eIf a callback method fails in an Airflow worker after a trigger finishes, the entire task is marked as \u003ccode\u003eFAILED\u003c/code\u003e and re-executed upon retry, and if a custom trigger operator blocks the main triggerer loop, a warning will be generated in the triggerer logs.\u003c/p\u003e\n"],["\u003cp\u003eIf the triggerer is disabled, deferred tasks remain in that state until a timeout is reached, requiring manual intervention to mark them as failed or re-enabling the triggerer to complete them, and if you see a warning message stating that the triggerer does not appear to be running, it is possible that it was disabled before the triggers were completed.\u003c/p\u003e\n"]]],[],null,["# Troubleshooting Airflow triggerer issues\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\n[Cloud Composer 3](/composer/docs/composer-3/troubleshooting-triggerer \"View this page for Cloud Composer 3\") \\| **Cloud Composer 2** \\| Cloud Composer 1\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\nThis page provides troubleshooting steps and information for common\nissues with the Airflow triggerer.\n\nBlocking operations in trigger\n------------------------------\n\nAsynchronous tasks might occasionally become blocked in triggerers.\nIn most cases, the problems come from insufficient triggerer resources\nor issues with custom asynchronous operator code.\n\n[Triggerer logs](/composer/docs/composer-2/view-logs#streaming) surface any warning messages that can\nhelp you identify root causes of decreased triggerer performance. There are two\nsignificant warnings to look for.\n\n1. Async thread blocked\n\n Triggerer's async thread was blocked for 1.2 seconds, likely due to the highly utilized environment.\n\n This warning signals issues with performance due to a high volume of async tasks.\n\n **Solution** : To address this issue,\n [allocate more resources](/composer/docs/composer-2/scale-environments#workloads-configuration) to the triggerers,\n reduce the number of deferred tasks that are executed at the same time,\n or [increase the number of triggerers in your environment](/composer/docs/composer-2/scale-environments#triggerer-count).\n Keep in mind that even though triggerers handle deferrable tasks, it's\n the workers that are responsible for starting and eventually\n completing each task. 
Too many triggers
-----------------

The number of deferred tasks is visible in the `task_count` metric, which is
also displayed on the Monitoring dashboard of your environment. Each trigger
creates some resources, such as connections to external systems, which consume
memory.

![Deferred tasks displayed on the Monitoring dashboard](/static/composer/docs/images/composer-2-triggerer-running-and-queued-tasks.png)

**Figure 1.** Deferred tasks displayed on the Monitoring dashboard

Graphs of memory and CPU consumption indicate that insufficient resources cause
restarts, because the liveness probe fails when heartbeats are missing:

![Triggerer restarts because of insufficient resources](/static/composer/docs/images/composer-2-triggerer-restarts.png)

**Figure 2.** Triggerer restarts because of insufficient resources

**Solution**: To address this issue,
[allocate more resources](/composer/docs/composer-2/scale-environments#workloads-configuration) to the triggerers,
reduce the number of deferred tasks that are executed at the same time,
or [increase the number of triggerers in your environment](/composer/docs/composer-2/scale-environments#triggerer-count).

Crash of an Airflow worker during the callback execution
---------------------------------------------------------

After the trigger finishes execution, control returns to an Airflow
worker, which runs a callback method using an execution slot. This phase is
controlled by the Celery executor, so the corresponding configuration and
resource limits apply (such as `parallelism` or `worker_concurrency`).

If the callback method fails in the Airflow worker, if the worker fails, or if
the worker that runs the method restarts, then the task is marked as `FAILED`.
In this case, the retry operation re-executes the entire task, not only the
callback method.
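To illustrate where the callback fits, the following minimal sketch of a
deferrable operator defers to Airflow's built-in `TimeDeltaTrigger` and
resumes in `execute_complete`. The operator itself (`WaitBeforeContinuing`)
and its parameters are hypothetical, for illustration only.

    from datetime import timedelta

    from airflow.models.baseoperator import BaseOperator
    from airflow.triggers.temporal import TimeDeltaTrigger


    class WaitBeforeContinuing(BaseOperator):
        """Hypothetical deferrable operator used to illustrate the callback."""

        def __init__(self, wait_minutes: int = 10, **kwargs):
            super().__init__(**kwargs)
            self.wait_minutes = wait_minutes

        def execute(self, context):
            # Runs on an Airflow worker; deferring frees the worker slot while
            # the triggerer waits for the event.
            self.defer(
                trigger=TimeDeltaTrigger(timedelta(minutes=self.wait_minutes)),
                method_name="execute_complete",  # the callback method
            )

        def execute_complete(self, context, event=None):
            # The callback: runs on a worker again after the trigger fires,
            # subject to Celery executor limits such as parallelism and
            # worker_concurrency. If this method fails or the worker restarts,
            # the task is marked FAILED and a retry re-runs execute() from the
            # beginning, not only this method.
            return event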
Infinite loop in a trigger
--------------------------

It is possible to implement a custom trigger operator in such a way that it
entirely blocks the main triggerer loop, so that only the one broken trigger is
executed at a time. In this case, a warning
[is generated in the triggerer logs](/composer/docs/composer-2/view-logs#streaming) after the
problematic trigger is finished.

Trigger class not found
-----------------------

Because the DAGs folder is not synchronized with the Airflow triggerer, the
inlined trigger code is missing when the trigger is executed. The error is
generated in the logs of the failed task:

    ImportError: Module "PACKAGE_NAME" does not define a "CLASS_NAME" attribute/class

**Solution**: [Import the missing code from PyPI](/composer/docs/composer-2/install-python-dependencies).

Warning message about the triggerer in Airflow UI
-------------------------------------------------

In some cases, after the triggerer is disabled, you might see the following
warning message in the Airflow UI:

    The triggerer does not appear to be running. Last heartbeat was received
    4 hours ago. Triggers will not run, and any deferred operator will remain
    deferred until it times out or fails.

Airflow can show this message because incomplete triggers remain in the Airflow
database. This message usually means that the triggerer was disabled before all
triggers were completed in your environment.

You can view all triggers that are running in the environment on the
**Browse** > **Triggers** page in the Airflow UI (the `Admin` role is
required).

**Solutions**:

- [Enable the triggerer](/composer/docs/composer-2/scale-environments#triggerer-count) again and wait for deferred tasks to complete.
- [Access the Airflow database](/composer/docs/composer-2/access-airflow-database) and delete incomplete triggers manually.

Tasks remain in the deferred state after the triggerer is disabled
------------------------------------------------------------------

When the triggerer is disabled, tasks that are already in the deferred state
remain in this state until the timeout is reached. This timeout can be
infinite, depending on the Airflow and DAG configuration.

Use one of the following solutions:

- Manually mark the tasks as failed.
- Enable the triggerer to complete the tasks.

We recommend disabling the triggerer only if your environment does not run any
deferred operators or tasks, and all deferred tasks are completed.

What's next
-----------

- [Airflow triggerer metrics](/composer/docs/composer-2/monitor-environments#airflow-metrics)
- [Airflow triggerer logs](/composer/docs/composer-2/view-logs#streaming)