**Note:** This page is not yet revised for Cloud Composer 3 and displays content for Cloud Composer 2.

This page provides troubleshooting steps and
information for common issues with the Airflow triggerer.

## Blocking operations in triggers

Asynchronous tasks might occasionally become blocked in triggerers. In most cases, the problems come from insufficient triggerer resources or from issues with custom asynchronous operator code.

[Triggerer logs](/composer/docs/composer-2/view-logs#streaming) surface warning messages that can help you identify the root causes of decreased triggerer performance. There are two significant warnings to look for:

1. The async thread was blocked:

        Triggerer's async thread was blocked for 1.2 seconds, likely due to the highly utilized environment.

    This warning signals a performance problem caused by a high volume of asynchronous tasks.

    **Solution**: To address this issue, [allocate more resources](/composer/docs/composer-2/scale-environments#workloads-configuration) to the triggerers, reduce the number of deferred tasks that are executed at the same time, or [increase the number of triggerers in your environment](/composer/docs/composer-2/scale-environments#triggerer-count). Keep in mind that even though triggerers handle deferrable tasks, it's the workers that are responsible for starting and eventually completing each task. If you adjust the number of triggerers, consider also [scaling the number of your worker instances](/composer/docs/composer-2/scale-environments#autoscaling-workers).

2. A specific task blocked the async thread:

        WARNING - Executing <Task finished coro=<TriggerRunner.run_trigger() done, defined at /opt/***/***/jobs/my-custom-code.py:609> result=None> took 0.401 second

    This warning points to a specific piece of operator code executed by Cloud Composer. By design, triggers should rely on the `asyncio` library to run operations in the background.
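This contract can be demonstrated with plain `asyncio`, outside of Airflow. In the following sketch, `blocking_poll` and `async_poll` are hypothetical stand-ins for the polling logic inside a trigger's `run` method; only the `await`-based version lets other coroutines share the event loop:

```python
import asyncio
import time


async def blocking_poll() -> str:
    # Anti-pattern: time.sleep() blocks the whole event loop, so no
    # other trigger coroutine can make progress while this one waits.
    time.sleep(0.2)
    return "done"


async def async_poll() -> str:
    # Correct: asyncio.sleep() suspends only this coroutine and yields
    # control to the event loop, so other triggers keep running.
    await asyncio.sleep(0.2)
    return "done"


async def compare() -> tuple[float, float]:
    # Run five copies of each poller concurrently and compare wall time.
    start = time.monotonic()
    await asyncio.gather(*(blocking_poll() for _ in range(5)))
    blocking_elapsed = time.monotonic() - start

    start = time.monotonic()
    await asyncio.gather(*(async_poll() for _ in range(5)))
    async_elapsed = time.monotonic() - start
    return blocking_elapsed, async_elapsed


blocking_elapsed, async_elapsed = asyncio.run(compare())
# The blocking pollers serialize (roughly 1 second in total), while the
# async pollers overlap (roughly 0.2 seconds in total).
```

This is why a single trigger with a blocking call degrades every other trigger running in the same triggerer process.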
A custom implementation of a trigger can fail to properly adhere to `asyncio` contracts, for example, because of incorrect usage of the `await` and `async` keywords in Python code.

**Solution**: Inspect the code reported by the warning and check whether the asynchronous operation is properly implemented.

## Too many triggers

The number of deferred tasks is visible in the `task_count` metric, which is also displayed on the Monitoring dashboard of your environment. Each trigger creates resources, such as connections to external resources, that consume memory.

![Deferred tasks on the Monitoring dashboard](/static/composer/docs/images/composer-2-triggerer-running-and-queued-tasks.png)
**Figure 1.** Deferred tasks displayed on the Monitoring dashboard

Graphs of memory and CPU consumption can indicate that insufficient resources cause restarts: the liveness probe fails because heartbeats are missing.

![Triggerer restarts](/static/composer/docs/images/composer-2-triggerer-restarts.png)
**Figure 2.** Triggerer restarts because of insufficient resources

**Solution**: To address this issue, [allocate more resources](/composer/docs/composer-2/scale-environments#workloads-configuration) to the triggerers, reduce the number of deferred tasks that are executed at the same time, or [increase the number of triggerers in your environment](/composer/docs/composer-2/scale-environments#triggerer-count).

## Crash of an Airflow worker during callback execution

After the trigger finishes its execution, control returns to an Airflow worker, which runs a callback method using an execution slot. This phase is controlled by the Celery executor, so the corresponding configuration and resource limits apply (such as `parallelism` or `worker_concurrency`).

If the callback method fails in the Airflow worker, the worker fails, or the worker that runs the method restarts, then the task is marked as `FAILED`.
In this case, the retry operation re-executes the entire task, not only the callback method.

## Infinite loop in a trigger

It is possible to implement a custom trigger operator in a way that entirely blocks the main triggerer loop, so that only the one broken trigger is executed at a time. In this case, a warning [is generated in the triggerer logs](/composer/docs/composer-2/view-logs#streaming) after the problematic trigger finishes.

## Trigger class not found

Because the DAGs folder is not synchronized with the Airflow triggerer, inlined trigger code is missing when the trigger is executed. The following error is generated in the logs of the failed task:

    ImportError: Module "PACKAGE_NAME" does not define a "CLASS_NAME" attribute/class

**Solution**: [Import the missing code from PyPI](/composer/docs/composer-2/install-python-dependencies).

## Warning message about the triggerer in the Airflow UI

In some cases, after the triggerer is disabled, you might see the following warning message in the Airflow UI:

    The triggerer does not appear to be running. Last heartbeat was received
    4 hours ago. Triggers will not run, and any deferred operator will remain
    deferred until it times out or fails.

Airflow can show this message because incomplete triggers remain in the Airflow database.
This message usually means that the triggerer was disabled before all triggers completed in your environment.

You can view all triggers that are running in the environment on the **Browse** > **Triggers** page in the Airflow UI (the `Admin` role is required).

**Solutions**:

- [Enable the triggerer](/composer/docs/composer-2/scale-environments#triggerer-count) again and wait for the deferred tasks to complete.
- [Access the Airflow database](/composer/docs/composer-2/access-airflow-database) and delete the incomplete triggers manually.

## Tasks remain in the deferred state after the triggerer is disabled

When the triggerer is disabled, tasks that are already in the deferred state remain in that state until the timeout is reached. Depending on the Airflow and DAG configuration, this timeout can be infinite.

Use one of the following solutions:

- Manually mark the tasks as failed.
- Enable the triggerer so that the tasks can complete.

We recommend disabling the triggerer only if your environment does not run any deferred operators or tasks, and all deferred tasks are completed.

## What's next

- [Airflow triggerer metrics](/composer/docs/composer-2/monitor-environments#airflow-metrics)
- [Airflow triggerer logs](/composer/docs/composer-2/view-logs#streaming)